GSA integration for external indexing and searching
Introduction
============
Package collective.gsa integrates a Plone site with a Google Search Appliance (GSA). It
provides an indexing processor via collective.indexing as well as search capabilities.
Installation
============
Add collective.gsa to your buildout.cfg, in both the eggs and the zcml sections::

    [buildout]
    eggs = collective.gsa

    [instance]
    zcml =
        collective.gsa
        collective.gsa-overrides

After running buildout and restarting the server, you can install the package via the
Quick Installer, either in the ZMI or through Plone's Add/Remove Products. After
installation, the GSA settings and GSA maintenance configlets will appear in the Plone
Control Panel. Follow the field descriptions to set them up.
Indexing
========
Package collective.gsa registers an adapter for IQueueIndexProcessor, and indexing is done
via the collective.indexing package. When an object is reindexed, a content provider
adapter is called to obtain the data to send. The package contains content providers for
objects implementing IATDocument, IATFile and IATContentType:

* For document CTs (Page, News Item etc.) the rendered main macro is sent (usually the
  page without portlets and the header).
* For file CTs the primary file field is sent.
* For other Archetypes-based CTs the title and description are sent.

To add support for other content types, create your own content provider implementing the
IContentProvider interface and register it via ZCML. For details, look at the
content_provider module and the package's configure.zcml.
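
A minimal sketch of such a provider follows. The import path of IContentProvider, the
content() method name and the adapted IMyType interface are assumptions for illustration
only; check the content_provider module for the actual contract::

    # Hypothetical content provider for a custom content type (names are
    # illustrative, not taken from collective.gsa itself).
    from zope.component import adapts
    from zope.interface import implements

    from collective.gsa.interfaces import IContentProvider
    from my.package.interfaces import IMyType


    class MyTypeContentProvider(object):
        """Build the text that is sent to GSA for IMyType objects."""
        implements(IContentProvider)
        adapts(IMyType)

        def __init__(self, context):
            self.context = context

        def content(self):
            # Return whatever should be searchable for this type.
            return u'%s\n%s' % (self.context.Title(),
                                self.context.Description())

The adapter is then registered in ZCML, e.g. with
<adapter factory=".provider.MyTypeContentProvider" />.
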
The package supports dual indexing if you have two sites, e.g. a secure one for edit
access and a public one for anonymous access. The object's identifier in GSA is its URL,
which is obtained using the object's absolute_url method. Thus all indexing has to be done
from the URL you want the object indexed under (i.e. not from localhost). In the GSA
control panel you can set a dual base URL for the anonymous site; the URL is then
constructed from that base URL plus the object's absolute_url_path.
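
For illustration only (the actual logic lives inside collective.gsa), the URL sent to GSA
is derived roughly as follows, where dual_base_url stands for the value configured in the
control panel::

    def indexed_url(obj, dual_base_url=None):
        # Rough sketch, not the package's code: with a dual base URL set,
        # the public URL is that base plus the object's path.
        if dual_base_url:
            return dual_base_url.rstrip('/') + obj.absolute_url_path()
        # Otherwise the object's own URL is used as the identifier.
        return obj.absolute_url()
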
When an object is reindexed, the feed is added to a persistent queue and is only removed
once it has been successfully sent to GSA; hence if GSA is unreachable, the feed will be
sent when another object is reindexed. Note that GSA receiving a feed does not mean the
object will be indexed (e.g. if its URL is not covered by the Matched URLs settings). If
your objects are not indexed, please check the GSA's Crawl and Index settings.
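
Conceptually the queue behaves like this hypothetical sketch (not the package's actual
implementation)::

    def flush_feeds(queue, send_to_gsa):
        # A feed is only removed after GSA accepts it, so a failed
        # delivery leaves it queued for the next reindex.
        for feed in list(queue):
            try:
                send_to_gsa(feed)
            except IOError:
                break  # GSA unreachable: retry later
            queue.remove(feed)
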
Searching
=========
This package replaces the search template and the livesearch script so that GSA is used as
the search engine. This is done by adding gsasearch=on to the search request, which keeps
GSA out of internal searches (such as navigation, folder contents etc.). Plone's advanced
search remains at the default search_form template and does not use GSA at all, because
GSA does not handle indexes the way Zope's ZCatalog does. However, you can use GSA's own
advanced search, whose URL can be set in the local GSA control panel.
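
As a hypothetical illustration of the flag (the helper below and the /search path are not
part of the package), a GSA-bound query carries the extra parameter while internal catalog
queries simply omit it::

    from urllib import urlencode  # Python 2, matching the package's era

    def search_url(portal_url, text, use_gsa=True):
        params = {'SearchableText': text}
        if use_gsa:
            # Only requests carrying this flag are answered by GSA.
            params['gsasearch'] = 'on'
        return '%s/search?%s' % (portal_url, urlencode(params))

    # e.g. search_url('http://www.example.org/plone', 'annual report')
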
Current Status
==============
The basic implementation is nearly finished and we aim to write the necessary tests for it.
Credit
======
This code was inspired by the collective.solr package and was kindly sponsored by the
University of Leicester.
Changelog
=========
1.0 - Initial release
---------------------
* Initial release