
Alfresco – microservices integrations

Alfresco content has metadata and an audit trail, which can be of great value – if you can read, structure and interpret it.

For quite a while I was wondering how to integrate custom microservices with Alfresco, but without custom AMPs or custom code running inside the repository. I knew that SOLR tracks Alfresco transactions and that there is a SOLR API on the repository side, but it is not part of the public API and there is not much documentation about it. However, once you start digging, it becomes obvious that the SOLR APIs could be used by other applications too, not only by Solr for content or metadata indexing. With that in mind, it seems reasonable to try using those REST services to write “microservice” style integrations with the Alfresco repository. There are features that cannot be implemented this way (just think of transactional behaviors), but if one is happy with “eventual consistency” then this approach can be of benefit.
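To give an idea of what this looks like in practice, here is a minimal sketch of polling the transaction tracking endpoint from a standalone Java service. The endpoint path and parameters mirror what the SOLR trackers call; the host, credentials and authentication scheme depend on your installation (in particular on the repository's solr.secureComms setting), so treat those as assumptions.

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.util.Base64;

public class AlfrescoTxnPoller {

    public static void main(String[] args) throws Exception {
        HttpClient client = HttpClient.newHttpClient();

        // Basic auth is an assumption here; depending on solr.secureComms the
        // endpoint may instead require a client certificate or a shared secret.
        String auth = Base64.getEncoder()
                .encodeToString("admin:admin".getBytes());

        // The same kind of call the SOLR trackers issue to follow the repository:
        // fetch transactions committed after a known transaction id.
        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("http://localhost:8080/alfresco/service/api/solr/transactions?fromTxnId=1&maxResults=100"))
                .header("Authorization", "Basic " + auth)
                .GET()
                .build();

        HttpResponse<String> response =
                client.send(request, HttpResponse.BodyHandlers.ofString());

        // The JSON body lists transactions and node ids that an integration
        // can then resolve to metadata via the other SOLR endpoints.
        System.out.println(response.body());
    }
}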

The next step was to create a reusable “component” that could serve as a building block for a Spring Boot microservice (it was published as an open-source project on GitHub at https://github.com/PleoSoft/peltas-core ).

The initial idea was only to provide BI insights over Alfresco content metadata (or audit data in a commercial version), but it has since evolved to export the same data not only to a database but to any kind of storage you need. An example is a current project where we are using Peltas to sync Alfresco folders into a Git repository (this is very useful for textual formats).

Peltas comes with an evaluator, very similar to the ones Alfresco Share uses, to decide whether a node is of interest before proceeding with the rest of the processing. It also uses Spring Batch and Spring Integration to provide transactionality and to remember the last processed node, so that a job can restart from the correct place.
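The restart behavior is standard Spring Batch: the reader persists its position in the step ExecutionContext at every commit and reads it back when the job is restarted. The sketch below shows the general mechanism with illustrative names; it is not the actual Peltas code.

import org.springframework.batch.item.ExecutionContext;
import org.springframework.batch.item.ItemStreamReader;

public class LastTxnAwareReader implements ItemStreamReader<Long> {

    private static final String LAST_TXN_KEY = "lastTxnId"; // illustrative key name
    private long lastTxnId = 0L;

    @Override
    public void open(ExecutionContext context) {
        // On restart, Spring Batch hands back the context saved at the last commit.
        if (context.containsKey(LAST_TXN_KEY)) {
            lastTxnId = context.getLong(LAST_TXN_KEY);
        }
    }

    @Override
    public void update(ExecutionContext context) {
        // Called before each commit: persist how far we have processed.
        context.putLong(LAST_TXN_KEY, lastTxnId);
    }

    @Override
    public Long read() {
        // Fetch the next transaction id after lastTxnId from the repository here
        // (e.g. via the SOLR transactions endpoint) and advance lastTxnId;
        // returning null signals the end of the current batch.
        return null;
    }

    @Override
    public void close() {
        // nothing to release in this sketch
    }
}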

Here is an extract of a simple evaluator configuration:

peltas.handler.documentupdated.evaluator=/alfresco-workspace/transaction/action=NODE-UPDATED

or a slightly more complex one that also specifies the content type:

peltas.handler.documentupdated.evaluator=/alfresco-workspace/transaction/action=NODE-UPDATED|/alfresco-workspace/transaction/type=cm:content

Property data mapping and conversions are also supported:

peltas.handler.documentupdated.mapper.property.created.data=/alfresco-workspace/transaction/properties/add@{http://www.alfresco.org/model/content/1.0}created

peltas.handler.documentupdated.mapper.property.created.type=java.util.Date

For folders we would use this kind of evaluator:

peltas.handler.folderupdated.evaluator=/alfresco-workspace/transaction/action=NODE-UPDATED|/alfresco-workspace/transaction/type=cm:folder
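Putting the pieces together, a complete folder handler could combine the evaluator with a property mapping. The cm:name mapping below is illustrative, following the same pattern as the cm:created example above:

peltas.handler.folderupdated.evaluator=/alfresco-workspace/transaction/action=NODE-UPDATED|/alfresco-workspace/transaction/type=cm:folder

peltas.handler.folderupdated.mapper.property.name.data=/alfresco-workspace/transaction/properties/add@{http://www.alfresco.org/model/content/1.0}name

peltas.handler.folderupdated.mapper.property.name.type=java.lang.String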

You will notice that the samples above refer to /alfresco-workspace/transaction/… This is simply because Peltas started as a parser for Alfresco audit data and kept the same format for Alfresco workspace (or live) nodes.

Here are some more details and resources about the Peltas project:

The community version (https://github.com/PleoSoft/peltas-community) provides an export to a predefined database schema, but you can also expose the data to a custom database schema without coding, just with configuration, or to any other storage if you write some Java code. The commercial enterprise version provides a couple of existing connectors such as JMS (e.g. ActiveMQ) or Solr/Elasticsearch, as well as a connector for Alfresco AUDIT applications, so that audit data can be exported in a fashion suitable for any BI tool.
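Since Peltas builds on Spring Batch, writing to “any other storage” essentially means plugging in a custom writer. The sketch below uses plain Spring Batch interfaces with illustrative names; it is not the actual Peltas extension API, just the general shape of the Java code involved.

import java.util.List;
import org.springframework.batch.item.ItemWriter;

// Illustrative record for whatever Peltas mapped out of a transaction entry.
class MappedEntry {
    final String nodeRef;
    final String json;

    MappedEntry(String nodeRef, String json) {
        this.nodeRef = nodeRef;
        this.json = json;
    }
}

public class CustomStorageWriter implements ItemWriter<MappedEntry> {

    @Override
    public void write(List<? extends MappedEntry> items) {
        // Replace this with calls to your storage of choice: a key-value
        // store, a Git working copy, a message queue, etc.
        for (MappedEntry entry : items) {
            System.out.printf("storing %s -> %s%n", entry.nodeRef, entry.json);
        }
    }
}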

Unfortunately, in a container installation the Alfresco team decided to restrict access to the SOLR APIs, so if you want to use Peltas with Alfresco and Docker you can either change their nginx configuration (the hard part) or simply include the Peltas image in the same Docker network, as described in https://github.com/PleoSoft/peltas-community/blob/master/README.md#run-with-docker
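As an illustration of the second option, a docker-compose fragment along these lines puts Peltas on the same network as the repository, so the SOLR endpoints are reachable by service name without going through the proxy. The Peltas image name, the network name and the environment variable are assumptions; check the README above for the real setup.

version: "3"
services:
  alfresco:
    image: alfresco/alfresco-content-repository-community   # repository container
    networks:
      - alfresco-net
  peltas:
    image: pleosoft/peltas   # illustrative image name, see the README
    environment:
      # Illustrative variable: Peltas reaches the repository by service name,
      # bypassing the nginx proxy that blocks the SOLR endpoints.
      - ALFRESCO_HOST=alfresco
    networks:
      - alfresco-net
networks:
  alfresco-net: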
