Development with transmogrify.dexterity
Transmogrifier is generally used for migrating content between Plone versions. However, we've found that export/import ability for Dexterity content really helps development too. We can:-
- Store some baseline content along with the site's custom product, which means you always have a representative set of content to develop and run integration tests against.
- Transfer content between different instances, e.g. pulling content from a shared development instance to your local instance for further investigation.
- Modify content with UNIX tools, when site structures change for example.
Using transmogrify.dexterity, we can export and import any Dexterity content to and from a directory tree of JSON files.
We'll assume that you've already got a Plone 4.1 buildout set up, and have a custom product for your site that contains your dexterity types. Follow a guide such as this to get this going.
For this example I'll occasionally mention a shuttlethread.farmyard product, which is the custom product for a dummy site to manage our animals on the farm. The source is available on Bitbucket if you'd rather read the entire thing.
Enabling transmogrify.dexterity content export/import
Firstly we need to include quintagroup.transmogrifier and transmogrify.dexterity. We can do this in shuttlethread.farmyard's setup.py:-
```python
install_requires = [
    . . .
    'quintagroup.transmogrifier',
    'transmogrify.dexterity',
```
We also need to define a pipeline for quintagroup.transmogrifier to use. There is an example pipeline configuration within transmogrify.dexterity that should be good enough for us:-
```xml
<configure
    xmlns="http://namespaces.zope.org/zope"
    i18n_domain="shuttlethread.farmyard">

  <include package="transmogrify.dexterity.pipelines" file="files.zcml" />

</configure>
```
If you needed to change it, you could use the import.cfg and export.cfg files that files.zcml includes as a starting point for your own.
Now start up your Plone instance with some Dexterity content and go to http://localhost:8080/Plone/portal_setup/manage_exportSteps. There should be a "Content (transmogrifier)" option; select it and click "Export selected steps". We get a .tar.gz (or .zip under Windows) with contents similar to the following:-
```
structure/.objects.xml
structure/meadow/
structure/meadow/_content.json
structure/meadow/.objects.xml
structure/meadow/daisy/
structure/meadow/daisy/_content.json
structure/meadow/daisy/_field_feeding_notes.htm
structure/meadow/freda/
structure/meadow/freda/_content.json
structure/meadow/freda/_field_feeding_notes.htm
```
Firstly, each page has its own directory containing a _content.json file. This holds all the simple fields in our Dexterity type. For example, here is the content of Daisy's _content.json:-
```json
{
    "creators": ["admin"],
    "feeding_notes": {
        "contenttype": "text/html",
        "encoding": "utf-8",
        "file": "_field_feeding_notes.htm"
    },
    "species": "cow",
    "title": "Daisy"
}
```
"Feeding notes" is a rich text field. The values for these are put in a separate file and the JSON contains a reference to the file name. The same would happen with an image field or any other BLOB field. Finally, the meadow is a container type so it has a .objects.xml which lists the cows within it.
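To make the layout concrete, here's a short Python sketch that reads a directory in this format and inlines the file-backed fields. This is purely illustrative (the load_content helper and the feeding-notes text are made up for this example, and aren't part of transmogrify.dexterity):

```python
import json
import os
import tempfile

# Recreate Daisy's export on disk as a fixture for this sketch
d = tempfile.mkdtemp()
with open(os.path.join(d, "_field_feeding_notes.htm"), "w") as f:
    f.write("<p>Extra hay in winter</p>")
with open(os.path.join(d, "_content.json"), "w") as f:
    json.dump({
        "creators": ["admin"],
        "feeding_notes": {"contenttype": "text/html",
                          "encoding": "utf-8",
                          "file": "_field_feeding_notes.htm"},
        "species": "cow",
        "title": "Daisy",
    }, f)

def load_content(directory):
    """Load a _content.json, inlining any file-backed fields."""
    with open(os.path.join(directory, "_content.json")) as f:
        content = json.load(f)
    for value in content.values():
        # File-backed fields (rich text, images, other BLOBs) are stored
        # as a dict pointing at a sibling file in the same directory
        if isinstance(value, dict) and "file" in value:
            with open(os.path.join(directory, value["file"]), "rb") as fh:
                value["data"] = fh.read().decode(value.get("encoding", "utf-8"))
    return content

content = load_content(d)
print(content["title"])                  # Daisy
print(content["feeding_notes"]["data"])  # <p>Extra hay in winter</p>
```

The plain-text layout is the point: everything is trivially readable by scripts and UNIX tools, which is what makes the bulk-editing tricks mentioned earlier possible.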
There is also an equivalent import option, but we're going to do something a bit neater.
Storing content in a GenericSetup profile
We can use a GenericSetup profile to import content automatically when we create a new site, and keep this in our custom product. First, we define the "testfixture" profile in shuttlethread.farmyard's configure.zcml:-
```xml
<genericsetup:registerProfile
    name="testfixture"
    title="unittest content"
    directory="profiles/testfixture"
    description="unittest content / example content for development"
    provides="Products.GenericSetup.interfaces.EXTENSION"
    />
```
Then create the profile directory (i.e. "profiles/testfixture"). It makes sense to depend on the default profile, assuming that's where the types for the content we create are. So in metadata.xml:-
```xml
<metadata>
  <version>1</version>
  <dependencies>
    <dependency>profile-shuttlethread.farmyard:default</dependency>
  </dependencies>
</metadata>
```
And create a file quintagroup.transmogrifier-import.txt containing the word "import". This file is just a marker to show the import happened; its content is not important.
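The profile skeleton can be scaffolded from the shell; these paths just follow this example's buildout layout:

```shell
# Create the testfixture profile directory (paths from this example's layout)
mkdir -p src/shuttlethread/farmyard/profiles/testfixture
# Marker file for the transmogrifier import step; its content is unimportant
echo import > src/shuttlethread/farmyard/profiles/testfixture/quintagroup.transmogrifier-import.txt
```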
Finally, unzip/untar the export made earlier, so at the end we have:-
```
src/shuttlethread/farmyard/profiles/testfixture/metadata.xml
src/shuttlethread/farmyard/profiles/testfixture/quintagroup.transmogrifier-import.txt
src/shuttlethread/farmyard/profiles/testfixture/structure/.objects.xml
src/shuttlethread/farmyard/profiles/testfixture/structure/meadow/.objects.xml
src/shuttlethread/farmyard/profiles/testfixture/structure/meadow/_content.json
src/shuttlethread/farmyard/profiles/testfixture/structure/meadow/daisy/_content.json
. . .
```
Now we can instantly build a new site full of content. Go to @@plone-addsite?advanced=1 and choose the new "unittest content" option.
Also note that we didn't have to export the content from the same server, it could have been a staging server or production server. Run the export process on your production server, put the content in your product, recreate the site and you have all the content from your live site available locally.
If your tests use plone.app.testing then they can install GenericSetup profiles. This means integration tests can be written that refer to the test content too. I'll probably show this in another post.
More automation, more fun!
As useful as it is, updating your example content by clicking all the ZMI buttons and untarring the content into the right place quickly gets dull. We can use curl to automate this, though. curl is a command-line tool for interacting with webservers, and we can combine the request to export content with an untar into the right place. For example:-
```shell
curl --user admin:admin \
    --data 'ids%3Adefault%3Atokens=' \
    --data 'ids%3Alist=content_quinta' \
    --data 'manage_exportSelectedSteps%3Amethod=+Export+selected+steps+' \
    http://localhost:8080/Plone/portal_setup \
    | tar -zxvC src/shuttlethread/farmyard/profiles/testfixture/
```
In our projects we have some extra bash script to empty the testfixture directory first, but this isn't ideal. I'm going to work on a buildout recipe to add a bin/exportsite script in a more reusable fashion.
Workflow caveats
We have used transmogrify.dexterity to import/export most field types without hiccups. However, the default pipelines do not know much about workflow. The export pipeline does not record the current state of content, and the import pipeline will try to transition everything to 'publish'. This works fine with the default workflow, but if you have your own workflow you will need to make a copy of import.cfg and change this.
You can at least override the transitions a content item goes through, which is most useful if you need to test the behaviour of an unpublished content item. Within its _content.json add...
```json
"_transitions": [],
```
...to disable the default transition to published. Note that another export will overwrite this though.
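For clarity, the key sits at the top level of _content.json alongside the ordinary fields. A sketch based on Daisy's file from earlier (abridged):

```json
{
    "_transitions": [],
    "creators": ["admin"],
    "species": "cow",
    "title": "Daisy"
}
```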