One design question for my app that I haven't worked out is whether there is a Neo4j-native way of achieving the following effect.
As web-based users create or delete nodes or relationships in Neo4j through a web app, I want to kick off some Python scripts on one or more "worker" machines to perform processing. Mostly, on creation of certain nodes, or on changes to relationships matching certain patterns, I want to fetch some information from elsewhere and then update the node, or create new nodes and relationships, based on the data those scripts fetch. All I really need is a "something changed" event, as nodes and relationships will have properties denoting when they were last refreshed. I want to process some requests quickly (although if I miss one, having it queue, or even having a slower poll catch it later, is fine).
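To illustrate, the slow-polling fallback I have in mind looks roughly like this sketch. The `Thing` label, the `lastRefreshed` epoch-millisecond property, and the environment-variable connection details are all placeholders of my own, not anything Neo4j-specific:

```python
# Fallback poller: find nodes whose lastRefreshed property (assumed to be
# epoch milliseconds) is missing or older than a cutoff, and hand them to
# a worker. Label and property names are placeholders for illustration.
import os
import time

STALE_QUERY = (
    "MATCH (n:Thing) "
    "WHERE n.lastRefreshed IS NULL OR n.lastRefreshed < $cutoff "
    "RETURN n"
)

def stale_cutoff_ms(max_age_seconds, now=None):
    """Epoch-millisecond cutoff: nodes refreshed before this are stale."""
    now = time.time() if now is None else now
    return int((now - max_age_seconds) * 1000)

if __name__ == "__main__" and os.environ.get("NEO4J_URI"):
    from neo4j import GraphDatabase  # pip install neo4j

    driver = GraphDatabase.driver(
        os.environ["NEO4J_URI"],
        auth=(os.environ["NEO4J_USER"], os.environ["NEO4J_PASSWORD"]),
    )
    with driver.session() as session:
        # Anything not refreshed in the last hour gets reprocessed.
        for record in session.run(STALE_QUERY, cutoff=stale_cutoff_ms(3600)):
            print(record["n"])  # hand off to the worker here
    driver.close()
```

This catches missed events, but run on a timer it is exactly the idle-CPU waste I'd like to avoid as the primary mechanism.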
Triggers are the closest mechanism I can think of, but I want a remote client to receive the event. However, I see various stream-processing interfaces that may be more suitable.
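One stream-style option I've found is Neo4j's Change Data Capture feature (recent Neo4j 5.x, with CDC enabled on the database). Below is my hedged sketch of consuming it from a Python worker; the `db.cdc.current` / `db.cdc.query` procedure names and the event-map keys (`eventType` of `"n"`/`"r"`, `operation` of `"c"`/`"u"`/`"d"`) are my reading of the CDC docs and should be verified against the server version:

```python
# Sketch of a CDC consumer: remember where the change stream currently
# ends, then repeatedly ask for everything after that point. Still a poll,
# but a cheap one over a cursor rather than a full graph scan.
import os
import time

def is_interesting(event):
    """Keep only node create/delete events -- the 'something changed' signal.

    Assumes the CDC event map carries `eventType` ('n' node, 'r' rel)
    and `operation` ('c' create, 'u' update, 'd' delete).
    """
    return event.get("eventType") == "n" and event.get("operation") in ("c", "d")

if __name__ == "__main__" and os.environ.get("NEO4J_URI"):
    from neo4j import GraphDatabase  # pip install neo4j

    driver = GraphDatabase.driver(
        os.environ["NEO4J_URI"],
        auth=(os.environ["NEO4J_USER"], os.environ["NEO4J_PASSWORD"]),
    )
    with driver.session() as session:
        cursor = session.run("CALL db.cdc.current()").single()["id"]
        while True:
            result = session.run("CALL db.cdc.query($from, [])", {"from": cursor})
            for record in result:
                cursor = record["id"]  # advance past what we've seen
                if is_interesting(record["event"]):
                    print("changed:", record["event"])  # dispatch to worker
            time.sleep(2)
```

CDC also appears to feed the Kafka connector, which would give me the built-in load balancing later, but that feels heavyweight for one consumer today.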
I want something relatively lightweight, in that I'll test it locally and likely have only one consumer until or unless the project takes off. Event-driven is best, as the events will come in bursts but most of the time everything will sit idle, so repeated polling would just waste CPU and electricity. That said, built-in load balancing is attractive longer term.
I can always just have a trigger kick a notification off across the network, but looking at the documentation it seems there must be better ways.
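For completeness, the "trigger kicks a notification" version I'd fall back on is sketched below: an APOC trigger that POSTs to a worker webhook whenever matching nodes are created. This is my reading of the APOC docs, not tested: `apoc.trigger.install` is in APOC Core on Neo4j 5 (and must be run against the `system` database), while `apoc.load.jsonParams` is in APOC Extended; the webhook URL and the `Thing` label are placeholders I made up:

```python
# Sketch: install an APOC trigger that notifies a worker over HTTP.
# The inner statement runs server-side; $createdNodes is supplied by APOC,
# and the 'afterAsync' phase keeps the user's write transaction fast.
import os

WEBHOOK_URL = "http://worker.local:8000/neo4j-changed"  # hypothetical endpoint

TRIGGER_CYPHER = """
CALL apoc.trigger.install(
  'neo4j',
  'notify-worker',
  "UNWIND $createdNodes AS n
   WITH n WHERE n:Thing
   CALL apoc.load.jsonParams(
     '%s',
     {method: 'POST'},
     apoc.convert.toJson({elementId: elementId(n)})
   ) YIELD value
   RETURN count(*)",
  {phase: 'afterAsync'}
)
""" % WEBHOOK_URL

if __name__ == "__main__" and os.environ.get("NEO4J_URI"):
    from neo4j import GraphDatabase  # pip install neo4j

    driver = GraphDatabase.driver(
        os.environ["NEO4J_URI"],
        auth=(os.environ["NEO4J_USER"], os.environ["NEO4J_PASSWORD"]),
    )
    # Trigger procedures target the system database in Neo4j 5.
    with driver.session(database="system") as session:
        session.run(TRIGGER_CYPHER)
    driver.close()
```

My worry with this approach is that a dropped HTTP call silently loses the event, which is why I'd still want the slow poll as a safety net. Is there a more idiomatic Neo4j pattern for this?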