Event Generation
The Amazon S3 destination can generate events that you can use in an event stream. When you enable event generation, the Amazon S3 destination generates event records each time it completes writing to an object or streaming a whole file.
In Data Collector Edge pipelines, the destination does not generate event records after streaming a whole file.
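For illustration only, the sketch below models the kind of information an object-written event record carries so a downstream stage or executor can act on it. The field names used here (bucket, object key, record count) are assumptions for the example, not a definitive event record schema.

```python
# Illustrative sketch only: models the kind of data an S3 object-written
# event record might carry. Field names are assumptions, not the
# documented event record schema.
from dataclasses import dataclass


@dataclass
class S3ObjectWrittenEvent:
    bucket: str        # bucket that received the object
    object_key: str    # key of the closed object
    record_count: int  # number of records written to the object


def describe(event: S3ObjectWrittenEvent) -> str:
    """Build a short summary string, e.g. for a custom email body."""
    return (f"Wrote {event.record_count} records to "
            f"s3://{event.bucket}/{event.object_key}")


if __name__ == "__main__":
    sample = S3ObjectWrittenEvent("my-bucket", "sdc/output-000123.json", 5000)
    print(describe(sample))
```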
Amazon S3 events can be used in any logical way. For example:
- With the Amazon S3 executor to add tags to closed objects or whole files after receiving an event (see the sketch after this list).
- With the Spark executor to run a Spark application after receiving an event.
- With the Email executor to send a custom email after receiving an event.
For an example, see Sending Email During Pipeline Processing.
- With a destination to store event information.
For an example, see Preserving an Audit Trail of Events.
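As a rough illustration of the first bullet, the snippet below uses boto3 outside of Data Collector to attach a tag to a closed object identified by an event. The bucket, key, and tag values are hypothetical; within a pipeline, the Amazon S3 executor performs this kind of action for you when it receives the event.

```python
# Illustrative sketch only: shows the kind of action the Amazon S3 executor
# performs when it receives an object-written event. Bucket, key, and tag
# values here are hypothetical.
import boto3


def tag_closed_object(bucket: str, key: str) -> None:
    """Add a tag to an object that a pipeline finished writing."""
    s3 = boto3.client("s3")
    s3.put_object_tagging(
        Bucket=bucket,
        Key=key,
        Tagging={"TagSet": [{"Key": "pipeline-status", "Value": "closed"}]},
    )


if __name__ == "__main__":
    # These values would normally come from the event record.
    tag_closed_object("my-bucket", "sdc/output-000123.json")
```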
For more information about dataflow triggers and the event framework, see Dataflow Triggers Overview.