You need to create an audit log of all changes to customer banking data, which you store in DynamoDB. It is important not to lose any information due to server failures. What is an elegant way to accomplish this?
A.
Use a DynamoDB StreamSpecification and stream all changes to AWS Lambda. Log the changes to AWS CloudWatch Logs,
removing sensitive information before logging.
B.
Before writing to DynamoDB, do a pre-write acknowledgment to disk on the application server, removing sensitive information before
logging. Periodically rotate these log files into S3.
C.
Use a DynamoDB StreamSpecification and periodically flush to an EC2 instance store, removing sensitive information before putting
the objects. Periodically flush these batches to S3.
D.
Before writing to DynamoDB, do a pre-write acknowledgment to disk on the application server, removing sensitive information before
logging. Periodically pipe these files into CloudWatch Logs.
Explanation:
All of the suggested periodic options are sensitive to server failure during or between periodic flushes. Streaming the changes to Lambda and then logging to CloudWatch Logs makes the system resilient to instance and Availability Zone failures, so A is correct.
Reference: http://docs.aws.amazon.com/lambda/latest/dg/with-ddb.html
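For illustration, here is a minimal sketch of the Lambda handler that option A describes. It assumes the table's StreamSpecification uses StreamViewType NEW_AND_OLD_IMAGES, and the field names in SENSITIVE_FIELDS are hypothetical placeholders for your schema. Anything written via print (or the logging module) inside Lambda is delivered to CloudWatch Logs automatically, so no explicit CloudWatch API call is needed.

```python
import json

# Hypothetical attribute names to redact; adjust to your actual schema.
SENSITIVE_FIELDS = {"account_number", "routing_number", "ssn"}

def redact(image):
    """Replace sensitive attribute values in a DynamoDB stream image."""
    if image is None:
        return None
    return {k: ("[REDACTED]" if k in SENSITIVE_FIELDS else v)
            for k, v in image.items()}

def handler(event, context):
    # Each record describes one change: INSERT, MODIFY, or REMOVE.
    for record in event.get("Records", []):
        ddb = record["dynamodb"]
        entry = {
            "eventName": record["eventName"],
            "keys": ddb.get("Keys"),
            # Old/new images are present when the stream view type
            # is NEW_AND_OLD_IMAGES.
            "old": redact(ddb.get("OldImage")),
            "new": redact(ddb.get("NewImage")),
        }
        # print() output is captured by CloudWatch Logs automatically.
        print(json.dumps(entry))
```

The stream itself is enabled on the table via a StreamSpecification, and the function is attached with an event source mapping, so the audit path never touches a single application server's disk.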