You need to create an audit log of all changes to customer banking data. You use DynamoDB to store this customer
banking data. It’s important not to lose any information due to server failures. What is an elegant way to accomplish this?
A.
Use a DynamoDB StreamSpecification and stream all changes to AWS Lambda. Log the changes to AWS CloudWatch Logs,
removing sensitive information before logging.
B.
Before writing to DynamoDB, do a pre-write acknowledgment to disk on the application server, removing sensitive information before
logging. Periodically rotate these log files into S3.
C.
Use a DynamoDB StreamSpecification and periodically flush to an EC2 instance store, removing sensitive information before putting
the objects. Periodically flush these batches to S3.
D.
Before writing to DynamoDB, do a pre-write acknowledgment to disk on the application server, removing sensitive information before
logging. Periodically pipe these files into CloudWatch Logs.
Explanation:
All of the options that rely on periodic flushes are vulnerable to server failure during or between flushes. Streaming the changes to Lambda
and then logging them to CloudWatch Logs makes the system resilient to instance and Availability Zone failures.
http://docs.aws.amazon.com/lambda/latest/dg/with-ddb.html
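A minimal sketch of the Lambda side of option A, assuming a stream view type of NEW_AND_OLD_IMAGES and an illustrative list of sensitive attribute names (both are assumptions, not part of the question):

```python
import json
import logging

logger = logging.getLogger()
logger.setLevel(logging.INFO)

# Hypothetical attribute names to mask before logging (illustrative only)
SENSITIVE_FIELDS = {"AccountNumber", "SSN"}

def redact(image):
    """Return a copy of a DynamoDB item image with sensitive attributes masked."""
    return {k: ("***REDACTED***" if k in SENSITIVE_FIELDS else v)
            for k, v in image.items()}

def handler(event, context):
    # Each record describes one item-level change captured by the DynamoDB stream
    for record in event.get("Records", []):
        ddb = record["dynamodb"]
        entry = {
            "eventName": record["eventName"],            # INSERT / MODIFY / REMOVE
            "keys": ddb.get("Keys", {}),
            "oldImage": redact(ddb.get("OldImage", {})),
            "newImage": redact(ddb.get("NewImage", {})),
        }
        # Anything a Lambda function logs is delivered to its CloudWatch Logs log group
        logger.info(json.dumps(entry))
    return {"processed": len(event.get("Records", []))}
```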
Agreed A.
http://docs.aws.amazon.com/amazondynamodb/latest/developerguide/Streams.html
“DynamoDB Streams enables solutions such as these, and many others. DynamoDB Streams captures a time-ordered sequence of item-level modifications in any DynamoDB table, and stores this information in a log for up to 24 hours. Applications can access this log and view the data items as they appeared before and after they were modified, in near real time”
Answer A.
A DynamoDB stream captures changes to the items stored in a DynamoDB table, and streaming the data to Lambda preserves it. (Answer C stores the data in an instance store, which is disk attached to the server, so the data would be lost in a disaster.)
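For reference, a sketch of enabling the stream itself on an existing table with boto3; the table name is illustrative, and NEW_AND_OLD_IMAGES is chosen because an audit log needs both the before and after versions of each item:

```python
import boto3

dynamodb = boto3.client("dynamodb")

# Enable a DynamoDB stream on an existing table (table name is hypothetical)
dynamodb.update_table(
    TableName="CustomerBankingData",
    StreamSpecification={
        "StreamEnabled": True,
        "StreamViewType": "NEW_AND_OLD_IMAGES",
    },
)
```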