Import/Export data from DynamoDB (small amounts)

I use these CLI commands for exporting and then later importing data from/to DynamoDB (small amounts of data – e.g. up to a few hundred items).

It does take a while to get data back into DynamoDB, as the import runs item by item rather than as a batch … but it gets the job done! (A batched alternative is sketched after the import command below.)

Export:

aws dynamodb scan --table-name source-table-name --no-paginate > data.json
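
Note that --no-paginate means the CLI returns only the first page of scan results (roughly 1 MB), which is fine for small tables. If the table holds more than that, one option is to drop the flag so the CLI auto-paginates and merges the Items for you – something like:

aws dynamodb scan --table-name source-table-name > data.json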

Import:

cat data.json | jq -c '.Items[]' | while read -r line; do aws dynamodb put-item --table-name destination-table-name --item "$line"; done
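
If the item-by-item loop is too slow, here's a rough sketch of a batched alternative using batch-write-item (which accepts up to 25 put requests per call). The table name is the same destination-table-name placeholder, and a proper version would also check UnprocessedItems in each response and retry them:

jq -c --arg table destination-table-name '
  [.Items[] | {PutRequest: {Item: .}}]   # wrap each exported item as a PutRequest
  | to_entries
  | group_by(.key / 25 | floor)          # chunk into groups of 25 (the batch-write-item limit)
  | map({($table): map(.value)})[]       # one request-items object per chunk
' data.json | while read -r batch; do
  aws dynamodb batch-write-item --request-items "$batch"
done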

This can be done in one line as well:

aws dynamodb scan --table-name source-table-name --no-paginate | jq -c '.Items[]' | while read -r line; do aws dynamodb put-item --table-name destination-table-name --item "$line"; done

Credit goes to: https://github.com/guillaumesmo

Cleaning up old DynamoDB Auto-Scaling Resources

I’ve found a strange problem with CloudFormation rollbacks, which don’t automatically remove any Application Auto Scaling resources you might have set up.

This means that when you next deploy, CloudFormation starts complaining about the resources already existing!

To clean these up, run the following from the command line, using the AWS CLI.

List the scalable targets:

aws application-autoscaling describe-scalable-targets --service-namespace dynamodb
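
The output can be a bit verbose; to pull out just the bits needed for the next step, the standard --query option can be used (this should work as-is, but adjust the fields to taste):

aws application-autoscaling describe-scalable-targets --service-namespace dynamodb --query 'ScalableTargets[].{Resource:ResourceId,Dimension:ScalableDimension}' --output table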

From there, deregister (remove) each of the ones that shouldn’t be there:

aws application-autoscaling deregister-scalable-target --service-namespace dynamodb --resource-id "table/myTableName" --scalable-dimension "dynamodb:table:ReadCapacityUnits"

aws application-autoscaling deregister-scalable-target --service-namespace dynamodb --resource-id "table/myTableName" --scalable-dimension "dynamodb:table:WriteCapacityUnits"
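
If you prefer, the same two commands can be wrapped in a small loop over both dimensions (myTableName is a placeholder for your table name, as above):

TABLE=myTableName
for DIM in dynamodb:table:ReadCapacityUnits dynamodb:table:WriteCapacityUnits; do
  aws application-autoscaling deregister-scalable-target --service-namespace dynamodb --resource-id "table/$TABLE" --scalable-dimension "$DIM"
done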

That’s it!