This blog post documents what I learned from a data migration from Dynamics CRM on-premises to Dynamics 365 online, using the KingswaySoft SSIS Toolkit.
The data migration consisted of 40 entities – three of which contained over 1 million records. With the current API limits imposed by Dynamics 365, there were some challenges to overcome.
Dynamics 365 imposes API limits both per 24-hour period and over a five-minute sliding window. If you hit the servers too hard within the five-minute window, the server throws concurrency errors back at you, and the response includes the number of milliseconds to wait before retrying the request.
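The retry behaviour described above can be sketched in a few lines. This is a minimal illustration, not KingswaySoft's implementation: `send_request` and `ThrottledError` are hypothetical stand-ins for whatever your client uses to issue a request and surface the server-supplied wait time in milliseconds.

```python
import time

class ThrottledError(Exception):
    """Hypothetical error carrying the server-suggested wait in milliseconds."""
    def __init__(self, retry_after_ms):
        super().__init__(f"throttled, retry after {retry_after_ms} ms")
        self.retry_after_ms = retry_after_ms

def call_with_retry(send_request, max_attempts=5):
    """Call send_request(), honouring the server-supplied delay on throttling."""
    for attempt in range(1, max_attempts + 1):
        try:
            return send_request()
        except ThrottledError as err:
            if attempt == max_attempts:
                raise  # give up after the final attempt
            # Wait the number of milliseconds the server asked for, then retry.
            time.sleep(err.retry_after_ms / 1000.0)
```

The key point is that the wait is dictated by the server's response rather than a fixed client-side back-off, so the client slows down exactly as much as the service asks it to.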
It is important to use the latest version of the KingswaySoft SSIS Toolkit for Dynamics 365 for your data migrations. The latest version includes built-in retry logic to minimise the concurrency issues described above. However, this doesn't mean API limit issues are completely eliminated, as a retried request can also fail.
The KingswaySoft FAQ contains useful information on performance-tuning your integrations and data migrations. I found that a batch size of 10 with 10 threads gave me the best throughput, but for my large entities (over 1 million records) I would run into concurrency issues.
I had to reduce the number of threads and increase the batch size to slow down the migration and reduce the number of requests. Of course, this adds more time to the overall process, but data accuracy is more important than performance in this case.
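The trade-off between batch size and request volume comes down to simple arithmetic: each batch is one API request, so a larger batch size means fewer requests hitting the limits. The figures below are illustrative, not KingswaySoft defaults:

```python
from math import ceil

def request_count(records, batch_size):
    # Each batch submitted to the server counts as one API request.
    return ceil(records / batch_size)

# For an entity with 1 million records:
# batch size 10  -> 100,000 requests (fast, but more likely to throttle)
# batch size 100 ->  10,000 requests (slower per batch, gentler on the limits)
fast = request_count(1_000_000, 10)
gentle = request_count(1_000_000, 100)
```

Fewer, larger requests reduce pressure on the five-minute window at the cost of slower failure isolation within each batch, which is why slowing the migration down traded time for reliability in my case.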
Another strategy is to run components as different users. For example, I have one destination component to create Invoices and another to create Invoice Lines. I found that by creating Invoices as one user and Invoice Lines as another user, the concurrency issues were minimised.
You can raise a support ticket with Microsoft to increase (but not remove) the API limits on a Dynamics 365 instance. When this was done, I found far fewer concurrency issues being reported, but I still had to fiddle with batch and thread settings.
There is one very important thing to note. Because I was dealing with millions of records, I found it faster to reset the instance between test runs of the migration rather than run bulk deletion jobs to delete the data that had already gone in. However, a reset instance gets a new organisation ID, and the API limit increases granted by Microsoft will not carry over – they apply only to the old (disabled) instance. I had to ask Microsoft to apply the same changes to the reset instance.
Although there are many different scenarios for data migration, it is important to do the following: