
Backing Up Resolve Databases & Your Computer

February 20, 2018

One of our members asks about backing up Resolve databases and computers. This is an essential skill for colorists; listen in to hear how we do it.


Series

From The MailBag Episode 45: Back It Up!

We’ve all been there – lost projects, media offline, a knot in your throat, and a sour stomach as you realize you might be screwed.

If you’ve worked in production or post long enough, you know that backing up projects and media is an essential life skill! In this episode of From The Mailbag, we answer a question from Mixing Light member Per:

‘Hi, could you help me set up a backup system on my “film computer”? Can it be done with some sort of remote program? I lost a project just now and I am afraid of losing more, and plus, I’m a subscriber 🙂 Please help! Regards, Per’

Thanks for the question, Per! A good backup routine is vital to surviving the always-looming computer or storage crash. Not only does a backup (hopefully) prevent pain-in-the-butt situations; in many cases it can mean the difference between getting paid or not, or between keeping a client and losing one. So yes, backing up your databases and your computer is important!

In this episode, we discuss backups for both Resolve databases and full systems, and cover different ways to handle each task.

Got a question or something you’d like Team Mixing Light to discuss?

Remember, if you have questions you’d like to get our opinion on, please use the contact form. Your questions can be aesthetic, technical, or even client-related. We’d love to hear from you, and your question might make a future episode of From The Mailbag.

Enjoy the Mailbag!

-Team Mixing Light


Comments


  • For those using Resolve’s PostgreSQL database option, the Insight mentioned at 5:17, about Dan’s way to automate the backups, is ML0223: http://bit.ly/2BG2T3x
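
    If you want a starting point for that kind of automation, here is a minimal sketch: a Python script that calls pg_dump and can be scheduled with cron or launchd. The database name, user, and paths are assumptions, not the setup from the Insight; for unattended runs you will also want a ~/.pgpass file so pg_dump doesn’t prompt for a password.

```python
# Minimal sketch: dump a Resolve PostgreSQL database to a timestamped file.
# Assumptions: pg_dump is on the PATH, the database is named "resolve_projects",
# and the "postgres" user can connect. Adjust all of these for your own setup.
import subprocess
from datetime import datetime
from pathlib import Path

BACKUP_DIR = Path("/Volumes/Backup/resolve_db")  # hypothetical backup volume
DB_NAME = "resolve_projects"                     # hypothetical database name

BACKUP_DIR.mkdir(parents=True, exist_ok=True)
outfile = BACKUP_DIR / f"{DB_NAME}_{datetime.now():%Y%m%d_%H%M%S}.backup"

# -Fc = PostgreSQL's compressed "custom" format, restorable with pg_restore.
subprocess.run(
    ["pg_dump", "-U", "postgres", "-h", "localhost", "-Fc", "-f", str(outfile), DB_NAME],
    check=True,
)
print(f"Backup written to {outfile}")
```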

  • Hi guys,

    Thanks for spending time on this important subject, which can be a challenge, or worse, an oversight, especially for beginners, but even for advanced Resolve users, who are typically far from being IT engineers.
    For me it’s my core job: I have been, and still am, working for about 20 years as an IT specialist/architect for some of the world’s largest corporations, advising them on and building backup, recovery, and disaster recovery solutions. Most of these are core financial systems, which require an extremely rigid approach, with billions at stake in case the dark mass hits the ventilator, so to speak.
    One thing that always, without a single exception (and here again), strikes me when discussing this subject, and it is not specific to the industry I am in, is that the approach always seems backwards.
    You talk about backups left, backups right, etc. My advice, and one which tends to lead to solutions that actually give you what you want/need rather than ones based on the tools at hand, is to focus on “restore”.
    There are 3 key questions you need to ask yourself, and the answers will then lead to the technical solutions/software/methods. Not the other way around, where you start by looking at available software and seeing what it can do for you.
    1. How much data (in time) can you afford to lose, at most, before your business is seriously affected?
    2. How much time can you afford to spend restoring your data before being fully operational again, before your business is seriously affected?
    3. How long do you need to store your backups? (Auditing/tax/etc. reasons come into play here.)

    Now, the first reply you get is instinctive:
    1: none.
    2: zero.
    3: forever

    That is fine if it is really true, which it hardly ever is, but then the resulting solution will probably outrun your entire yearly earnings, be extremely complex, and be overkill compared to what you actually should have answered. So be honest. The scale between both ends tends to be exponential in cost, complexity, and implementation/maintenance effort.
    Questions 1 and 2 together determine complexity and cost, while 3 is purely a storage cost, which nowadays is not the biggest factor anymore and, for the typical size of a Resolve database, the least important point (a different story for DBs of hundreds of TBs with archives on LTO tapes spread over multiple datacenters).
    For colorists these questions can even be asked on a per-project basis, as the answers may be vastly different depending on whether you are working on your aunt’s cat movie or on the next Marvel sequel. A regular re-evaluation of these core questions won’t hurt.

    Realistic values, for example, for a multi-billion company’s prime financial database that I recently built a solution for are 1: 5 minutes, 2: 4 hours, 3: 3 years.
    This leads to a solution that can guarantee these numbers and is tested/re-evaluated twice per year against these numbers and for absolute functionality.

    Then, once you have defined your key target numbers and have built your solution, you need to test, test, and re-test regularly.
    At least 1 to 2 times per year, do a full, complete system restore test simulating a total loss. Separately, you can test sub-segments of your solution more regularly.
    Without testing your solution, it is worth absolutely nothing. (Sort of like the old IT rule that one backup is no backup, but reasoned from a restore point of view.)
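
    As one concrete (and hypothetical) example of such a sub-segment test for a PostgreSQL-backed Resolve setup, the sketch below restores the newest pg_dump file into a scratch database, without touching production. The database and user names and the paths are assumptions.

```python
# Minimal sketch of a restore test: load the newest backup into a scratch
# database. Names and paths are assumptions; adjust for your own setup.
import subprocess
from pathlib import Path

BACKUP_DIR = Path("/Volumes/Backup/resolve_db")  # hypothetical backup volume
SCRATCH_DB = "resolve_restore_test"              # throwaway database for the test

latest = max(BACKUP_DIR.glob("*.backup"))  # timestamped names sort chronologically

# Recreate the scratch database from zero, then restore into it.
subprocess.run(["dropdb", "-U", "postgres", "--if-exists", SCRATCH_DB], check=True)
subprocess.run(["createdb", "-U", "postgres", SCRATCH_DB], check=True)
subprocess.run(["pg_restore", "-U", "postgres", "-d", SCRATCH_DB, str(latest)], check=True)
print(f"Restore of {latest.name} into {SCRATCH_DB} succeeded")
```

    A full test would go further: connect Resolve to the scratch database and actually open a project from it.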

    This approach can be used by everyone, from a single-laptop user to enterprise-size setups.

    For the more advanced users:
    I guess Dan has experimented with this in the past, as he mentioned incremental backups. I want to point out 2 major advantages/features of using a PostgreSQL database setup.
    First, due to the way modern databases (essentially a separate management layer for your data) handle changes, they are very resilient against crashes of the software on top (e.g. Resolve). If something goes wrong mid-(auto)save, the database will revert the so-called uncommitted changes that belong together, and you are left with a consistent state. Disk databases do not have this feature, as they rely purely on OS-level read/write activity that does not understand the transactional relationships between a set of changes, so they are very sensitive to DB corruption from software hiccups.
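
    To make the “uncommitted changes” idea concrete, here is a minimal sketch using Python’s psycopg2 driver. The “projects” table is hypothetical; Resolve’s internal schema is not assumed here.

```python
# Minimal sketch of transactional atomicity against a hypothetical table.
import psycopg2  # pip install psycopg2-binary

conn = psycopg2.connect(host="localhost", dbname="demo_db", user="postgres")
cur = conn.cursor()

# Both statements belong to one transaction; neither is permanent yet.
cur.execute("UPDATE projects SET name = 'Grade v2' WHERE id = 1")
cur.execute("UPDATE projects SET modified = now() WHERE id = 1")

# If the application crashed here -- before commit -- PostgreSQL would discard
# both updates together, leaving the database in its previous consistent state.
conn.commit()
conn.close()
```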
    Second, you can set up the PostgreSQL database so that it generates small delta files (generally called DB log files or archive logs), which contain only the changes since the previous delta (or since the full backup, if it is the first one). These can be used to restore your database to almost any point in time, provided you keep these files together with regular full database backups made in the proper, compatible way. They can be created based on a specific time window that has passed (e.g. every 5 minutes), on an amount of data (e.g. 1 MB of changes), or on a combination of both.
    Ideally, of course, you back these up to an external location (so not on the same server).
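
    In PostgreSQL this mechanism is WAL (write-ahead log) archiving. As a minimal sketch, the settings below are real PostgreSQL parameters, set here via ALTER SYSTEM (available since 9.4); the archive path is an assumption, and on releases older than 9.6 wal_level would be 'hot_standby' rather than 'replica'.

```python
# Minimal sketch: enable WAL archiving so finished log segments get copied to
# an archive volume. Requires superuser rights, and a server restart for
# archive_mode to take effect. The archive path is hypothetical.
import psycopg2

conn = psycopg2.connect(host="localhost", dbname="postgres", user="postgres")
conn.autocommit = True  # ALTER SYSTEM cannot run inside a transaction block
cur = conn.cursor()

cur.execute("ALTER SYSTEM SET wal_level = 'replica'")
cur.execute("ALTER SYSTEM SET archive_mode = 'on'")
# Copy each finished WAL segment (16 MB by default) to an external volume.
cur.execute("ALTER SYSTEM SET archive_command = 'cp %p /Volumes/Backup/wal_archive/%f'")
# Force a segment switch at least every 5 minutes, matching the example above.
cur.execute("ALTER SYSTEM SET archive_timeout = '5min'")
conn.close()
```

    Combined with periodic base backups (pg_basebackup), these archived segments are what make point-in-time recovery possible.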

    If you want to go full OCD (like myself, though I am far beyond OCD, as it’s my main job), especially if you have a short window to restore (question 2 above), you can have these files automatically feed a second database that then runs in so-called standby mode.
    When your primary DB system dies, you just start up the second one, connect Resolve to it, and continue your work within seconds/minutes as if nothing happened. Happy customer guaranteed (assuming he is happy with the grade so far). This concept works at any scale: I have it running on 2 Mac minis, but it could be 2 Windows NUCs or 2 (or more) Linux behemoths spread over different locations, depending on your needs.
    You can use that same second system for restore tests, or just to recover a project you lost from a DB 2 months ago, without disturbing your primary database. Add a bit of automated monitoring/alerting on top, and you are hard to kill.
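
    For the curious, here is a minimal sketch of bootstrapping such a standby on a second machine. The host name, replication user, and paths are assumptions; the flags shown are for PostgreSQL 12+ (older releases use recovery.conf instead of standby.signal, and --xlog-method instead of --wal-method).

```python
# Minimal sketch: clone the primary into a local standby data directory.
# Assumes PostgreSQL client tools are installed and a replication role named
# "replicator" exists on the (hypothetical) primary host.
import subprocess

STANDBY_DATA = "/usr/local/pgsql/standby_data"  # hypothetical data directory

subprocess.run([
    "pg_basebackup",
    "-h", "primary-mini.local",  # hypothetical primary host
    "-U", "replicator",          # hypothetical replication role
    "-D", STANDBY_DATA,
    "--wal-method=stream",       # stream WAL while the copy runs
    "-R",  # write standby.signal + primary_conninfo so the copy starts as a standby
    "-P",  # show progress
], check=True)

# Then start PostgreSQL on this machine against STANDBY_DATA; it keeps replaying
# the primary's WAL. On failover, promote it (pg_ctl promote) and point
# Resolve's database connection at this host instead.
```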

    Anyway, if you ever need advice on more complex setups, just ping me.

  • My data-backup obsessiveness is pretty much at 11 as well, but it has saved me numerous times, from both hardware failure and accidental operator error. I’m very excited about the concept of Live Save, but nervous that in practice a project will open up as corrupt after a crash. It seems like a crash would trash the database as well. Am I just stuck in a disk-database mentality?

    For Windows 10 image-based backup and recovery (similar to CCC), I’m currently using Paragon HDD Manager 16, but I’ve also had great results with Acronis True Image 20xx. I just happened to already be using Paragon v15 and have archives that I don’t want to make obsolete. For a free solution, there is a pretty great image-backup app built right into Windows 10, called “Backup and Restore (Windows 7)”. It lets you create scheduled system-image backups, although it’s pretty basic settings-wise. Macrium Reflect is another “free” solution, but it inches a bit too close to the “adware pushing every other application they sell” category for my comfort, at least for a critical system utility.

    For disk databases, you COULD use the “File History” feature, but it’s not worth the hit in system performance, IMO. Rather, just use a SQL database, especially now that they include such a pretty GUI for backup and restore.


  • Seth Goldin

    I’ve created a tool that lets you easily back up your PostgreSQL databases for Resolve on a schedule, but also lets you optimize them on a schedule.

    Right now it’s designed to run on any PostgreSQL server running macOS 10.12.6 or macOS 10.13.4, but I should have it working on CentOS Linux very soon.

    https://github.com/sethgoldin/davinci-resolve-postgresql-workflow-tools

