This is brilliant @benyoung! My thoughts:

"Would you use this sort of thing, and what types of data would you want to see?"
Yes. I'd want more info on the VMs being backed up by a job.

"What questions would you ask it?"
For example, "Can you tell me which VMs are being backed up by this particular job?" Alternatively, "Is this VM being backed up by any particular job?" Or, "Can you tell me which job is going to be starting next?"

"Would you let this automate tasks? i.e. start/stop jobs, modify jobs, create repositories?"
Possibly only to start jobs, unless only certain people are able to make changes to jobs and there is an audit trail available.

"Would you want it to generate anything for you? Code? Diagrams?"
Perhaps CLI commands that can be used to automate jobs, or a topology of what is connected to the VBR, as an example.

Cheers, Dipen

Great feedback, thanks Dips!
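The VM-to-job questions above boil down to a simple lookup over job configurations once you've fetched them from the API. A minimal sketch in Python (the job/VM structure and field names here are illustrative, not the actual VBR API schema):

```python
# Sketch: answering "which VMs does this job back up?" and
# "is this VM protected by any job?" from job data you've
# already pulled from the VBR REST API. The dicts below are
# hypothetical, not the real API response shape.

def vms_in_job(jobs, job_name):
    """Return the VMs included in a named backup job."""
    for job in jobs:
        if job["name"] == job_name:
            return job["vms"]
    return []

def jobs_protecting(jobs, vm_name):
    """Return every job that includes the given VM."""
    return [job["name"] for job in jobs if vm_name in job["vms"]]

jobs = [
    {"name": "Daily-Prod", "vms": ["web01", "db01"]},
    {"name": "Weekly-Lab", "vms": ["lab01", "web01"]},
]

print(vms_in_job(jobs, "Daily-Prod"))   # ['web01', 'db01']
print(jobs_protecting(jobs, "web01"))   # ['Daily-Prod', 'Weekly-Lab']
```

An unprotected-VM report ("which VMs are in no job at all?") falls out of the same index by diffing against your vCenter inventory.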
Nice one! Just a heads up: on that link you shared, under "Get Started" I'm currently getting a couple of broken images!

Nice catch, I forgot to remove the additional content blocks. Removed, and the Docker image has been updated as the source/demo site.
Yep, this is coming to v11 @vNote42 - brand new API that is decoupled from Enterprise Manager - woohoo!
In my demo there, PowerShell is really only used to visually trigger the mount; unfortunately there is no API call for that, but you could easily trigger it programmatically, which I have done for another demo. Once mounted, my .NET Core (C#) code takes over the rest, so this could be Python in your case.

Sounds awesome, and at a scale where I know you will have some (exciting) challenges to solve :)
Hey @vNote42 - I've been playing with the beta of v11. The new API in there is still very much under development, but the good news is we can finally get this detail. Here is an example from one of my lab jobs, returning the session log:

{
  "totalRecords": 21,
  "records": [
    {
      "id": 21,
      "status": "ESucceeded",
      "startTime": "2020-12-03T00:08:47.3265379+13:00",
      "updateTime": "2020-12-03T00:08:47.3265379+13:00",
      "title": "Job finished at 3/12/2020 12:08:47 AM",
      "description": ""
    },
    {
      "id": 20,
      "status": "ESucceeded",
      "startTime": "2020-12-03T00:08:47.1859263+13:00",
      "updateTime": "2020-12-03T00:08:47.1859263+13:00",
      "title": "Primary bottleneck: Source",
      "description": ""
    },
    {
      "id": 19,
      "status": "ESucceeded",
      "startTime": "2020-12-03T00:08:47.1859263+13:00",
      "updateTime": "2020-12-03T00
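Once you have that session-log payload back from the API, picking out the interesting records is plain JSON handling. A small sketch in Python (the payload is a completed, shortened version of the response above; the non-succeeded status value is hypothetical, added only so the filter has something to find):

```python
import json

# Parse a session-log response like the one shown above and pull out
# the steps that did not succeed -- handy when you only want to
# surface problems. Field names match the v11 beta payload; the
# "EFailed" record is invented for illustration.
payload = """
{
  "totalRecords": 3,
  "records": [
    {"id": 21, "status": "ESucceeded",
     "title": "Job finished at 3/12/2020 12:08:47 AM"},
    {"id": 20, "status": "ESucceeded",
     "title": "Primary bottleneck: Source"},
    {"id": 19, "status": "EFailed",
     "title": "Example failed step"}
  ]
}
"""

log = json.loads(payload)
failed = [r["title"] for r in log["records"] if r["status"] != "ESucceeded"]
print(failed)  # ['Example failed step']
```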
At that scale you're certainly going to need a solution that scales out multiple parts of the infrastructure; I believe there are limitations around how many sessions can be loaded simultaneously, for starters. You'll also want an efficient way of traversing the filesystem that excludes already-processed files and files that can't have classification performed. Are you going to be performing the analysis on the mount server, or have the files transit elsewhere to be processed, with the result stored? I'd love to know more about what you're trying to achieve, and what tools and frameworks you would potentially be looking at as part of your pipeline (even from a queuing/processing perspective). What's the approximate size of the dataset? So many questions!
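The "skip already-processed files" idea above can be sketched with a fingerprint set: record (path, mtime, size) for every file you've classified and only yield files you haven't seen. A minimal Python sketch (persisting the set, e.g. to a database, is left out; the excluded extensions are illustrative stand-ins for files classification can't handle):

```python
import os

# Illustrative filter for files that can't be classified.
EXCLUDED_EXTS = {".iso", ".vbk", ".tmp"}

def unprocessed_files(root, seen):
    """Yield files under root that aren't excluded and haven't been
    seen before. `seen` is a set of (path, mtime_ns, size) tuples;
    it is updated in place so a second pass yields nothing new."""
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            if os.path.splitext(name)[1].lower() in EXCLUDED_EXTS:
                continue
            path = os.path.join(dirpath, name)
            st = os.stat(path)
            fingerprint = (path, st.st_mtime_ns, st.st_size)
            if fingerprint in seen:
                continue
            seen.add(fingerprint)
            yield path
```

Because the fingerprint includes mtime and size, a file that changes between backups is picked up again, while unchanged files are skipped for free.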
Last I checked, that granular log of per-disk read rates was not available. I have looked in the past at sourcing this info directly from the database or scraping job log files, but in the end that information wasn't overly useful for our use case: what our customers really want to see is success/fail status, start/end times of the job and of each individual VM, and the type of restore point (less so).
You got the right idea there @BertrandFR , I've nearly got something in that space working and will share once I find some time to finish it off. So much scope with these instant recovery features as well as the Data Integration API, uses for these are only going to explode over the next wee while, what a time to be alive!
Your other option here would be to use the API that's part of B&R Enterprise Manager. This is what I do to create custom backup job reports on our multi-tenant platform, so yours would be even easier: you could pull out exactly the information you want and present it your way (even via email, or a dashboard). Seen below is the dashboard for a test tenant on our portal, pulling data via the API (the slide-out panel appears when you click the little gear icon on the main table).
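Turning per-session records from the API into the per-job summary rows a dashboard table needs is a small aggregation step. A hedged sketch in Python (the session dicts and field names are illustrative, not the real Enterprise Manager schema):

```python
from collections import defaultdict

def job_summary(sessions):
    """Group session records by job and compute run counts and a
    success rate for each -- the kind of row a dashboard table shows.
    The "jobName"/"result" fields are hypothetical placeholders."""
    rows = defaultdict(lambda: {"runs": 0, "successes": 0})
    for s in sessions:
        row = rows[s["jobName"]]
        row["runs"] += 1
        row["successes"] += s["result"] == "Success"
    return {job: {**r, "successRate": r["successes"] / r["runs"]}
            for job, r in rows.items()}

sessions = [
    {"jobName": "Daily-Prod", "result": "Success"},
    {"jobName": "Daily-Prod", "result": "Failed"},
    {"jobName": "Weekly-Lab", "result": "Success"},
]
print(job_summary(sessions)["Daily-Prod"]["successRate"])  # 0.5
```

The same rows feed an email report or a web dashboard equally well; only the presentation layer changes.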