If you ever follow any of our tips, blogs, or videos, please follow this one: enable multi-factor authentication on every application and website you access.
If you are procuring a new application or service, now is also a great time to verify that it supports enforcing multi-factor authentication.
Hello, everyone! This is John Sterrett from Procure SQL. Today, we will discuss how you can validate SQL Server backups with a single line of PowerShell.
Due to the recent global IT outage, I thought this would be an excellent time to focus on the last line of defense: your database backups. If you are not validating your SQL Server backups today, I have good news.
dbatools Is Your Friend
Did you know you can validate your backups with a single PowerShell line? This is just one of several amazing things you can do with dbatools in a single line of PowerShell.
John, what do you mean by validating SQL Server backups?
I mean, find your last backup
See if the backup still exists
Restore the previous backup(s)
Run an Integrity Check
Document the time it took along the way
Validating SQL Server Backups – How Do We Validate Our Backups?
The dbatools module includes a command named Test-DbaLastBackup.
You can run it against all the databases on an instance with the single command shown below.
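Here is a minimal sketch of that one-liner. The instance name is a placeholder, so substitute your own:

# Validate the most recent backups for every database on the instance
# "SQL01" is a hypothetical instance name
Test-DbaLastBackup -SqlInstance "SQL01" | Format-Table

To test a single database instead, add the -Database parameter, for example -Database "CorruptionChallenge8".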
If we learned anything from the recent global IT downtime, it's to validate and test everything!
I love to see what’s happening under the hood, so I set up an extended event trace to capture all the SQL statements running. I can see the commands used to find the backups, the restore, the integrity check, and the dropping of the database created during the restore.
All the excellent details I found are shared below.
Extended Event
The following is the script for the extended event session. I run it to capture the events generated by my dbatools command in PowerShell. Once I start the extended event trace, I run the PowerShell command to do a single check on a database, as shown above. I then stop the capture and review the results.
CREATE EVENT SESSION [completed] ON SERVER
ADD EVENT sqlserver.sp_statement_completed(
ACTION(sqlserver.client_app_name,sqlserver.client_hostname,sqlserver.database_name,sqlserver.query_hash,sqlserver.session_id,sqlserver.sql_text,sqlserver.username)
WHERE ([sqlserver].[like_i_sql_unicode_string]([sqlserver].[client_app_name],N'dbatools PowerShell module%'))),
ADD EVENT sqlserver.sql_batch_completed(
ACTION(sqlserver.client_app_name,sqlserver.client_hostname,sqlserver.database_name,sqlserver.query_hash,sqlserver.session_id,sqlserver.sql_text,sqlserver.username)
WHERE ([sqlserver].[like_i_sql_unicode_string]([sqlserver].[client_app_name],N'dbatools PowerShell module%'))),
ADD EVENT sqlserver.sql_statement_completed(
ACTION(sqlserver.client_app_name,sqlserver.client_hostname,sqlserver.database_name,sqlserver.query_hash,sqlserver.session_id,sqlserver.sql_text,sqlserver.username)
WHERE ([sqlserver].[like_i_sql_unicode_string]([sqlserver].[client_app_name],N'dbatools PowerShell module%')))
ADD TARGET package0.event_file(SET filename=N'completed',max_file_size=(50),max_rollover_files=(8))
WITH (MAX_MEMORY=4096 KB,EVENT_RETENTION_MODE=ALLOW_SINGLE_EVENT_LOSS,MAX_DISPATCH_LATENCY=30 SECONDS,MAX_EVENT_SIZE=0 KB,MEMORY_PARTITION_MODE=NONE,TRACK_CAUSALITY=OFF,STARTUP_STATE=OFF)
GO
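To bracket the capture, you can start the session, run the test, and then stop the session. Here is a minimal sketch using dbatools' Invoke-DbaQuery; the instance name is a placeholder:

# Start the extended event session defined above
Invoke-DbaQuery -SqlInstance "SQL01" -Query "ALTER EVENT SESSION [completed] ON SERVER STATE = START"
# Run the single-database check while the trace is capturing
Test-DbaLastBackup -SqlInstance "SQL01" -Database "CorruptionChallenge8"
# Stop the session so the event file can be reviewed
Invoke-DbaQuery -SqlInstance "SQL01" -Query "ALTER EVENT SESSION [completed] ON SERVER STATE = STOP"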
Validate SQL Server Backups – Resulting Events / Statements
Here, you can see that we captured many SQL statements during this trace. Below, I will focus on the key ones that prove what happens when you run this command against your database(s).
The query used to build up the backup set to find the last backup was too big to screenshot, so I included it below.
/* Get backup history */
SELECT
a.BackupSetRank,
a.Server,
'' as AvailabilityGroupName,
a.[Database],
a.DatabaseId,
a.Username,
a.Start,
a.[End],
a.Duration,
a.[Path],
a.Type,
a.TotalSize,
a.CompressedBackupSize,
a.MediaSetId,
a.BackupSetID,
a.Software,
a.position,
a.first_lsn,
a.database_backup_lsn,
a.checkpoint_lsn,
a.last_lsn,
a.first_lsn as 'FirstLSN',
a.database_backup_lsn as 'DatabaseBackupLsn',
a.checkpoint_lsn as 'CheckpointLsn',
a.last_lsn as 'LastLsn',
a.software_major_version,
a.DeviceType,
a.is_copy_only,
a.last_recovery_fork_guid,
a.recovery_model,
a.EncryptorThumbprint,
a.EncryptorType,
a.KeyAlgorithm
FROM (
SELECT
RANK() OVER (ORDER BY backupset.last_lsn desc, backupset.backup_finish_date DESC) AS 'BackupSetRank',
backupset.database_name AS [Database],
(SELECT database_id FROM sys.databases WHERE name = backupset.database_name) AS DatabaseId,
backupset.user_name AS Username,
backupset.backup_start_date AS Start,
backupset.server_name as [Server],
backupset.backup_finish_date AS [End],
DATEDIFF(SECOND, backupset.backup_start_date, backupset.backup_finish_date) AS Duration,
mediafamily.physical_device_name AS Path,
backupset.backup_size AS TotalSize,
backupset.compressed_backup_size as CompressedBackupSize,
encryptor_thumbprint as EncryptorThumbprint,
encryptor_type as EncryptorType,
key_algorithm AS KeyAlgorithm,
CASE backupset.type
WHEN 'L' THEN 'Log'
WHEN 'D' THEN 'Full'
WHEN 'F' THEN 'File'
WHEN 'I' THEN 'Differential'
WHEN 'G' THEN 'Differential File'
WHEN 'P' THEN 'Partial Full'
WHEN 'Q' THEN 'Partial Differential'
ELSE NULL
END AS Type,
backupset.media_set_id AS MediaSetId,
mediafamily.media_family_id as mediafamilyid,
backupset.backup_set_id as BackupSetID,
CASE mediafamily.device_type
WHEN 2 THEN 'Disk'
WHEN 102 THEN 'Permanent Disk Device'
WHEN 5 THEN 'Tape'
WHEN 105 THEN 'Permanent Tape Device'
WHEN 6 THEN 'Pipe'
WHEN 106 THEN 'Permanent Pipe Device'
WHEN 7 THEN 'Virtual Device'
WHEN 9 THEN 'URL'
ELSE 'Unknown'
END AS DeviceType,
backupset.position,
backupset.first_lsn,
backupset.database_backup_lsn,
backupset.checkpoint_lsn,
backupset.last_lsn,
backupset.software_major_version,
mediaset.software_name AS Software,
backupset.is_copy_only,
backupset.last_recovery_fork_guid,
backupset.recovery_model
FROM msdb..backupmediafamily AS mediafamily
JOIN msdb..backupmediaset AS mediaset ON mediafamily.media_set_id = mediaset.media_set_id
JOIN msdb..backupset AS backupset ON backupset.media_set_id = mediaset.media_set_id
JOIN (
SELECT TOP 1 database_name, database_guid, last_recovery_fork_guid
FROM msdb..backupset
WHERE database_name = 'CorruptionChallenge8'
ORDER BY backup_finish_date DESC
) AS last_guids ON last_guids.database_name = backupset.database_name AND last_guids.database_guid = backupset.database_guid AND last_guids.last_recovery_fork_guid = backupset.last_recovery_fork_guid
WHERE (type = 'D' OR type = 'P')
AND is_copy_only='0'
AND backupset.backup_finish_date >= CONVERT(datetime,'1970-01-01T00:00:00',126)
AND mediafamily.mirror='0'
) AS a
WHERE a.BackupSetRank = 1
ORDER BY a.Type;
The following is a screenshot of the results of validating a database with a good backup and no corruption.
What should I expect with a corrupted database?
Great question! I wondered the same thing, so I grabbed a corrupt database from Steve Stedman's Corruption Challenge and ran the experiment. I will admit my findings were not what I expected, either. This is why you shouldn't take candy from strangers or run scripts without testing them in non-production and validating their results.
After restoring the corrupted database, which had been backed up successfully, I performed a manual integrity check to validate that it would fail, as shown below.
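If you want to run that manual check yourself, here is a minimal sketch. The restored database name is an assumption based on the default prefix Test-DbaLastBackup applies to restored copies, so confirm the actual name on your instance:

# Manual integrity check against the restored copy (database name is assumed)
Invoke-DbaQuery -SqlInstance "SQL01" -Query "DBCC CHECKDB([dbatools-testrestore-CorruptionChallenge8]) WITH NO_INFOMSGS"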
Hopefully, you have a process or tool that monitors your SQL Server error log and alerts you to errors like these. If you duplicate this example, confirm that your process or tool picks up these Severity 16 corruption errors. I would validate that as well.
Validate SQL Server Backups – PowerShell Results
Okay, was I the only one who expected to see Failed as the status for the integrity check (DbccResult)?
Instead, it's blank, as I show below. So, when you process these results, make sure you check for anything other than Success rather than looking for a Failed status.
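Here is a minimal sketch of that check. The instance name is a placeholder, and RestoreResult and DbccResult are the output property names in recent dbatools versions:

# Flag any database whose restore or integrity check reported anything other than Success
$results = Test-DbaLastBackup -SqlInstance "SQL01"
$results | Where-Object { $_.RestoreResult -ne 'Success' -or $_.DbccResult -ne 'Success' }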
I had some other questions, too, which are answered on the official documentation page for the Test-DbaLastBackup command. I will list them below, along with a quick sketch of the relevant parameters after the list, but you can review the documentation to find the full answers.
What if I want to test only the last full backup?
What if I want to test the last full and the last differential?
What if I want to offload the validation process to another server?
What if I don't want to drop the database, but instead use this restore for non-production testing?
What if I want to run the integrity check as physical-only?
What if I want to performance test my restore process by changing the max transfer size and buffer count?
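As a quick sketch, most of those scenarios map to parameters on Test-DbaLastBackup. The parameter names below reflect recent dbatools versions and the values are placeholders; verify both against the official documentation:

# Offload validation to another server, keep the restored copy, and tune the restore
$params = @{
    SqlInstance     = "SQL01"  # server whose backups we are testing (placeholder)
    Destination     = "SQL02"  # offload the restore and integrity check to another server (placeholder)
    NoDrop          = $true    # keep the restored database for non-production testing
    MaxTransferSize = 4MB      # performance-test the restore by changing max transfer size...
    BufferCount     = 24       # ...and buffer count
}
Test-DbaLastBackup @params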
What questions did we miss that you would like answered? Let us know in the comments.
Why Did You Sponsor Kansas City Developers Conference?
This is a great question, which leads to a good story. Back in 2012, Jeff Strass and Michael Eaton gave a half-day workshop on Going Independent. It was this workshop that helped push John to start Procure SQL. It's an honor to sponsor this event, and it will always have a special place in the owner's heart.
Kansas City developers, need help with your data?
If you need any help with SQL, NoSQL, modern data warehousing, or reporting, we would love to chat with you.
Database Management Myths for Developers
Today John & Kon gave our Database Management Myths for Developers talk. You can find the slides below.
“Every business workflow in every enterprise will be engineered with GenAI at its core” -ServiceNow’s Bill McDermott
Microsoft Build 2024 focused on transformative advancements in AI, cloud computing, and developer tools. This year’s event showcased Microsoft’s commitment to pushing the boundaries of what’s possible.
AI and Copilots were the overwhelming theme. Even though AI has been mainstream for a while now, jumping on board today would still make you an early adopter and could give you an advantage within your market. With that said, let's delve into the key announcements and their implications for the data and application development space.
Copilots
Teams Copilot was introduced as a powerful enhancement for Microsoft Teams, designed to revolutionize the way teams collaborate. Leveraging advanced AI capabilities, Teams Copilot assists users by summarizing conversations, generating meeting agendas, and even drafting responses during discussions. This intelligent assistant integrates seamlessly within Teams, helping to streamline communication and enhance productivity by reducing the time spent on administrative tasks. With Teams Copilot, organizations can ensure that their teams are more focused on strategic initiatives, ultimately driving better outcomes and staying ahead of competitors.
You can now create and deploy custom AI agents with ease within Copilot Studio. Copilot Studio offers a robust set of tools for building intelligent agents that can automate complex tasks, streamline workflows, and enhance productivity. With the new agent capabilities, developers can design agents to interact seamlessly with various applications, providing users with context-aware assistance and real-time insights. These AI agents leverage advanced machine learning models and natural language processing to understand and respond to user inputs effectively. This allows businesses to create tailored solutions that can handle customer inquiries, manage routine tasks, and provide valuable data-driven insights, all while maintaining high levels of accuracy and efficiency.
Real-Time Intelligence in Microsoft Fabric
One of the most groundbreaking announcements was the introduction of Real-Time Intelligence within Microsoft Fabric. This end-to-end SaaS solution enables businesses to process high-volume, time-sensitive data at the point of ingestion, facilitating faster and more informed decision-making. Real-Time Intelligence is designed to support both low-code and code-rich experiences, making it a versatile tool for analysts and developers alike. For our data analytics team, this means we can build more responsive analytics solutions that provide immediate insights, enhancing our ability to drive strategic decisions based on your real-time data.
Enhancements in GitHub Copilot
GitHub Copilot, already a game-changer for developers, received significant upgrades with the introduction of new extensions. These extensions, developed by Microsoft and third-party partners, integrate seamlessly with services like Azure, Docker, and Sentry. For our custom app development projects, this means we can leverage natural language capabilities within Copilot to manage Azure resources, troubleshoot issues, and streamline our development workflows. This integration not only boosts productivity but also enhances the efficiency of our development processes.
Advances in Azure AI
Azure AI continues to evolve with the availability of GPT-4o, a multimodal AI model capable of processing text, images, and audio. Additionally, Microsoft introduced Phi-3-vision, a new model in the Phi-3 family, which is optimized for personal devices and offers powerful capabilities for text and image input. These models are accessible through Azure AI Studio, providing us with advanced tools to experiment and build innovative AI solutions. For our data analytics projects, these models offer new ways to interact with and analyze data, enabling us to identify patterns and gain insights that were previously out of reach. By leveraging these advanced AI tools, you can uncover hidden trends, make more informed decisions, and ultimately drive a greater strategic advantage against your competitors.
Smart Components
These components represent a significant leap forward in streamlining UI development within the .NET ecosystem. Smart Components are designed to automatically adapt to varying contexts and states, reducing the need for boilerplate code and extensive conditional logic. By leveraging advanced AI and machine learning, Smart Components can intelligently adjust their behavior and appearance based on real-time data and user interactions. This innovation simplifies the development process, enabling developers to create more dynamic and responsive applications with less effort. Smart Components can be particularly beneficial for building complex interfaces where different parts of the application need to interact seamlessly. They also enhance maintainability and scalability, as developers can rely on these components to handle many of the intricacies involved in state management and UI rendering.
Honorable Mentions
Additionally, .NET 9 Preview 4 was released, offering a glimpse into the future of the platform with numerous performance improvements, enhanced security features, and expanded support for cloud-native applications.
The announcement of C# 13 brought a host of new features aimed at making the language more expressive and user-friendly. Notable enhancements include improvements in pattern matching, interpolated string handlers, and extended lambda expressions, all designed to simplify coding and increase developer efficiency.
All of these advancements collectively underscore Microsoft's dedication to evolving the data and .NET ecosystems, making them an even more robust and efficient environment for developers building cutting-edge data solutions.
WebNN (Web Neural Network API) was highlighted as a cutting-edge technology designed to bring advanced machine learning capabilities directly to web applications. WebNN allows developers to run neural network models efficiently in the browser, enabling real-time AI-powered experiences without relying heavily on server-side processing.
Conclusion
These features promise to revolutionize the way we interact with data, build applications, and drive business success. Staying up-to-date on all of these developments is crucial for any company aiming to maintain a competitive edge in today’s fast-paced digital landscape.
At Procure SQL, we are dedicated to helping businesses harness these cutting-edge technologies. Whether you need to integrate AI capabilities, enhance your data analytics, or develop custom applications using .NET, our expertise can guide you through the process. Let us assist you in leveraging these new and upcoming features to stay ahead of the game and achieve your strategic objectives. Contact us to learn more about how we can support your data journey.
Free Azure and SQL Server Training in Austin, Texas!
On Saturday, March 9, 2024, SQL Saturday will be coming to Austin, Texas. SQL Saturday is a free training day around SQL Server, Azure, and the Microsoft Data Platform. If you want lunch, it’s $20. We will also have two all-day deep dive training classes on performance tuning and Microsoft Analytics on Friday, March 8, 2024, for $125.
Need a Remote DBA or Data Architect?
Have you got questions? Need some help? Are you curious to know the cost of procuring a Remote Data Architect?
Check out this quick video to see how you can start to load and transform your data with notebooks. Data engineering with Microsoft Fabric becomes easier once you understand how to leverage notebooks to your advantage.
If you are in Austin, Texas, on March 8 & 9, 2024, don't miss SQL Saturday Austin, where you can learn more about Microsoft Fabric, Power BI, SQL Server, and more.
Here are some things to know before you jump into serverless for Hyperscale. Serverless auto-pausing and resuming are not currently available in Hyperscale, and the provisioned compute tier may be less expensive if CPU or memory usage is high enough and sustained long enough.
Here is some news, videos, tips, and links the Procure SQL team would like to share!
Procure SQL made it out to their first trade show of 2024! Justin, Kon and John were at SQL Saturday Atlanta BI on February 10th.
Justin Cunningham gave a talk on Data Catalog: Visualizing Your Data Sprawl. John Sterrett gave a talk on Things to Know Before Going Independent.
The team is excited to be back on April 20th for SQL Saturday Atlanta. You can also catch them at SQL Saturday Austin on March 9th.
Justin Cunningham talked about Data Sprawl and Managing Your Metadata.
Procure SQL Team sponsored SQL Saturday Atlanta BI on February 10, 2024. Kon Melamud, John Sterrett, and Justin Cunningham had a great time meeting everyone.
Someone’s Dream Job
Dream of being a researcher for Microsoft's data systems? Good news: they're hiring.
In five minutes, Justin gives you a Microsoft Fabric overview to help you ingest, process, store, monitor, and govern your data with Microsoft’s new analytics tool. He will discuss user personas like Data Engineering and Data Science within Microsoft Fabric and how the personas will use tools like Data Activator, Power BI, Data Factory, Real-Time Analytics, Notebooks, Spark, Lakehouse, and more.
In five minutes, Justin at Procure SQL breaks down how Microsoft Fabric tools help you ingest, process, store, monitor, and govern your data with Microsoft’s new analytics tool.
Got Microsoft Fabric or Analytics Questions?
We would love to discuss your questions from watching this Microsoft Fabric overview video or anything related to the Microsoft Data Platform. Fill out the form below, and we will be in touch soon.