This year, several data breaches were caused by multi-factor authentication NOT being enabled.

Enable Multi-Factor Authentication, Please!

If you ever follow any of our tips, blogs, or videos, please follow this one and enable multi-factor authentication on all your applications and websites that you access.

If you are procuring a new application or service, now is also a great time to verify that it supports enforcing multi-factor authentication.

Hello, everyone! This is John Sterrett from Procure SQL. Today, we will discuss how you can validate SQL Server backups with a single line of PowerShell.

Due to the recent global IT outage, I thought this would be an excellent time to focus on the last line of defense—your database backups. I have good news if you are not validating your SQL Server backups today.

DbaTools Is Your Friend

Did you know you can validate your backups with a single PowerShell line? This is just one of several amazing things you can do with dbatools in a single line of PowerShell.

John, what do you mean by validating SQL Server backups?

  • I mean, find your last backup
  • See if the backup file still exists
  • Restore the previous backup(s)
  • Run an integrity check
  • Document the time each step takes along the way

Validating SQL Server Backups – How Do We Validate Our Backups?

The dbatools module has a command named Test-DbaLastBackup.

You could run it with the following command to check all the databases on the instance you provide.

$result = Test-DbaLastBackup -SqlInstance serverNameGoesHere

You could also have it run for a single database with a command similar to the one below.

$result = Test-DbaLastBackup -SqlInstance serverNameGoesHere -Database ProcureSQL
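Once the command finishes, the results land in $result. Here is a quick sketch of how to review them; the property names (RestoreResult, DbccResult, and so on) match the Test-DbaLastBackup output on recent dbatools versions, but confirm them with Get-Help Test-DbaLastBackup -Full if yours differ.

```powershell
# Run the validation for a single database and review the key columns.
$result = Test-DbaLastBackup -SqlInstance serverNameGoesHere -Database ProcureSQL

# FileExists tells you the backup file is still there; RestoreResult and
# DbccResult report the restore and integrity check; the *Elapsed columns
# document how long each step took.
$result |
    Select-Object Database, FileExists, RestoreResult, DbccResult,
        RestoreElapsed, DbccElapsed |
    Format-Table -AutoSize
```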

What happens with Test-DbaLastBackup?

Great question! If we learned anything from the recent global IT downtime, it’s to validate and test everything!

I love to see what’s happening under the hood, so I set up an extended event trace to capture all the SQL statements running. I can see the commands used to find the backups, the restore, the integrity check, and the dropping of the database created during the restore.

All the excellent things I found are shared below.

Extended Event

The following is the script for the extended event session. I run this to capture events created by my dbatools command in PowerShell. Once I start the extended event trace, I run the PowerShell command to do a single check on a database, as shown above. I then stop the capture and review the results.

CREATE EVENT SESSION [completed] ON SERVER 
ADD EVENT sqlserver.sp_statement_completed(
  ACTION(sqlserver.client_app_name,sqlserver.client_hostname,sqlserver.database_name,sqlserver.query_hash,sqlserver.session_id,sqlserver.sql_text,sqlserver.username)
    WHERE ([sqlserver].[like_i_sql_unicode_string]([sqlserver].[client_app_name],N'dbatools PowerShell module%'))),
ADD EVENT sqlserver.sql_batch_completed(
    ACTION(sqlserver.client_app_name,sqlserver.client_hostname,sqlserver.database_name,sqlserver.query_hash,sqlserver.session_id,sqlserver.sql_text,sqlserver.username)
    WHERE ([sqlserver].[like_i_sql_unicode_string]([sqlserver].[client_app_name],N'dbatools PowerShell module%'))),
ADD EVENT sqlserver.sql_statement_completed(
    ACTION(sqlserver.client_app_name,sqlserver.client_hostname,sqlserver.database_name,sqlserver.query_hash,sqlserver.session_id,sqlserver.sql_text,sqlserver.username)
    WHERE ([sqlserver].[like_i_sql_unicode_string]([sqlserver].[client_app_name],N'dbatools PowerShell module%')))
ADD TARGET package0.event_file(SET filename=N'completed',max_file_size=(50),max_rollover_files=(8))
WITH (MAX_MEMORY=4096 KB,EVENT_RETENTION_MODE=ALLOW_SINGLE_EVENT_LOSS,MAX_DISPATCH_LATENCY=30 SECONDS,MAX_EVENT_SIZE=0 KB,MEMORY_PARTITION_MODE=NONE,TRACK_CAUSALITY=OFF,STARTUP_STATE=OFF)
GO
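If you prefer to stay in PowerShell, dbatools can start and stop the trace for you as well. This is a sketch using the Start-DbaXESession and Stop-DbaXESession commands; verify the parameter names with Get-Help on your dbatools version.

```powershell
# Start the extended event session created above, run the backup test,
# then stop the capture so the event file can be reviewed.
Start-DbaXESession -SqlInstance serverNameGoesHere -Session completed

$result = Test-DbaLastBackup -SqlInstance serverNameGoesHere -Database ProcureSQL

Stop-DbaXESession -SqlInstance serverNameGoesHere -Session completed
```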

Validate SQL Server Backups – Resulting Events / Statements

Here, you can see we captured many SQL statements during this run. Below, I will focus on the key ones that prove what happens when you run this command against your database(s).

The query used to build up the backup set to find the last backup was too big to screenshot, so I included it below.

/* Get backup history */
SELECT
                        a.BackupSetRank,
                        a.Server,
                        '' as AvailabilityGroupName,
                        a.[Database],
                        a.DatabaseId,
                        a.Username,
                        a.Start,
                        a.[End],
                        a.Duration,
                        a.[Path],
                        a.Type,
                        a.TotalSize,
                        a.CompressedBackupSize,
                        a.MediaSetId,
                        a.BackupSetID,
                        a.Software,
                        a.position,
                        a.first_lsn,
                        a.database_backup_lsn,
                        a.checkpoint_lsn,
                        a.last_lsn,
                        a.first_lsn as 'FirstLSN',
                        a.database_backup_lsn as 'DatabaseBackupLsn',
                        a.checkpoint_lsn as 'CheckpointLsn',
                        a.last_lsn as 'LastLsn',
                        a.software_major_version,
                        a.DeviceType,
                        a.is_copy_only,
                        a.last_recovery_fork_guid,
                        a.recovery_model,
                        a.EncryptorThumbprint,
                        a.EncryptorType,
                        a.KeyAlgorithm
                    FROM (
                        SELECT
                        RANK() OVER (ORDER BY backupset.last_lsn desc, backupset.backup_finish_date DESC) AS 'BackupSetRank',
                        backupset.database_name AS [Database],
                        (SELECT database_id FROM sys.databases WHERE name = backupset.database_name) AS DatabaseId,
                        backupset.user_name AS Username,
                        backupset.backup_start_date AS Start,
                        backupset.server_name as [Server],
                        backupset.backup_finish_date AS [End],
                        DATEDIFF(SECOND, backupset.backup_start_date, backupset.backup_finish_date) AS Duration,
                        mediafamily.physical_device_name AS Path,
                        backupset.backup_size AS TotalSize,
                        backupset.compressed_backup_size as CompressedBackupSize,
                        encryptor_thumbprint as EncryptorThumbprint,
                        encryptor_type as EncryptorType,
                        key_algorithm AS KeyAlgorithm,
                        CASE backupset.type
                        WHEN 'L' THEN 'Log'
                        WHEN 'D' THEN 'Full'
                        WHEN 'F' THEN 'File'
                        WHEN 'I' THEN 'Differential'
                        WHEN 'G' THEN 'Differential File'
                        WHEN 'P' THEN 'Partial Full'
                        WHEN 'Q' THEN 'Partial Differential'
                        ELSE NULL
                        END AS Type,
                        backupset.media_set_id AS MediaSetId,
                        mediafamily.media_family_id as mediafamilyid,
                        backupset.backup_set_id as BackupSetID,
                        CASE mediafamily.device_type
                        WHEN 2 THEN 'Disk'
                        WHEN 102 THEN 'Permanent Disk Device'
                        WHEN 5 THEN 'Tape'
                        WHEN 105 THEN 'Permanent Tape Device'
                        WHEN 6 THEN 'Pipe'
                        WHEN 106 THEN 'Permanent Pipe Device'
                        WHEN 7 THEN 'Virtual Device'
                        WHEN 9 THEN 'URL'
                        ELSE 'Unknown'
                        END AS DeviceType,
                        backupset.position,
                        backupset.first_lsn,
                        backupset.database_backup_lsn,
                        backupset.checkpoint_lsn,
                        backupset.last_lsn,
                        backupset.software_major_version,
                        mediaset.software_name AS Software,
                        backupset.is_copy_only,
                        backupset.last_recovery_fork_guid,
                        backupset.recovery_model
                        FROM msdb..backupmediafamily AS mediafamily
                        JOIN msdb..backupmediaset AS mediaset ON mediafamily.media_set_id = mediaset.media_set_id
                        JOIN msdb..backupset AS backupset ON backupset.media_set_id = mediaset.media_set_id
                        JOIN (
                            SELECT TOP 1 database_name, database_guid, last_recovery_fork_guid
                            FROM msdb..backupset
                            WHERE database_name = 'CorruptionChallenge8'
                            ORDER BY backup_finish_date DESC
                            ) AS last_guids ON last_guids.database_name = backupset.database_name AND last_guids.database_guid = backupset.database_guid AND last_guids.last_recovery_fork_guid = backupset.last_recovery_fork_guid
                    WHERE (type = 'D' OR type = 'P')
                     AND is_copy_only='0' 
                    
                    AND backupset.backup_finish_date >= CONVERT(datetime,'1970-01-01T00:00:00',126)
                    
                     AND mediafamily.mirror='0' 
                    ) AS a
                    WHERE a.BackupSetRank = 1
                    ORDER BY a.Type;

The following is a screenshot of the results of validating a database with a good backup and no corruption.


What should I expect with a corrupted database?

Great question! I thought of the same one, so I grabbed a corrupt database from Steve Stedman’s Corruption Challenge and ran the experiment. I will admit my findings were not what I was expecting, either. This is why you shouldn’t take candy from strangers or run scripts without testing them in non-production and validating their results.

After restoring the corrupted database that had been successfully backed up, I performed a manual integrity check to validate that it would fail, as shown below.

Hopefully, you have a process or tool to monitor your SQL Server error log and alert you about errors like these. If you duplicate this example, your process or tool should pick up these severity 16 corruption errors. I would validate that as well.
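As a quick manual check, dbatools can read the error log for you too. Here is a sketch using the Get-DbaErrorLog command; the -After parameter and the keyword filter are assumptions worth confirming against your dbatools version.

```powershell
# Pull the last hour of SQL Server error log entries and look for
# corruption-related messages raised by the integrity check.
Get-DbaErrorLog -SqlInstance serverNameGoesHere -After (Get-Date).AddHours(-1) |
    Where-Object Text -Match 'corrupt|incorrect checksum|8939|824'
```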

Validate SQL Server Backups – PowerShell Results

Okay, was I the only one who expected to see Failed as the status for the integrity check (DBCCResult)?

Instead, it’s blank, as I show below. So, when you process these results, make sure your check looks for anything other than Success rather than looking only for Failed.
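In practice, that means filtering the results like this (a sketch; adjust the property names if your Test-DbaLastBackup output differs):

```powershell
# Flag any database whose restore or integrity check did not come back
# as "Success" - this also catches a blank DbccResult, which a check
# for "Failed" would miss.
$problems = $result | Where-Object {
    $_.RestoreResult -ne 'Success' -or $_.DbccResult -ne 'Success'
}

if ($problems) {
    $problems | Format-Table Database, RestoreResult, DbccResult -AutoSize
}
```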

I submitted a bug to the dbatools team and will post back here with any updates.

Other Questions….

I had some other questions, too, which are answered on the official documentation page for the Test-DbaLastBackup command. I will list them below, but you can review the documentation to find the answers.

  • What if I want to test only the last full backup?
  • What if I want to test the last full and last differential?
  • What if I want to offload the validation process to another server?
  • What if I don’t want to drop the database but use this to restore it for non-production testing?
  • What if I want to run a physical-only integrity check?
  • What if I want to do performance testing of my restore process by changing the max transfer size and buffer count?
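For reference, here are sketches of how I would expect a few of those scenarios to look. The parameter names (-IgnoreLogBackup, -Destination, -NoDrop, -MaxTransferSize, -BufferCount) appear in the dbatools documentation, but confirm them with Get-Help Test-DbaLastBackup before relying on them.

```powershell
# Skip log backups so only the last full (and differential) get tested.
Test-DbaLastBackup -SqlInstance serverNameGoesHere -IgnoreLogBackup

# Offload the restore and integrity check to another server.
Test-DbaLastBackup -SqlInstance serverNameGoesHere -Destination testServerGoesHere

# Keep the restored copy around for non-production testing.
Test-DbaLastBackup -SqlInstance serverNameGoesHere -Database ProcureSQL -NoDrop

# Tune the restore for performance testing.
Test-DbaLastBackup -SqlInstance serverNameGoesHere -MaxTransferSize 4MB -BufferCount 24
```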

What questions did we miss that you would like answered? Let us know in the comments.

Procure SQL was at the Kansas City Developers Conference to help people procure the right Data Architecture partner.
John and Kon at Kansas City Developer Conference

Why Did You Sponsor Kansas City Developers Conference?

This is a great question, which leads to a good story. Back in 2012, Jeff Strass and Michael Eaton gave a half-day workshop on going independent. It was this workshop that helped push John to start Procure SQL. It's an honor to sponsor this event, and it will always have a special place in the owner's heart.

Kansas City developers, do you need help with your data?

If you need any help with SQL, NoSQL, modern data warehousing, or reporting, we would love to chat with you.

Database Management Myths for Developers

Today John & Kon gave our Database Management Myths for Developers talk. You can find the slides below.

“Every business workflow in every enterprise will be engineered with GenAI at its core” -ServiceNow’s Bill McDermott

Microsoft Build 2024 focused on transformative advancements in AI, cloud computing, and developer tools. This year’s event showcased Microsoft’s commitment to pushing the boundaries of what’s possible.

AI and Copilots were the overwhelming theme. Even though AI has been mainstream for a while now, jumping on board today would still make you an early adopter and could give you an advantage within your market. With that said, let’s delve into the key announcements and their implications for the data and application development space.

Copilots

Teams Copilot was introduced as a powerful enhancement for Microsoft Teams, designed to revolutionize the way teams collaborate. Leveraging advanced AI capabilities, Teams Copilot assists users by summarizing conversations, generating meeting agendas, and even drafting responses during discussions. This intelligent assistant integrates seamlessly within Teams, helping to streamline communication and enhance productivity by reducing the time spent on administrative tasks. With Teams Copilot, organizations can ensure that their teams are more focused on strategic initiatives, ultimately driving better outcomes and staying ahead of competitors.

You can now create and deploy custom AI agents with ease within Copilot Studio. Copilot Studio offers a robust set of tools for building intelligent agents that can automate complex tasks, streamline workflows, and enhance productivity. With the new agent capabilities, developers can design agents to interact seamlessly with various applications, providing users with context-aware assistance and real-time insights. These AI agents leverage advanced machine learning models and natural language processing to understand and respond to user inputs effectively. This allows businesses to create tailored solutions that can handle customer inquiries, manage routine tasks, and provide valuable data-driven insights, all while maintaining high levels of accuracy and efficiency.

Real-Time Intelligence in Microsoft Fabric

One of the most groundbreaking announcements was the introduction of Real-Time Intelligence within Microsoft Fabric. This end-to-end SaaS solution enables businesses to process high-volume, time-sensitive data at the point of ingestion, facilitating faster and more informed decision-making. Real-Time Intelligence is designed to support both low-code and code-rich experiences, making it a versatile tool for analysts and developers alike. For our data analytics team, this means we can build more responsive analytics solutions that provide immediate insights, enhancing our ability to drive strategic decisions based on your real-time data​.

Enhancements in GitHub Copilot

GitHub Copilot, already a game-changer for developers, received significant upgrades with the introduction of new extensions. These extensions, developed by Microsoft and third-party partners, integrate seamlessly with services like Azure, Docker, and Sentry. For our custom app development projects, this means we can leverage natural language capabilities within Copilot to manage Azure resources, troubleshoot issues, and streamline our development workflows. This integration not only boosts productivity but also enhances the efficiency of our development processes.

Advances in Azure AI

Azure AI continues to evolve with the availability of GPT-4o, a multimodal AI model capable of processing text, images, and audio. Additionally, Microsoft introduced Phi-3-vision, a new model in the Phi-3 family, which is optimized for personal devices and offers powerful capabilities for text and image input. These models are accessible through Azure AI Studio, providing us with advanced tools to experiment and build innovative AI solutions. For our Data Analytics projects, these models can offer new ways to interact with and analyze data, enabling us to identify patterns and gain insights that can help us stay ahead of competitors. By leveraging these advanced AI tools, you can uncover hidden trends, make more informed decisions, and ultimately drive a greater strategic advantage against your competitors.

Smart Components

These components represent a significant leap forward in streamlining UI development within the .NET ecosystem. Smart Components are designed to automatically adapt to varying contexts and states, reducing the need for boilerplate code and extensive conditional logic. By leveraging advanced AI and machine learning, Smart Components can intelligently adjust their behavior and appearance based on real-time data and user interactions. This innovation simplifies the development process, enabling developers to create more dynamic and responsive applications with less effort. Smart Components can be particularly beneficial for building complex interfaces where different parts of the application need to interact seamlessly. They also enhance maintainability and scalability, as developers can rely on these components to handle many of the intricacies involved in state management and UI rendering.

Honorable Mentions

Additionally, .NET 9 Preview 4 was released, offering a glimpse into the future of the platform with numerous performance improvements, enhanced security features, and expanded support for cloud-native applications.

The announcement of C# 13 brought a host of new features aimed at making the language more expressive and user-friendly. Notable enhancements include improvements in pattern matching, interpolated string handlers, and extended lambda expressions, all designed to simplify coding and increase developer efficiency.

WebNN (Web Neural Network API) was highlighted as a cutting-edge technology designed to bring advanced machine learning capabilities directly to web applications. WebNN allows developers to run neural network models efficiently in the browser, enabling real-time AI-powered experiences without relying heavily on server-side processing.

All of these advancements collectively underscore Microsoft’s dedication to evolving the data and .NET ecosystem, making it an even more robust and efficient environment for developers to build cutting-edge data solutions.

Conclusion

These features promise to revolutionize the way we interact with data, build applications, and drive business success. Staying up-to-date on all of these developments is crucial for any company aiming to maintain a competitive edge in today’s fast-paced digital landscape.

At Procure SQL, we are dedicated to helping businesses harness these cutting-edge technologies. Whether you need to integrate AI capabilities, enhance your data analytics, or develop custom applications using .NET, our expertise can guide you through the process. Let us assist you in leveraging these new and upcoming features to stay ahead of the game and achieve your strategic objectives. Contact us to learn more about how we can support your data journey.


Here are some news items, videos, tips, and links the Procure SQL team would like to share!

Scan vs. Seek

This is the most straightforward example explaining the difference between a scan and a seek in execution plans.

Data Engineering with Notebooks

Watch Justin’s seven-minute video on loading and transforming data in Microsoft Fabric.

Is Tableau Dead?

Yes and No. The future looks mixed.

Most Recent Issues

The work done by DBAs and Data professionals is all over the map.

Apple Electric Car

Apple pulled the plug on their Apple electric vehicle.

Should you always listen to data?

The answer is a resounding NO!

The Wheel of Misfortune

Skyscanner used this game to increase engineers’ confidence in incident management with OpenTelemetry. Learn how to ingest your application’s telemetry data into Azure Monitor.

Performance Testing

Ensure a consistent and reliable user experience with Azure Load Testing.

Maximize Your Savings with SQL Server

Are you using these options to develop or test for free or with substantial cost savings?

Free Azure and SQL Server Training in Austin, Texas!

On Saturday, March 9, 2024, SQL Saturday will be coming to Austin, Texas. SQL Saturday is a free training day around SQL Server, Azure, and the Microsoft Data Platform. If you want lunch, it’s $20. We will also have two all-day deep dive training classes on performance tuning and Microsoft Analytics on Friday, March 8, 2024, for $125.

Need a Remote DBA or Data Architect?

Have you got questions? Need some help? Are you curious to know the cost of procuring a Remote Data Architect?

Check out this quick video to see how you can start to load and transform your data with notebooks. Data engineering with Microsoft Fabric becomes easier once you understand how to leverage notebooks to your advantage.

If you are in Austin, Texas, on March 8 & 9, 2024, don’t miss SQL Saturday Austin, where you can learn more about Microsoft Fabric, Power BI, SQL Server, and more.

Procure SQL - Data Architect as a Service - Weekly Newsletter


Here are some news, videos, tips, and links the Procure SQL team would like to share!

 

Near Zero Downtime Migrations

Azure SQL Database can be a subscriber for transactional replication, which makes near zero downtime migrations possible.

Habits of Effective Data Leaders

How many of these seven effective habits do you see at your job?

AI Coming for Your Job?

Maybe. Regardless, let your skills, quality of work, and service to others define who you are.

Air Canada Chatbot Lawsuit

Air Canada lost a lawsuit after claiming the airline should not be liable for its chatbot’s misleading information.

Testing and Bug Fixes

We hope this is different from how you do testing or bug fixes.

Offload Workload to Availability Group Replicas

Learn about temporary statistics on your secondary replicas. See how to enable Query Store for secondary replicas with SQL Server 2022.

Serverless for Hyperscale in Azure SQL Database

These are things to know before you jump into Serverless for Hyperscale. Serverless auto-pausing and resuming in Hyperscale are not currently available. The provisioned compute tier may be less expensive if CPU or memory usage is high enough and sustained long enough.



Here are some news items, videos, tips, and links the Procure SQL team would like to share!

Procure SQL made it out to their first trade show of 2024! Justin, Kon, and John were at SQL Saturday Atlanta BI on February 10th.

Justin Cunningham gave a talk on Data Catalog: Visualizing Your Data Sprawl. John Sterrett gave a talk on Things to Know Before Going Independent.

The team is excited to be back on April 20th for SQL Saturday Atlanta. You can also catch them at SQL Saturday Austin on March 10th.

Procure SQL sponsored SQL Saturday Atlanta BI on February 10, 2024. Justin Cunningham talked about Data Sprawl and Managing Your Metadata.
The Procure SQL team sponsored SQL Saturday Atlanta BI on February 10, 2024. Kon Melamud, John Sterrett, and Justin Cunningham had a great time meeting everyone.

Someone’s Dream Job

Dream of being a researcher for Microsoft's data systems? Good, they're hiring.

NASA’s Computer Glitch

Ever wonder what it’s like to troubleshoot 1970s tech that’s 15 billion miles away?

Data Sprawl

Interesting editorial about managing your metadata. This challenge gets harder when only 3% of companies’ data meets data quality standards.

Power BI Desktop Projects

Martin Schoombee shares how DevOps and report sharing get easier with Power BI Desktop Projects. Power BI Desktop projects are going to open up many possibilities.

Microsoft Analytics Overview in Five Minutes

Justin created this video about the personas and tools behind Microsoft’s new shiny analytics tools.

Developers, Developers, Developers…

It’s time to make data the first choice in the technology stack, not an afterthought. Developers’ words, not ours. We do agree with them, though. 🙂

Working With Others

Aaron Bertrand has a very simple but great tip. Leave it better than you found it.



Here are some videos, tips, and exciting links the Procure SQL team would like to share!

MLB’s Databases

Have you ever wondered what MLB teams name their databases? We like “the fish bag,” “ground control,” and “MITT.” What is your favorite?

Cloud Failures

Most cloud failures are traced back to human mistakes. Have you experienced or seen these top five mistakes?

SQL Server 2022’s New Disaster Recovery Solution

Daniel Taylor from Tales from the Field shares his video on how to set up bidirectional failover for SQL Server 2022 to Managed Instance using the MI Link for free.

State of Database Landscape in 2024

Data management is more complex, and people struggle to support multiple data platforms.

Microsoft Analytics Overview in Five Minutes

Check out this video about the personas and tools behind Microsoft’s new shiny analytics tools.

Win A Free Registration to PASS Conference

Everyone loves to win free stuff. More importantly, this is your opportunity to improve the best Microsoft Data Platform conference!

Power BI Performance Tuning

Eugene Meidinger (speaking at SQL Saturday Austin) just released his new class. You can watch the first module for free and get the complete course for $100.


In five minutes, Justin gives you a Microsoft Fabric overview to help you ingest, process, store, monitor, and govern your data with Microsoft’s new analytics tool. He will discuss user personas like Data Engineering and Data Science within Microsoft Fabric and how the personas will use tools like Data Activator, Power BI, Data Factory, Real-Time Analytics, Notebooks, Spark, Lakehouse, and more.


Got Microsoft Fabric or Analytics Questions?

We would love to discuss your questions from watching this Microsoft Fabric overview video or anything related to the Microsoft Data Platform. Fill out the form below, and we will be in touch soon.