Procure SQL was at the Kansas City Developer Conference to help people procure the right data architecture partner.
John and Kon at Kansas City Developer Conference

Why Did You Sponsor the Kansas City Developer Conference?

This is a great question, and it leads to a good story. Back in 2012, Jeff Strass and Michael Eaton gave a half-day workshop on Going Independent. It was this workshop that helped push John to start Procure SQL. It's an honor to sponsor this event, and it will always have a special place in the owner's heart.

Kansas City developers, do you need help with your data?

If you need any help with SQL, NoSQL, modern data warehousing, or reporting, we would love to chat with you.

Database Management Myths for Developers

Today John & Kon gave our Database Management Myths for Developers talk. You can find the slides below.

“Every business workflow in every enterprise will be engineered with GenAI at its core” -ServiceNow’s Bill McDermott

Microsoft Build 2024 focused on transformative advancements in AI, cloud computing, and developer tools. This year’s event showcased Microsoft’s commitment to pushing the boundaries of what’s possible.

AI and Copilots were the overwhelming theme. Even though AI has been mainstream for a while now, jumping on board today would still make you a relatively early adopter and could give you an advantage within your market. With that said, let's delve into the key announcements and their implications for the data and application development space.

Copilots

Teams Copilot was introduced as a powerful enhancement for Microsoft Teams, designed to revolutionize the way teams collaborate. Leveraging advanced AI capabilities, Teams Copilot assists users by summarizing conversations, generating meeting agendas, and even drafting responses during discussions. This intelligent assistant integrates seamlessly within Teams, helping to streamline communication and enhance productivity by reducing the time spent on administrative tasks. With Teams Copilot, organizations can ensure that their teams are more focused on strategic initiatives, ultimately driving better outcomes and staying ahead of competitors.

You can now create and deploy custom AI agents with ease within Copilot Studio. Copilot Studio offers a robust set of tools for building intelligent agents that can automate complex tasks, streamline workflows, and enhance productivity. With the new agent capabilities, developers can design agents to interact seamlessly with various applications, providing users with context-aware assistance and real-time insights. These AI agents leverage advanced machine learning models and natural language processing to understand and respond to user inputs effectively. This allows businesses to create tailored solutions that can handle customer inquiries, manage routine tasks, and provide valuable data-driven insights, all while maintaining high levels of accuracy and efficiency.

Real-Time Intelligence in Microsoft Fabric

One of the most groundbreaking announcements was the introduction of Real-Time Intelligence within Microsoft Fabric. This end-to-end SaaS solution enables businesses to process high-volume, time-sensitive data at the point of ingestion, facilitating faster and more informed decision-making. Real-Time Intelligence is designed to support both low-code and code-rich experiences, making it a versatile tool for analysts and developers alike. For our data analytics team, this means we can build more responsive analytics solutions that provide immediate insights, enhancing your ability to make strategic decisions based on real-time data.

Enhancements in GitHub Copilot

GitHub Copilot, already a game-changer for developers, received significant upgrades with the introduction of new extensions. These extensions, developed by Microsoft and third-party partners, integrate seamlessly with services like Azure, Docker, and Sentry. For our custom app development projects, this means we can leverage natural language capabilities within Copilot to manage Azure resources, troubleshoot issues, and streamline our development workflows. This integration not only boosts productivity but also enhances the efficiency of our development processes.

Advances in Azure AI

Azure AI continues to evolve with the availability of GPT-4o, a multimodal AI model capable of processing text, images, and audio. Additionally, Microsoft introduced Phi-3-vision, a new model in the Phi-3 family that is optimized for personal devices and offers powerful capabilities for text and image input. These models are accessible through Azure AI Studio, providing advanced tools to experiment with and build innovative AI solutions. For our data analytics projects, these models offer new ways to interact with and analyze data, enabling us to identify patterns that would otherwise stay hidden. By leveraging these advanced AI tools, you can uncover hidden trends, make more informed decisions, and ultimately gain a greater strategic advantage over your competitors.

Smart Components

These components represent a significant leap forward in streamlining UI development within the .NET ecosystem. Smart Components are designed to automatically adapt to varying contexts and states, reducing the need for boilerplate code and extensive conditional logic. By leveraging advanced AI and machine learning, Smart Components can intelligently adjust their behavior and appearance based on real-time data and user interactions. This innovation simplifies the development process, enabling developers to create more dynamic and responsive applications with less effort. Smart Components can be particularly beneficial for building complex interfaces where different parts of the application need to interact seamlessly. They also enhance maintainability and scalability, as developers can rely on these components to handle many of the intricacies involved in state management and UI rendering.

Honorable Mentions

Additionally, .NET 9 Preview 4 was released, offering a glimpse into the future of the platform with numerous performance improvements, enhanced security features, and expanded support for cloud-native applications.

The announcement of C# 13 brought a host of new features aimed at making the language more expressive and user-friendly. Notable enhancements include improvements in pattern matching, interpolated string handlers, and extended lambda expressions, all designed to simplify coding and increase developer efficiency.

All of these advancements collectively underscore Microsoft's dedication to evolving the data and .NET ecosystem, making it an even more robust and efficient environment for developers to build cutting-edge data solutions.

WebNN (Web Neural Network API) was highlighted as a cutting-edge technology designed to bring advanced machine learning capabilities directly to web applications. WebNN allows developers to run neural network models efficiently in the browser, enabling real-time AI-powered experiences without relying heavily on server-side processing.

Conclusion

These features promise to revolutionize the way we interact with data, build applications, and drive business success. Staying up-to-date on all of these developments is crucial for any company aiming to maintain a competitive edge in today’s fast-paced digital landscape.

At Procure SQL, we are dedicated to helping businesses harness these cutting-edge technologies. Whether you need to integrate AI capabilities, enhance your data analytics, or develop custom applications using .NET, our expertise can guide you through the process. Let us assist you in leveraging these new and upcoming features to stay ahead of the game and achieve your strategic objectives. Contact us to learn more about how we can support your data journey.

Corruption.  We know it is everywhere.  It is surely a hot-button issue in the news.  If you haven’t given much thought to database integrity, now is the time to sit up and pay attention.  Corruption can occur at any time.  Most of the time database corruption is caused by a hardware issue.  No matter the reason, being proactive on database integrity will ensure your spot as a hero DBA in the face of corruption.

“A single lie destroys a whole reputation of integrity” – Baltasar Gracian

Integrity is one of those words people throw around quite often these days.  The definition of 'integrity' is the quality of being honest and having strong moral principles.  How can data have strong moral principles?  Does 'data integrity' mean something different?  Yes, data integrity refers to the accuracy and consistency of stored data.

Have Backups, Will Travel

When was the last time an integrity check was run on your database?  If you are sitting there scratching your brain trying to find the answer to that question, you may have serious issues with your database and not know it.

“But, I have solid backup plans in place, so this means I am okay.  Right?”

While having a solid backup and recovery plan in place is an absolute must, you may just have solid backups of corrupt data.  Regular integrity checks test the allocation and structural integrity of the objects in a database.  They can be run against a single database, multiple databases (though they do not check the consistency of one database against another), and even database indexes.  Integrity checks are very important to the health of your database and can be automated.  It is suggested to run integrity checks at least as often as your full backups are run.

As discussed in an earlier blog, Validating SQL Server Backups, your data validation needs to take place BEFORE the backups are taken.  A best practice is to run DBCC CHECKDB against your data to check for potential corruption.  Running CHECKDB regularly against your production databases will detect corruption quickly, providing a better chance to recover valid data from a backup or to repair the corruption.  CHECKDB checks the logical and physical integrity of the database by running these three primary checks:

  • CHECKALLOC – checks the consistency of the database’s disk space allocation structures;
  • CHECKTABLE – checks the pages and structures of the table or indexed view; and
  • CHECKCATALOG – checks catalog consistency. 
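
As a quick illustration, a basic integrity check looks like this (the database name is a placeholder):

DBCC CHECKDB ('YourDatabase') WITH NO_INFOMSGS;

The NO_INFOMSGS option suppresses informational messages so that only errors are reported, which makes scheduled, automated runs much easier to review.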

Where to Look

Wondering if you have missing integrity checks or if they have ever been performed on your database?  The following T-SQL script will show when/if integrity checks were performed on your databases.  (Bonus) Running this script regularly will help track down missing integrity checks.

If you are looking for the last date the DBCC checks ran, the T-SQL script to use is as follows:


IF OBJECT_ID('tempdb..#DBCCs') IS NOT NULL
    DROP TABLE #DBCCs;

CREATE TABLE #DBCCs
(
    ID INT IDENTITY(1, 1) PRIMARY KEY,
    ParentObject VARCHAR(255),
    Object VARCHAR(255),
    Field VARCHAR(255),
    Value VARCHAR(255),
    DbName NVARCHAR(128) NULL
);

/* Check for the last good DBCC CHECKDB date */
EXEC sp_MSforeachdb N'USE [?];
INSERT #DBCCs (ParentObject, Object, Field, Value)
EXEC (''DBCC DBINFO() WITH TABLERESULTS, NO_INFOMSGS'');
UPDATE #DBCCs SET DbName = N''?'' WHERE DbName IS NULL;';

WITH DB2 AS
(
    SELECT DISTINCT Field, Value, DbName
    FROM #DBCCs
    WHERE Field = 'dbi_dbccLastKnownGood'
)
SELECT @@SERVERNAME AS Instance,
       DB2.DbName AS DatabaseName,
       CONVERT(DATETIME, DB2.Value, 121) AS DateOfIntegrityCheck
FROM DB2
WHERE DB2.DbName NOT IN ('tempdb');

The result will look similar to this.  However, let’s hope your results show a date closer to today’s date than my own!  If you see that your databases do not have integrity checks in place, check your backup and recovery plans and double check your agent jobs to see if perhaps the checks were scheduled but were turned off.

Exact Date Last Integrity Check

Recommendations

It is recommended that DBCC CHECKDB be run against all production databases on a regular schedule.  The best practice is to have this automated and scheduled as a SQL Server Agent job that runs as a regular part of maintenance, and more specifically, to run the integrity check directly before purging any full backups.  Doing so ensures that corruption is detected quickly, which gives you a much better chance to recover from a backup or to repair the corruption.
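
As a minimal sketch of such an Agent job (the job name, schedule, and database name are examples, not a prescription):

USE msdb;
GO

-- Create the job
EXEC dbo.sp_add_job
    @job_name = N'Nightly Integrity Check';

-- Single step: run CHECKDB against a user database
EXEC dbo.sp_add_jobstep
    @job_name = N'Nightly Integrity Check',
    @step_name = N'Run DBCC CHECKDB',
    @subsystem = N'TSQL',
    @command = N'DBCC CHECKDB (''YourDatabase'') WITH NO_INFOMSGS;';

-- Run every day at 2:00 AM (freq_type 4 = daily)
EXEC dbo.sp_add_jobschedule
    @job_name = N'Nightly Integrity Check',
    @name = N'Daily 2AM',
    @freq_type = 4,
    @freq_interval = 1,
    @active_start_time = 020000;

-- Target the local server
EXEC dbo.sp_add_jobserver
    @job_name = N'Nightly Integrity Check';

In practice you would point the step at each production database (or loop over them) and schedule the job to finish before your full-backup purge runs.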

Remember, SQL Server is very forgiving and will back up a corrupt database!  Corrupt databases are recoverable, but they might have data pages that are totally worthless!

What is the best way to learn?

     The best way to learn is directly from industry professionals!  We are thrilled to be helping the SQLSaturday Wheeling organizers put together a great SQLSaturday on April 28, 2018.  The schedule is packed full of knowledgeable SQL professionals waiting to share their unique experiences and knowledge with you!

What is SQLSaturday?

     SQLSaturday is a free training event for Microsoft Data Platform professionals and those wanting to learn about SQL Server, Business Intelligence, and Analytics. Admittance to this event is free, and most costs are covered by donations and sponsorships.  This event will be held on April 28, 2018, at West Virginia Northern Community College, 1704 Market Street, Wheeling, West Virginia 26003, United States.  You can register here: SQLSatWheeling

* To register for a SQLSaturday, you must first become a PASS member. A PASS account is free and makes it convenient to manage your SQLSaturday user profile, update preferences, and keep track of your SQLSaturday registrations.

What topics will be discussed?

     SQLSaturdays offer a wide variety of sessions from beginner to advanced.  There truly is something for every data professional at every level of his or her career.

     The topics covered are:
    • Analytics and Visualization
    • Application and Database Development
    • BI Information (Delivery, Architecture, Development, and Administration)
    • Cloud Development and Implementation
    • Enterprise Database Administration and Deployment
    • Professional Development

     The full schedule for SQLSaturday Wheeling can be found here.

What else does a SQLSaturday offer?

     Education is a very important element in growing your career.  Another important, often-overlooked element of growing a successful career is networking!  SQLSaturdays provide an excellent opportunity to meet new people, reconnect with old friends, and have a great time networking.  There will be a networking event Saturday after the sessions, so you will have a chance to unwind, grab a bite to eat, and chat with your newfound SQL friends!

     We will be there, and we hope to see you as well!

Procure SQL is a proud sponsor of PASS SQLSaturday Houston, June 17, 2017! This is a free event for you, your employees, teammates, or anyone interested in learning SQL Server. Make sure you sign up and come on down.

Please stop by the booth and say hello to me.  I would love to meet you!  Mention that you read this blog and I will have a gift for you!  

I will have a prize wheel to spin for cool prizes, one of which is a hand spinner!  They are all the rage.  Have you ever played with one?  They are great for calming me during heavy database work!

We have the spinners...

Our founder, John Sterrett, will be speaking at 4pm.  Be sure to drop in and say hello before or after the session!

His session is titled Table Partitioning: the Secret Weapon for Your Big Data Problems.

"Is big data starting to slow you down? Is data growth putting your maintenance tasks in danger of not meeting your service level agreements? Do you wish you could archive old data with minimal impact to your tables during the archive process? Do you wish you could eliminate the majority of the data in your tables when you query them? If so, it's time to consider implementing table partitioning to help with your general performance and to reduce the window for completing your maintenance tasks.

In this session, you will learn how table partitioning works and when it should be implemented. You will understand how to initially configure partition functions and partition schemes to have a rock-solid partitioned table. You will learn how to implement new features added in SQL Server 2014 to improve table partitioning. Finally, you will learn how to implement an automated sliding window to maintain your partitioned tables for ensuring optimal performance."
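
For readers curious what those building blocks look like, here is a minimal sketch of a partition function, partition scheme, and partitioned table; all names, types, and boundary values are hypothetical, and John's session covers the real design decisions in depth:

-- Partition function: splits rows into monthly ranges by date
CREATE PARTITION FUNCTION pfMonthly (DATETIME2)
AS RANGE RIGHT FOR VALUES ('2017-01-01', '2017-02-01', '2017-03-01');

-- Partition scheme: maps every partition to a filegroup (all to PRIMARY here)
CREATE PARTITION SCHEME psMonthly
AS PARTITION pfMonthly ALL TO ([PRIMARY]);

-- Table created on the scheme, partitioned by the SaleDate column
CREATE TABLE dbo.Sales
(
    SaleID   BIGINT IDENTITY(1, 1) NOT NULL,
    SaleDate DATETIME2 NOT NULL,
    Amount   MONEY NOT NULL,
    CONSTRAINT PK_Sales PRIMARY KEY CLUSTERED (SaleDate, SaleID)
) ON psMonthly (SaleDate);

Queries that filter on SaleDate can then touch only the relevant partitions, and whole partitions can be switched out when archiving old data.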