Antonios Chatzipavlis is a database architect and SQL Server expert with over 30 years of experience working with SQL Server. The document provides tips for installing and configuring SQL Server correctly, including selecting the appropriate server hardware, installing Windows, configuring disks and storage, installing and configuring SQL Server, and creating user databases. The goal is to optimize performance and reliability based on best practices.
This document discusses high availability and disaster recovery options in SQL Server 2012. It begins with an introduction and agenda. It then covers what's new in SQL Server 2012 including the new AlwaysOn Availability Groups feature. It discusses SQL Server failover clustering architecture and how it works. It also covers other high availability options like mirroring and log shipping. Finally, it demonstrates how to set up an availability group for high availability and disaster recovery.
SQL Server AlwaysOn for Dummies, SQLSaturday #202 Edition (Mark Broadbent)
Welcome to Microsoft's world of the buzzword. Yes, they've done it again and created another ambiguous term that no one really understands. AlwaysOn is a powerful group of highly available technologies, and in this presentation we will delve into their murky world & reveal the technology behind the buzz. Focusing specifically on the two key components of SQL Server 2012 AlwaysOn, Failover Clustered Instances and Availability Groups, we will investigate their pre-requisites, setup, administration, use & drawbacks. We will look at:
- Using Windows 2008, 2012 and Server Core
- Windows Clustering
- Quorum
- Failover Clustered Instances
- Availability Groups
- Readable Secondaries
- Clustering Tools and PowerShell
Dummies and higher are welcome.
Session delivered to SQL Relay 2015 in Nottingham.
In this session we look at some of the fundamental elements that you need to understand in order to build an IaaS solution that will meet the requirements to be covered by the Microsoft availability SLA. Additionally we will look at building a Microsoft Azure IaaS Solution.
Demo Code available at: http://1drv.ms/1PC8707
*Note if you want to use the demo code you will need a Microsoft Azure subscription.
AUDWC 2016 - Using SQL Server 2016 AlwaysOn Availability Groups for SharePoint (Michael Noel)
SQL Server 2016 provides for unprecedented high availability and disaster recovery options for SharePoint farms in the form of AlwaysOn Availability Groups. Using this new technology, SharePoint architects can provide for near-instant failover at the data tier, without the risk of any data loss. In addition, the latest version of this technology, available with SQL Server 2016, allows for replicas of SharePoint databases to be stored in the cloud in Microsoft’s Azure cloud offering. This technology, which will be demonstrated live, completely changes the data tier design options for SharePoint and revolutionises high availability options for a farm. This session covers in step-by-step detail the exact configuration required to enable this functionality for a SharePoint 2013 farm, based on the best practices, tips and tricks, and real-world experience of the presenter in deploying this technology in production.
Understand the differences between SQL AlwaysOn options, and determine the requirements to deploy the technologies
Examine how SQL Server 2016 AlwaysOn Availability Groups can provide aggressive Service Level Agreements (SLAs) with a Recovery Point Objective (RPO) of zero and a Recovery Time Objective (RTO) of a few seconds.
See the exact steps required to enable SQL Server 2016 AlwaysOn Availability Groups for a SharePoint 2013 On-Premises environment, including options for storing replicas in Microsoft’s Azure cloud service.
- SQL Server 2016 enhances AlwaysOn availability groups by allowing up to two additional secondary replicas for failover purposes, improving high availability.
- It introduces load-balanced read-only replicas that can distribute read-only workloads across multiple secondaries in a round-robin fashion.
- Distributed transaction support is now provided for AlwaysOn when using Windows Server 2016 and SQL Server 2016.
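The load-balanced routing mentioned above is configured on the primary replica's read-only routing list. A minimal sketch, assuming a hypothetical group AG1 with replicas SQL1 (primary), SQL2 and SQL3:

```sql
-- Nested parentheses define a load-balanced (round-robin) routing set,
-- supported from SQL Server 2016 onward. Read-intent connections are
-- distributed across SQL2 and SQL3, falling back to SQL1 if neither is available.
ALTER AVAILABILITY GROUP AG1
MODIFY REPLICA ON N'SQL1'
WITH (PRIMARY_ROLE (READ_ONLY_ROUTING_LIST = ((N'SQL2', N'SQL3'), N'SQL1')));
```

Without the inner parentheses the list is a simple priority order; the nesting is what enables the round-robin behaviour.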
SQL Server 2012 - AlwaysOn Deep Dive - Bob Duffy (Anuradha)
The document provides an overview of a presentation by Bob Duffy on SQL Server 2012 Always On. It outlines Bob Duffy's background and experience, the agenda for the presentation which includes topics like typical high availability and disaster recovery requirements, installing and migrating to Always On availability groups, planned and automated failover, active secondary replicas, and integration with failover clustering. It also includes a case study on requirements for a fictional company and describes typical high availability architectures.
End-to-end Troubleshooting Checklist for Microsoft SQL Server (Kevin Kline)
Learning how to detect, diagnose and resolve performance problems in SQL Server is tough. Often, years are spent learning how to use the tools and techniques that help you detect when a problem is occurring, diagnose the root-cause of the problem, and then resolve the problem.
In this session, attendees will see demonstrations of the tools and techniques which make difficult troubleshooting scenarios much faster and easier, including:
• XEvents, Profiler/Traces, and PerfMon
• Using Dynamic Management Views (DMVs)
• Advanced Diagnostics Using Wait Stats
• Reading SQL Server execution plans
Every DBA needs to know how to keep their SQL Server in tip-top condition, and you'll need the skills covered in this session to do it.
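As a starting point for the wait-stats diagnostics mentioned above, a cumulative-wait query along these lines is a common first step (the benign wait types worth excluding vary by version and workload, so the list below is illustrative only):

```sql
-- Top waits accumulated since the last service restart.
SELECT TOP (10)
    wait_type,
    wait_time_ms / 1000.0        AS wait_time_s,
    signal_wait_time_ms / 1000.0 AS signal_wait_s,   -- time spent waiting for CPU
    waiting_tasks_count
FROM sys.dm_os_wait_stats
WHERE wait_type NOT IN (N'SLEEP_TASK', N'LAZYWRITER_SLEEP', N'CHECKPOINT_QUEUE',
                        N'XE_TIMER_EVENT', N'BROKER_TO_FLUSH')
ORDER BY wait_time_ms DESC;
```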
This document provides an overview of SQL Server clustering. It discusses the importance of high availability and introduces some key concepts in clustering like nodes, shared storage, heartbeats, failover and failback. It also covers the basic architecture of a SQL Server cluster, including the virtual server and different types of clusters. Some advantages and disadvantages of clustering are outlined. Finally, it discusses some terminology used in clustering and provides a checklist for preparing Windows clustering.
This document discusses SQL Server 2012 AlwaysOn, a high availability and disaster recovery solution. It provides an overview of AlwaysOn availability groups, which allow for multiple synchronous or asynchronous copies of databases across instances. Key features include readable secondary replicas, automatic instance and database failover, and the ability to perform backups on secondary replicas. The document also demonstrates AlwaysOn configuration and functionality through a virtual machine-based lab environment.
Presenter: Dean Richards of Confio Software
If you're a developer or DBA, this presentation will outline a method for determining the best execution plan for a query every time by utilizing SQL Diagramming techniques.
Whether you're a beginner or expert, this approach will save you countless hours tuning a query.
You Will Learn:
* SQL Tuning Methodology
* Response Time Tuning Practices
* How to use SQL Diagramming techniques to tune SQL statements
* How to read executions plans
2 AM. We are sleeping well. And our mobile starts ringing and ringing. The message: DISASTER! In this session (on slides) we do NOT talk about potential disasters (such as BCM); we talk about: What is happening NOW? Which tasks should have been finished BEFORE? Does it matter whether SQL Server is virtual or physical? We talk about systems, databases, people, encryption, passwords, certificates and users. In a few demos I'll show which parts of our SQL Server environment are critical and how to be prepared for a disaster, and in some documents I'll show you how to be BEST prepared.
Configuring Applications for SQL Server AlwaysOn Read Replicas - C... (SpanishPASSVC)
This document announces an upcoming webinar on September 24th about configuring applications to take advantage of SQL Server AlwaysOn readable secondary replicas. It provides background on SQL Server high availability and disaster recovery technologies, an overview of AlwaysOn availability groups, and how to configure applications and use active secondary replicas for read scaling and backup operations. The webinar will include a demonstration of implementing an AlwaysOn availability group and testing readable secondaries.
Microsoft released SQL Azure more than two years ago - that's enough time for testing (I hope!). So, are you ready to move your data to the Cloud? If you're considering a business (i.e. a production environment) in the Cloud, you need to think about methods for backing up your data, a backup plan, and, eventually, restoring with Red Gate Cloud Services (among other tools). In this session, you'll see the differences, functionality, restrictions, and opportunities in SQL Azure and On-Premise SQL Server 2008/2008 R2/2012. We'll consider topics such as how to be prepared for backup and restore, and which parts of a cloud environment are most important: keys, triggers, indexes, prices, security, service level agreements, etc.
Once the 'BACKUP DATABASE' command is executed, SQL Server automatically issues a few 'Checkpoint' operations to reduce recovery time and to make sure that, at the point of command execution, there are no dirty pages in the buffer pool. After that, SQL Server creates at least three workers, a 'Controller', a 'Stream Reader' and a 'Stream Writer', to read and buffer the data asynchronously into the buffer area (outside the buffer pool) and to write those buffers to the backup device.
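The command that triggers this pipeline is a plain BACKUP DATABASE statement. As an illustrative sketch (database name, path and buffer values are hypothetical, not recommendations), the BUFFERCOUNT and MAXTRANSFERSIZE options size the buffer area that the reader/writer workers described above use:

```sql
-- Full backup with an explicitly sized buffer area:
-- roughly BUFFERCOUNT * MAXTRANSFERSIZE bytes outside the buffer pool.
BACKUP DATABASE AdventureWorks
TO DISK = N'D:\Backups\AdventureWorks.bak'
WITH CHECKSUM, COMPRESSION,
     BUFFERCOUNT = 16,
     MAXTRANSFERSIZE = 4194304,  -- 4 MB per transfer
     STATS = 10;                 -- progress message every 10%
```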
High Availability & Disaster Recovery with SQL Server 2012 AlwaysOn Availability Groups (turgaysahtiyan)
The AlwaysOn Availability Groups feature is a high-availability and disaster-recovery solution that provides an enterprise-level alternative to database mirroring. Introduced in SQL Server 2012, AlwaysOn Availability Groups maximizes the availability of a set of user databases for an enterprise. In this session we will talk about what’s coming with Always On, and how does it help to improve high availability and disaster recovery solutions.
PLSSUG - Troubleshoot SQL Server Performance Problems like a Microsoft Engineer (Marek Maśko)
This is yet another session in the SQL Server performance troubleshooting category. But this time it is not focused on the various techniques and methodologies that many other presentations cover. On the contrary: this presentation focuses on tools that have been available for a very long time, the tools that Microsoft engineers use every day, and that are still known by very few people.
The document discusses best practices for preparing for and responding to a disaster involving IT systems. It emphasizes the importance of having backups, restoration procedures, clearly defined roles and responsibilities, service level agreements, and all necessary information and equipment required to recover systems. Specific recommendations include developing backup policies, regularly testing restores, maintaining a separate test environment, and ensuring management understands estimated recovery times. The document stresses the importance of preparation and having the right people, processes, and documentation in place to minimize downtime from an outage or disaster.
The document introduces Diagnostic Management Views (DMVs) and Dynamic Management Functions (DMFs) in SQL Server. It discusses that DMVs and DMFs return server state information and can be used to monitor server health, diagnose problems, and tune performance. It provides examples of common DMVs and DMFs used for query execution and the query plan cache. Finally, it notes that the presentation will demonstrate troubleshooting with DMVs and DMFs.
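A typical example of the pattern described, combining a DMV with two DMFs to inspect the plan cache, might look like this:

```sql
-- Cached queries ordered by total CPU time, with their text and plans.
SELECT TOP (10)
    qs.total_worker_time / 1000 AS total_cpu_ms,
    qs.execution_count,
    st.text       AS query_text,
    qp.query_plan
FROM sys.dm_exec_query_stats AS qs                       -- DMV: runtime statistics
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle)  AS st   -- DMF: query text
CROSS APPLY sys.dm_exec_query_plan(qs.plan_handle) AS qp -- DMF: showplan XML
ORDER BY qs.total_worker_time DESC;
```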
SQL 2012 AlwaysOn Availability Groups for SharePoint 2013 - SharePoint Connec... (Michael Noel)
Using SQL Server 2012 AlwaysOn Availability Groups allows for high availability and disaster recovery of SharePoint 2013 farms. It provides zero data loss failover between nodes and readable secondary replicas. The document outlines the requirements and provides a step-by-step guide to implementing AlwaysOn Availability Groups for a SharePoint farm, including creating an availability group, adding databases, and creating an availability group listener.
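The steps outlined, creating the group, adding databases and adding a listener, can be sketched in T-SQL. All names here are hypothetical, and the prerequisites (FULL recovery model, an initial backup, configured database mirroring endpoints) are assumed to be in place:

```sql
-- Run on the primary replica. Replica, database and listener names are examples.
CREATE AVAILABILITY GROUP SP_AG
FOR DATABASE SP_Content_DB
REPLICA ON
    N'SQLNODE1' WITH (ENDPOINT_URL = N'TCP://SQLNODE1.contoso.local:5022',
                      AVAILABILITY_MODE = SYNCHRONOUS_COMMIT,
                      FAILOVER_MODE = AUTOMATIC),
    N'SQLNODE2' WITH (ENDPOINT_URL = N'TCP://SQLNODE2.contoso.local:5022',
                      AVAILABILITY_MODE = SYNCHRONOUS_COMMIT,
                      FAILOVER_MODE = AUTOMATIC);

-- The listener gives SharePoint a single connection point that follows failover.
ALTER AVAILABILITY GROUP SP_AG
ADD LISTENER N'SP-AG-Listener'
    (WITH IP ((N'10.0.0.50', N'255.255.255.0')), PORT = 1433);
```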
This document provides information about a webinar on SQL Server 2016 Stretch Database presented by Antonios Chatzipavlis. The webinar covers an introduction to Stretch Database, its limitations and pricing, backup and restore of Stretch databases, and frequently asked questions. Antonios Chatzipavlis has over 30 years of experience working with computers and SQL Server. He is a Microsoft Certified Trainer and SQL Server Evangelist who runs the SQL School Greece training organization.
Department Row Level Security Customization for PeopleSoft General Ledger (wonga6)
The document summarizes the implementation of department row-level security customization within the General Ledger module at the University of Calgary to comply with privacy laws. The customization restricts access to ledger and journal line records by department for each user. It was implemented using custom tables to associate users to departments, modified people code, and query security. The customization was later expanded to include new security roles and the ability to grant access without a department. The customization was needed due to privacy laws, standardized chart of accounts, and cultural factors around budget oversight and single person departments.
This document provides an overview of auditing data access in SQL Server. It discusses various methods for auditing such as using common criteria, SQL Trace, DML triggers, temporal tables, and implementing SQL Server Audit. SQL Server Audit is described as the primary auditing tool in SQL Server that can track both server and database level events. Considerations for implementing and managing SQL Server Audit are also covered.
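A minimal SQL Server Audit setup of the kind discussed might look like the following; the audit, database and table names are hypothetical:

```sql
-- Server-level audit object: defines where audit records are written.
USE master;
GO
CREATE SERVER AUDIT Audit_Payroll
TO FILE (FILEPATH = N'D:\Audit\');
GO
ALTER SERVER AUDIT Audit_Payroll WITH (STATE = ON);
GO

-- Database-level specification: tracks SELECTs on a sensitive table.
USE HR;
GO
CREATE DATABASE AUDIT SPECIFICATION Audit_Payroll_Reads
FOR SERVER AUDIT Audit_Payroll
ADD (SELECT ON dbo.Payroll BY public)
WITH (STATE = ON);
```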
Row Level Security (RLS) enables implementation of row-level access restrictions in SQL Server. RLS uses predicate functions to define the security logic and filters rows for queries based on that logic. Security predicates bind the predicate functions to tables and are defined as filter predicates to silently filter rows or blocking predicates to prevent write operations. Best practices include keeping the security logic simple and on separate schemas for maintenance. RLS has some limitations including incompatibility with Filestream and Polybase.
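A minimal sketch of the pieces described above: a predicate function kept on its own schema (per the best practice mentioned), plus a security policy binding it as both a filter and a block predicate. Table, column and session-key names are hypothetical:

```sql
-- Keep security objects on a dedicated schema for easier maintenance.
CREATE SCHEMA Security;
GO
-- Inline table-valued function: returns a row only when the row's tenant
-- matches the tenant id stored in the session context.
CREATE FUNCTION Security.fn_TenantPredicate (@TenantId int)
RETURNS TABLE
WITH SCHEMABINDING
AS
RETURN SELECT 1 AS fn_result
       WHERE @TenantId = CAST(SESSION_CONTEXT(N'TenantId') AS int);
GO
-- Filter predicate silently hides rows; block predicate prevents inserting
-- rows that would belong to another tenant.
CREATE SECURITY POLICY Security.TenantPolicy
ADD FILTER PREDICATE Security.fn_TenantPredicate(TenantId) ON dbo.Orders,
ADD BLOCK  PREDICATE Security.fn_TenantPredicate(TenantId) ON dbo.Orders AFTER INSERT
WITH (STATE = ON);
```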
Implementing Mobile Reports in SQL Server 2016 Reporting Services (Antonios Chatzipavlis)
The document provides an overview of implementing mobile reports in SQL Server 2016 Reporting Services. It discusses preparing data for mobile reports, using the SQL Server Mobile Report Publisher tool, and publishing mobile reports. The presenter has extensive experience with SQL Server and provides their qualifications. The presentation also provides information on optimizing reports, formatting time data, using filters and Excel files in reports, and designing reports using navigators and visualizations in the Mobile Report Publisher tool. It demonstrates the tool's interface and capabilities.
Dynamic data masking is a data protection feature in SQL Server 2016 that masks sensitive data in query results without altering the actual data. It can help protect private information by exposing only obfuscated data to unauthorized users. Administrators can configure masking rules for specific columns using various masking functions like default, email, random, or custom string masking. The underlying data remains intact but masked data is returned for users without unmask permissions. It provides data security with minimal performance impact by masking results on-the-fly.
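The masking rules described can be attached to existing columns; a sketch with hypothetical table and column names:

```sql
-- Built-in masking functions: email(), partial(prefix, padding, suffix), default().
ALTER TABLE dbo.Customers
    ALTER COLUMN Email ADD MASKED WITH (FUNCTION = 'email()');
ALTER TABLE dbo.Customers
    ALTER COLUMN Phone ADD MASKED WITH (FUNCTION = 'partial(0,"XXX-XXX-",4)');
ALTER TABLE dbo.Customers
    ALTER COLUMN CreditLimit ADD MASKED WITH (FUNCTION = 'default()');

-- Principals granted UNMASK see the real values; everyone else sees masked data.
GRANT UNMASK TO AuditorRole;
```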
Live Query Statistics and Query Store are new features in SQL Server 2016 that provide insights into query performance. Live Query Statistics allows users to view live execution plans and operator statistics to troubleshoot long-running or problematic queries. Query Store automatically captures query histories, plans, and runtime statistics to help users identify performance regressions and force previous high-performing plans. Both features aim to simplify performance troubleshooting and provide greater visibility into the query optimization and execution process.
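A brief sketch of the Query Store half: enabling it on a (hypothetical) database, then forcing a previously well-performing plan once its identifiers are known from the Query Store catalog views:

```sql
-- Turn on Query Store so query text, plans and runtime stats are captured.
ALTER DATABASE SalesDB SET QUERY_STORE = ON;
ALTER DATABASE SalesDB SET QUERY_STORE (OPERATION_MODE = READ_WRITE);

-- Hypothetical ids: query 42 regressed, plan 7 performed well previously.
-- Find real values in sys.query_store_query and sys.query_store_plan.
EXEC sp_query_store_force_plan @query_id = 42, @plan_id = 7;
```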
The document discusses building a data warehouse in SQL Server. It provides an agenda that covers topics like an overview of data warehousing, data warehouse design, dimension and fact tables, and physical design. It also discusses components of a data warehousing solution like the data warehouse database, ETL processes, and security considerations.
SQL Server 2016 introduces several new features for In-Memory OLTP including support for up to 2 TB of user data in memory, system-versioned tables, row-level security, and Transparent Data Encryption. The in-memory processing has also been updated to support more T-SQL functionality such as foreign keys, LOB data types, outer joins, and subqueries. The garbage collection process for removing unused memory has also been improved.
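A memory-optimized table using two of the 2016 additions listed above might be declared like this; the names are hypothetical, and the referenced dbo.Orders table is assumed to be memory-optimized as well, since foreign keys are supported only between memory-optimized tables:

```sql
CREATE TABLE dbo.OrderLines
(
    OrderLineId int IDENTITY PRIMARY KEY NONCLUSTERED,  -- hash/nonclustered only
    OrderId     int NOT NULL
        REFERENCES dbo.Orders (OrderId),  -- foreign keys: new for In-Memory in 2016
    Notes       nvarchar(max) NULL        -- LOB types: new for In-Memory in 2016
)
WITH (MEMORY_OPTIMIZED = ON, DURABILITY = SCHEMA_AND_DATA);
```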
This document discusses automating big data analytics processes. It describes the traditional "old school" approach of manually extracting, loading and transforming raw data. The document then presents two "new school" examples that automate these processes. The first automates extracting core website and social media data, enriching email addresses, and generating segments and metrics. The second integrates data from 70+ APIs in real-time, performs custom aggregations, and enables behavioral segmentation and messaging. The document concludes by soliciting questions and feedback on working with big data.
The document provides an overview of Data Quality Services (DQS) and Master Data Services (MDS) in SQL Server 2016. It discusses the key components and features of DQS for cleansing and matching data, and MDS for defining master data structures and maintaining master data. The document also outlines the agenda for a presentation on DQS and MDS, including demos of using DQS to cleanse data and MDS to create models, entities, and load data.
This document provides an introduction and overview of machine learning concepts and Azure Machine Learning. It defines machine learning as finding patterns in data and using those patterns to predict the future. It outlines the machine learning workflow and lifecycle, including preparing data, applying algorithms to find patterns, iterating to create the best model, and deploying the final model. It also describes machine learning concepts like supervised and unsupervised learning, and different problem types like regression, classification, and clustering. Finally, it discusses options for using Azure Machine Learning, including free and full-featured paid accounts, and demonstrates its use.
This document summarizes Seth Familian's presentation on working with big data. The presentation covers the context and definition of big data, useful tools for working with big data like Splunk, building dashboards from big data using techniques like scheduled searches and indices, inferring customer segments from big data using RFM scoring and crossing segments with categories, and final thoughts on data as a new interface and starting points for working with big data. The presentation contains examples and screenshots to illustrate the concepts discussed.
The document discusses designing teams and processes to adapt to changing needs. It recommends structuring teams so members can work within their competencies and across projects fluidly with clear roles and expectations. The design process should support the team and their work, and be flexible enough to change with team, organization, and project needs. An effective team culture builds an environment where members feel free to be themselves, voice opinions, and feel supported.
An immersive workshop at General Assembly, SF. I typically teach this workshop at General Assembly, San Francisco. To see a list of my upcoming classes, visit https://generalassemb.ly/instructors/seth-familian/4813
I also teach this workshop as a private lunch-and-learn or half-day immersive session for corporate clients. To learn more about pricing and availability, please contact me at http://familian1.com
ECMDay2015 - Kent Agerlund – Configuration Manager 2012 – A Site Review (Kenny Buntinx)
Ever experienced sluggish ConfigMgr administrator console performance or collections taking forever to refresh? Join Kent Agerlund as he walks you through a ConfigMgr site review and reveals why so many ConfigMgr installations don't perform as they should. This session will be packed with tips and tricks, SQL secrets and PowerShell scripts that will optimize your environment and bring ConfigMgr into the state it was supposed to be in from the beginning.
Ashnik EnterpriseDB PostgreSQL - A real alternative to Oracle (Ashnikbiz)
A technical introduction to PostgreSQL and Postgres Plus, the enterprise-class PostgreSQL database from EDB: a 'real' alternative to Oracle and other conventional proprietary databases.
This document discusses common mistakes made with SQL Server and how to avoid them. It covers topics like backups, consistency checks, log cleanup, statistics maintenance, index maintenance, memory settings, parallelism settings, TempDB configuration, alerts, and power settings. The author is Tim Radney, a SQL Server MVP, who provides recommendations and scripts for ensuring databases are properly maintained and optimized.
This session introduces tools that can help you analyze and troubleshoot performance with SharePoint 2013. This session presents tools like perfmon, Fiddler, Visual Round Trip Analyzer, IIS LogParser, Developer Dashboard and of course we create Web and Load Tests in Visual Studio 2013.
At the end we also take a look at some of the tips and best practices to improve performance on SharePoint 2013.
Optimizing SQL Server for Dynamics AX 2012 R3 (Juan Fabian)
This document provides guidance on sizing and configuring SQL Server, the Application Object Server (AOS), Enterprise Portal, and other components for a Microsoft Dynamics AX implementation. It includes recommendations for hardware sizing based on transaction volumes and user counts. It also describes best practices for SQL Server configuration settings, indexing, statistics maintenance, and other tasks to ensure optimal performance of the Dynamics AX database and system.
Oracle Database Connection for .NET Developers (veerendramb3)
Oracle Database 11g provides improved integration with Windows and .NET development. Key highlights include enhanced performance when running Oracle Database on Windows, easier development using Visual Studio tools, and unified management of Oracle and Microsoft servers.
Webinar slides: Our Guide to MySQL & MariaDB Performance Tuning (Severalnines)
If you’re asking yourself the following questions when it comes to optimally running your MySQL or MariaDB databases:
- How do I tune them to make best use of the hardware?
- How do I optimize the Operating System?
- How do I best configure MySQL or MariaDB for a specific database workload?
Then this replay is for you!
We discuss some of the settings that are most often tweaked and which can bring you significant improvement in the performance of your MySQL or MariaDB database. We also cover some of the variables which are frequently modified even though they should not be.
Performance tuning is not easy, especially if you’re not an experienced DBA, but you can go a surprisingly long way with a few basic guidelines.
This webinar builds upon blog posts by Krzysztof from the ‘Become a MySQL DBA’ series.
AGENDA
- What to tune and why?
- Tuning process
- Operating system tuning
- Memory
- I/O performance
- MySQL configuration tuning
- Memory
- I/O performance
- Useful tools
- Do's and don'ts of MySQL tuning
- Changes in MySQL 8.0
SPEAKER
Krzysztof Książek, Senior Support Engineer at Severalnines, is a MySQL DBA with experience managing complex database environments for companies like Zendesk, Chegg, Pinterest and Flipboard.
Technical Introduction to PostgreSQL and PPAS (Ashnikbiz)
Let's take a look at:
PostgreSQL and the buzz it has created
Architecture
Oracle Compatibility
Performance Feature
Security Features
High Availability Features
DBA Tools
User Stories
What’s coming up in v9.3
How to start adopting
This document discusses migrating an Oracle Database Appliance (ODA) from a bare metal to a virtualized platform. It outlines the initial situation, desired target, challenges, and solution approach. The key challenges included system downtime during the migration, backup/restore processes, using external storage, and database reorganizations. The solution involved first converting to a virtual platform and then upgrading, using backup/restore, attaching an NGENSTOR Hurricane storage appliance for direct attached storage, and moving database reorganizations to a separate maintenance window. It also discusses the odaback-API tool created to help automate and standardize the migration process.
This document discusses MySQL performance tuning and various MySQL products and features. It provides information on MySQL 5.6 including improved scalability, new InnoDB features for NoSQL access, and an improved optimizer. It also discusses MySQL Enterprise Monitor for performance monitoring, and the Performance Schema for instrumentation and monitoring internal operations.
VMworld Europe 2014: Advanced SQL Server on vSphere Techniques and Best Practices (VMworld)
This document provides an overview and agenda for a presentation on virtualizing SQL Server workloads on VMware vSphere. The presentation will cover designing SQL Server virtual machines for performance in production environments, consolidating multiple SQL Server workloads, and ensuring SQL Server availability using vSphere features. It emphasizes understanding the workload, optimizing for storage and network performance, avoiding swapping, using large memory pages, and accounting for NUMA when configuring SQL Server virtual machines.
The document discusses several high availability and disaster recovery options for SQL Server including failover clustering, database mirroring, log shipping, and replication. It provides examples of how different companies have implemented these technologies depending on their requirements. Key factors that influence architecture choices are downtime tolerance, deployment of technologies, and operational procedures. The document also covers SQL Server upgrade processes and how to move databases to a new datacenter while maintaining high availability.
Where to start? - the first 2 hours of performance troubleshooting
• The performance cheat sheet: cover all the basics before you start
• Data collections and mining the logs
• Common techniques to improve performance
Powering GIS Applications with PostgreSQL and Postgres Plus (Ashnikbiz)
This document provides an overview of Postgres Plus Advanced Server and its features. It begins with introductions to PostgreSQL and PostGIS. It then discusses Postgres Plus Advanced Server's Oracle compatibility, performance enhancements, security features, high availability options, database administration tools, and migration toolkit. The document also provides information on scaling Postgres Plus Advanced Server through partitioning and infinite cache technologies. It concludes with summaries of the replication capabilities of Postgres Plus Advanced Server.
This document provides an overview of SQL Server from 2000 to 2014, highlighting new features over time like XML, management studio, mirroring, and AlwaysOn. It also summarizes key capabilities of SQL Server 2014 like in-memory processing across workloads, hybrid cloud optimization, and integration with HDInsight and Power BI. The document discusses drivers for in-memory OLTP like declining memory costs and increasing cores, and how it provides up to 10x performance gains through its integration with SQL Server.
Maaz Anjum - IOUG Collaborate 2013 - An Insight into Space Realization on ODA...
The document provides an overview of Maaz Anjum, a solutions architect specializing in Oracle products like OEM12c, Golden Gate, and Engineered Systems. It lists his email, blog, and experience using Oracle products since 2001. It also provides details about Bias Corporation, the company he works for, including its founding date, certifications, expertise, customers, and implementations.
Modernizing Your Database with SQL Server 2019 discusses SQL Server 2019 features that can help modernize a database, including:
- The Hybrid Buffer Pool which supports persistent memory to improve performance on read-heavy workloads.
- Memory-Optimized TempDB Metadata which stores TempDB metadata in memory-optimized tables to avoid certain blocking issues.
- Intelligent Query Processing features like Adaptive Query Processing, Batch Mode processing on rowstores, and Scalar UDF Inlining which improve query performance.
- Approximate Count Distinct, a new function that provides an estimated count of distinct values in a column faster than a precise count.
- Lightweight profiling, enabled by default, which provides query plan and per-operator execution statistics with low overhead.
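The approximate-count function mentioned in the list can be compared directly against an exact count; a sketch on a hypothetical fact table:

```sql
-- APPROX_COUNT_DISTINCT trades a small, bounded error for lower memory
-- use and faster execution on large data sets (SQL Server 2019+).
SELECT COUNT(DISTINCT CustomerId)        AS exact_count,
       APPROX_COUNT_DISTINCT(CustomerId) AS approx_count
FROM dbo.FactSales;
```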
This document provides an overview of using Polybase for data virtualization in SQL Server. It discusses installing and configuring Polybase, connecting external data sources like Azure Blob Storage and SQL Server, using Polybase DMVs for monitoring and troubleshooting, and techniques for optimizing performance like predicate pushdown and creating statistics on external tables. The presentation aims to explain how Polybase can be leveraged to virtually access and query external data using T-SQL without needing to know the physical data locations or move the data.
Pre and Post Tips to Installing SQL Server Correctly
1. 51th AUTOEXEC.GR EVENT
Pre and Post Tips to Installing SQL Server Correctly
SQL Server 2008, 2008 R2, 2012, 2014
Antonios Chatzipavlis
Database Architect • SQL Server Evangelist • Trainer
MCT, MCSE, MCITP, MCPD, MCSD, MCDBA, MCSA, MCTS, MCAD, MCP, OCA, ITIL-F
Jan 22, 2015
2. Antonios Chatzipavlis
Database Architect • SQL Server Evangelist • Trainer • Speaker
MCT, MCSE, MCITP, MCPD, MCSD, MCDBA, MCSA, MCTS, MCAD, MCP, OCA, ITIL-F
1982 I started with computers.
1988 I started my professional career in the computer industry.
1996 I started working with SQL Server, version 6.0.
1998 I earned my first Microsoft certification as a Microsoft Certified Solution Developer (3rd in Greece) and started my career as a Microsoft Certified Trainer (MCT), with more than 20,000 hours of training to date!
2010 I became a Microsoft MVP on SQL Server for the first time and created SQL School Greece (www.sqlschool.gr).
2012 I became an MCT Regional Lead in the Microsoft Learning program.
2013 I was certified as MCSE: Data Platform and MCSE: Business Intelligence.
3. Follow us on social media
Twitter @antoniosch / @sqlschool
Facebook fb/sqlschoolgr
YouTube yt/user/achatzipavlis
LinkedIn SQL School Greece group
Pinterest pi/SQLschool/
6. 51th AUTOEXEC.GR EVENT
SQL Server places different demands on its underlying hardware depending on what type of database workload is running against the instance of SQL Server:
• OLTP workloads
• OLAP workloads
Keep in mind that very few database workloads are pure OLTP or pure DW, so you will often have to deal with mixed workload types. You also might have to host multiple databases on a single SQL Server instance, where each database has a different type of workload.
Understanding your Workload
7. 51th AUTOEXEC.GR EVENT
• Characterized by a high number of short-duration transactions and queries that are usually executed on a single thread of execution.
• They can have a higher percentage of write activity.
• The data in some tables can be extremely volatile.
• These characteristics have important implications for the hardware selection and configuration process.
OLTP Workloads
8. 51th AUTOEXEC.GR EVENT
• Characterized by longer-running queries against more static data.
• These queries are often parallelized by the query optimizer, so having a higher number of physical cores in your processors can be very beneficial.
• Having a large amount of physical RAM is very useful for DW workloads, because you will be able to keep more data in the SQL Server buffer cache, which reduces the read pressure on the I/O subsystem.
• Tends to have very little write activity.
• DW-type queries read large amounts of data as they calculate aggregates, so good sequential read I/O performance is very important. This will also affect how you configure your I/O subsystem in terms of storage type and RAID level.
OLAP Workloads
9. 51th AUTOEXEC.GR EVENT
• Server Processor Count Selection
• One common mistake is to assume that a "bigger" server in terms of physical processor count is a faster server compared to a smaller one.
• Processor Vendor Selection
• Intel or AMD
• Processor Model Selection
• The performance of SQL Server is hugely dependent on the size of the L2 and L3 caches.
• Economizing on the L2 and L3 cache size is not usually a good choice.
Processor Selection
10. 51th AUTOEXEC.GR EVENT
• The basic rule of thumb for SQL Server is that you can never have too much memory.
• Total number of memory slots: more slots is better.
• SMP vs NUMA
Memory Selection
12. 51th AUTOEXEC.GR EVENT
• Windows Server 2012 R2
• Highly recommended, especially for servers that need to be highly available.
• Avoid Windows Server 2008 R2 and older versions.
Choosing Windows OS
13. 51th AUTOEXEC.GR EVENT
• If you plan to use AlwaysOn AG it is important to apply the following patches:
• Windows Server 2008 R2 SP1 – KB2545685
• Windows Server 2012 – KB2784261
• Windows Server 2012 R2 – KB2920151
• See "Prerequisites, Restrictions, and Recommendations for AlwaysOn Availability Groups"
Apply patches and hotfixes on WinOS
14. 51th AUTOEXEC.GR EVENT
• SQL Server does not need a giant page file.
• If SQL Server is the main service on the box, a 2GB page file on the system drive is enough.
• Beware of removing the page file entirely (KB254649).
Configure the Windows page file
15. 51th AUTOEXEC.GR EVENT
• Confirm that the Windows power plan is set to High Performance.
• Confirm that the processors are running at full speed using CPU-Z.
Power Option
17. 51th AUTOEXEC.GR EVENT
• The server is public on the Internet.
• The server has open ports to servers that are not behind a firewall.
• The server reads or executes files from other servers.
• The server runs HTTP servers.
• The server hosts file shares.
• The server uses Database Mail to handle incoming or outgoing email messages.
Use Anti-Virus when…
18. 51th AUTOEXEC.GR EVENT
• Directories of SQL Server instance
• SQL Server data files
• SQL Server backup files
• Full-Text catalog files
• Trace files
• SQL audit files
• SQL query files
• SQL Server service
Setting Anti-Virus exclusions
20. 51th AUTOEXEC.GR EVENT
• Minimum RAID 1 for all drives
• Including the OS system drive
• Even for SSD or PCI-Express storage
• RAID 10 for best performance
• Use a 128GB drive for the OS system drive
• Test the I/O performance with SQLIO/SQLIOSIM
Use RAID
21. 51th AUTOEXEC.GR EVENT
• OS System drive should be formatted with the default (4K)
cluster size.
• All drives holding data & log files should be formatted with
64K cluster size
• Check your storage for partition alignment
• Follow this rule even if it’s a VM on shared storage
Disk Drive Format
22. 51th AUTOEXEC.GR EVENT
• SQL Server application folders
• SQL Server database data files
• SQL Server database log file
• including TempDB
• TempDB data files
• Backups
Use separate drive for
24. 51th AUTOEXEC.GR EVENT
• The connectivity with SQL Server is important!
• When you have Failover Clustering or Availability Groups
• It's a good practice even for a standalone server
• Teaming NICs
Use Multiple Physical Network Cards when
26. 51th AUTOEXEC.GR EVENT
• Use dedicated domain user account with no special rights on
the domain.
• You do not need or want this account to be a local admin on
the machine where SQL Server will be installed.
• Use a separate, dedicated domain user account for the SQL
Server Agent service.
• If you are going to be installing and using other SQL Server
related services, you will want dedicated domain accounts for
each service.
SQL Server services accounts
27. 51th AUTOEXEC.GR EVENT
• Enable Instant File Initialization
• Perform Volume Maintenance Tasks
• Grant Lock pages in memory
• Common on SQL Server 2005 / Windows 2003
• Less common with newer versions
(Still it is a good idea to enable LPIM on a new system)
• Add the permissions to the Service Account in AD - KB319723
• readServicePrincipalName
• writeServicePrincipalName
Policy Settings and Rights for the SQL Server service account
29. 51th AUTOEXEC.GR EVENT
• Install only the features you actually need
• This will reduce your attack surface
• It will speed future maintenance of the instance because there are fewer
components to patch
• Install Service Packs or CUs
• Enter a strong password for the sa account if you choose
Mixed Mode authentication
• Set the Data Directories according to plan
• Do not use C: drive
SQL Server Installation
31. 51th AUTOEXEC.GR EVENT
• 1118
• This trace flag switches allocations in tempdb from single pages at a time for the first 8 pages to immediately allocating an extent (8 pages).
• 2371
• This trace flag controls when the query optimizer generates auto-stats on a table, lowering the auto-update threshold for large tables.
Trace Flags to enable
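As a sketch, both flags can be enabled globally with DBCC TRACEON; to make them survive a restart, add them as -T startup parameters in SQL Server Configuration Manager instead.

```sql
-- Enable tempdb uniform extent allocation (1118) and the dynamic
-- auto-update statistics threshold (2371) for the whole instance.
DBCC TRACEON (1118, -1);
DBCC TRACEON (2371, -1);

-- Verify which global trace flags are currently active.
DBCC TRACESTATUS (-1);
```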
32. 51th AUTOEXEC.GR EVENT
• Enable compressed backups
• Set the default backup media retention (days)
• Set the database default locations for:
• Data files
• Log files
• Backups
Server Properties
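The first two settings can also be applied with sp_configure instead of the Server Properties dialog; a minimal sketch (the 30-day retention is illustrative, and the default file locations are instance settings changed via SSMS or the registry):

```sql
-- Turn on backup compression by default and keep backup media for 30 days.
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'backup compression default', 1;
EXEC sp_configure 'media retention', 30;
RECONFIGURE;
```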
33. 51th AUTOEXEC.GR EVENT
• Max Worker Threads
• Priority Boost
• Lightweight Pooling
• Maximum number of concurrent connections
• Network Packet Size
Server Properties
34. 51th AUTOEXEC.GR EVENT
• Set Max Server Memory
• Important when LPIM is enabled
• Use this formula to calculate the SQL Server memory reservation:
• Reserve 1GB for the OS
• Reserve 1GB for each 4GB after the first 4GB and until 16GB
• Reserve 1GB for each 8GB after the first 16GB
• Monitor the Memory: Available MBytes performance counter
Example for a server with 64GB of memory:
1GB for the OS + 3GB for 4-16GB + 6GB for 16-64GB = 10GB reserved in total
64 - 10 = 54GB Max SQL Server memory
SQL Server Memory
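Applying the 64GB worked example above as a sketch, the 54GB cap is set in megabytes:

```sql
-- Reserve 10GB for the OS and cap SQL Server at 54GB (54 * 1024 MB).
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'max server memory (MB)', 55296;
RECONFIGURE;
```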
35. 51th AUTOEXEC.GR EVENT
• Change the default size for data and log files
• Change the file growth to fixed units
Tweak Model database
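A minimal sketch of both tweaks (the sizes are illustrative, not recommendations; modeldev and modellog are the default logical file names of model):

```sql
-- Give model a sensible starting size and fixed-unit growth,
-- so every new database inherits these settings.
ALTER DATABASE model MODIFY FILE (NAME = modeldev, SIZE = 512MB, FILEGROWTH = 256MB);
ALTER DATABASE model MODIFY FILE (NAME = modellog, SIZE = 256MB, FILEGROWTH = 128MB);
```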
36. 51th AUTOEXEC.GR EVENT
• Move TempDB to its own drive
• Grow the size of the data file
• Add additional data files to match the number of logical processors, up to 8 logical CPUs
• Each file must have the same size
• Pre-allocate the space.
• KB2154845
Configure TempDB
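A sketch for a 4-logical-CPU server: tempdb moved to its own drive with four equally sized, pre-allocated data files (the T: paths and sizes are illustrative):

```sql
-- Move the existing files and pre-allocate space; an instance restart
-- is required before the new file locations take effect.
ALTER DATABASE tempdb MODIFY FILE
    (NAME = tempdev, FILENAME = 'T:\TempDB\tempdb.mdf', SIZE = 2GB, FILEGROWTH = 512MB);
ALTER DATABASE tempdb MODIFY FILE
    (NAME = templog, FILENAME = 'T:\TempDB\templog.ldf', SIZE = 1GB, FILEGROWTH = 512MB);

-- Add one data file per logical CPU (up to 8), all the same size.
ALTER DATABASE tempdb ADD FILE
    (NAME = tempdev2, FILENAME = 'T:\TempDB\tempdb2.ndf', SIZE = 2GB, FILEGROWTH = 512MB);
ALTER DATABASE tempdb ADD FILE
    (NAME = tempdev3, FILENAME = 'T:\TempDB\tempdb3.ndf', SIZE = 2GB, FILEGROWTH = 512MB);
ALTER DATABASE tempdb ADD FILE
    (NAME = tempdev4, FILENAME = 'T:\TempDB\tempdb4.ndf', SIZE = 2GB, FILEGROWTH = 512MB);
```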
37. 51th AUTOEXEC.GR EVENT
• Set this to the number of physical cores in a single NUMA node (socket) on your hardware, or less.
• Always use an even value.
• Use the value of 1 only if you have specific vendor requirements:
• SharePoint
• BizTalk
• SAP
• KB2806353
Configure MAXDOP
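A sketch for a server with 8 physical cores per NUMA node (the value 8 is illustrative; size it to your own hardware):

```sql
-- Cap parallelism at the size of one NUMA node.
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'max degree of parallelism', 8;
RECONFIGURE;
```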
38. 51th AUTOEXEC.GR EVENT
• The default value of 5 is too low for most OLTP workloads and should be increased.
• A base value of 20-25 works for most server installs.
Cost Threshold of Parallelism
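As a sketch, raising the threshold to the suggested base value so that only genuinely expensive plans go parallel:

```sql
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'cost threshold for parallelism', 25;
RECONFIGURE;
```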
39. 51th AUTOEXEC.GR EVENT
• Controls the amount of memory that is used by single-use, ad hoc query plans in the plan cache.
• When enabled, SQL Server stores only a small stub of an ad hoc query plan in the plan cache the first time the plan is executed.
• This reduces the memory required for that plan in the plan cache.
• It's not a panacea for single-use ad hoc query plans.
• http://www.sqlschool.gr/blog/do-you-have-optimize-for-ad-hoc-workloads-on-sql-server-2008-r2-instances-380.aspx
Optimize for Ad-hoc workloads
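The setting above is enabled with a single sp_configure call:

```sql
-- Store only plan stubs for first-time ad hoc queries.
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'optimize for ad hoc workloads', 1;
RECONFIGURE;
```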
40. 51th AUTOEXEC.GR EVENT
• Enable TCP/IP
• Firewall exceptions:
• The TCP port of the instance
• UDP 1434 for the SQL Server Browser
SQL Server Network Connectivity
41. 51th AUTOEXEC.GR EVENT
• Configure Database Mail
• Create Operators
• Configure SQL Agent to use Database Mail
• Create Alerts for Severity 16 to 25
• Create Alerts for Errors 823, 824, 825
• Add Ola Hallengren's Maintenance Solution
• Install Adam Machanic's sp_WhoIsActive
• Install and run Brent Ozar's sp_Blitz
Configure Alerting and Monitoring
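A sketch of one of the alerts above: error 823 wired to a 'DBA Team' operator (the operator name is illustrative and assumed to exist already; repeat the pattern for 824, 825 and severities 16 to 25):

```sql
-- Alert on error 823 (I/O hard error) and e-mail the operator.
EXEC msdb.dbo.sp_add_alert
    @name = N'Error 823 - I/O hard error',
    @message_id = 823,
    @severity = 0,
    @include_event_description_in = 1;

EXEC msdb.dbo.sp_add_notification
    @alert_name = N'Error 823 - I/O hard error',
    @operator_name = N'DBA Team',
    @notification_method = 1;  -- 1 = e-mail
```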
43. 51th AUTOEXEC.GR EVENT
• Don't use the default file sizes
• Don't use a percentage as the file growth
• Pay attention to T-Log size and growth, to produce equal VLFs
• Use more than one filegroup
• Leave only system objects in PRIMARY
• Put all user objects in another filegroup
• Use more than one data file
• Even if they are on the same drive
Create Database
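The checklist above can be sketched in one CREATE DATABASE statement (the database name, paths and sizes are illustrative):

```sql
-- Pre-sized files, fixed-unit growth, two data files, and a separate
-- user filegroup so PRIMARY keeps only system objects.
CREATE DATABASE SalesDB
ON PRIMARY
    (NAME = SalesDB_sys,   FILENAME = 'D:\Data\SalesDB_sys.mdf',   SIZE = 256MB, FILEGROWTH = 128MB),
FILEGROUP UserData
    (NAME = SalesDB_data1, FILENAME = 'D:\Data\SalesDB_data1.ndf', SIZE = 4GB,   FILEGROWTH = 512MB),
    (NAME = SalesDB_data2, FILENAME = 'D:\Data\SalesDB_data2.ndf', SIZE = 4GB,   FILEGROWTH = 512MB)
LOG ON
    (NAME = SalesDB_log,   FILENAME = 'L:\Log\SalesDB_log.ldf',    SIZE = 4GB,   FILEGROWTH = 512MB);

-- Send all new user objects to the UserData filegroup by default.
ALTER DATABASE SalesDB MODIFY FILEGROUP UserData DEFAULT;
```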
44. 51th AUTOEXEC.GR EVENT
• Don't enable Auto Close
• Don't enable Auto Shrink
• Don't disable Auto Create/Update Statistics
Database Properties
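As a sketch, the three properties can be enforced per database with ALTER DATABASE (SalesDB is an illustrative name):

```sql
ALTER DATABASE SalesDB SET AUTO_CLOSE OFF;
ALTER DATABASE SalesDB SET AUTO_SHRINK OFF;
ALTER DATABASE SalesDB SET AUTO_CREATE_STATISTICS ON;
ALTER DATABASE SalesDB SET AUTO_UPDATE_STATISTICS ON;
```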