The document discusses technologies within the Microsoft SQL family and Azure SQL that can help organizations address requirements of the General Data Protection Regulation (GDPR). It covers features for discovering and classifying personal data, managing access and controlling how data is used, and protecting data through encryption, auditing and other security controls. Built-in technologies like dynamic data masking, row-level security, authentication options, and transparent data encryption are described as ways SQL Server and Azure SQL Database can help organizations comply with GDPR.
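As a sketch of the masking and row-level security features mentioned above, in T-SQL (the table, column, and function names here are hypothetical, not taken from the document):

```sql
-- Dynamic data masking: non-privileged users see obfuscated e-mail values.
ALTER TABLE dbo.Customers
    ALTER COLUMN Email ADD MASKED WITH (FUNCTION = 'email()');
GO

-- Row-level security: an inline, schema-bound predicate function lets each
-- sales rep see only rows tagged with their own user name.
CREATE FUNCTION dbo.fn_SalesRepPredicate (@RepName AS sysname)
RETURNS TABLE
WITH SCHEMABINDING
AS
RETURN SELECT 1 AS allowed WHERE @RepName = USER_NAME();
GO

CREATE SECURITY POLICY dbo.SalesFilter
    ADD FILTER PREDICATE dbo.fn_SalesRepPredicate(RepName) ON dbo.Orders
    WITH (STATE = ON);
```

Both features filter or mask data at the engine level, so no application code changes are needed to limit what each principal can see.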
This document provides a summary of Antonios Chatzipavlis's background and experience working with SQL Server. It details his career starting with SQL Server 6.0 in 1996 and earning his first Microsoft certification. It lists the various Microsoft certifications and roles he has held, including becoming an MVP for SQL Server. It also introduces his creation of SQL School Greece in 2012 to share his knowledge.
Azure SQL Database for the SQL Server DBA - Azure Bootcamp Athens 2018 - Antonios Chatzipavlis
Azure SQL Database is a managed database service hosted in Microsoft's Azure cloud. Some key differences from SQL Server include: the service is paid by the hour based on the selected service tier; users can dynamically scale resources up or down; backups and high availability are managed by the service provider; and common administration tasks are handled by the provider rather than the user. The service offers automatic backups, point-in-time restore, and geo-restore capabilities along with built-in high availability through replication across three copies in the primary region.
SQL Server 2016 includes several new features such as columnstore indexes, in-memory OLTP, live query statistics, temporal tables, and row-level security. It also features improved managed backup functionality, support for multiple tempdb files at installation, and new options for formatting query results (such as JSON output) and encrypting data. Advanced capabilities like PolyBase and Stretch Database further enhance analytics and management of historical data.
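A minimal sketch of the temporal tables feature mentioned above (table and column names are illustrative):

```sql
-- System-versioned temporal table: the engine keeps row history automatically.
CREATE TABLE dbo.Product
(
    ProductId int          NOT NULL PRIMARY KEY CLUSTERED,
    Price     decimal(9,2) NOT NULL,
    ValidFrom datetime2 GENERATED ALWAYS AS ROW START NOT NULL,
    ValidTo   datetime2 GENERATED ALWAYS AS ROW END   NOT NULL,
    PERIOD FOR SYSTEM_TIME (ValidFrom, ValidTo)
)
WITH (SYSTEM_VERSIONING = ON (HISTORY_TABLE = dbo.ProductHistory));

-- Query the table as it looked at a point in time.
SELECT ProductId, Price
FROM dbo.Product
FOR SYSTEM_TIME AS OF '2016-06-01T00:00:00';
```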
This document summarizes new features in SQL Server 2016. It discusses improvements to columnstore indexes, in-memory OLTP, the query store, temporal tables, always encrypted, stretch database, live query statistics, row level security, and dynamic data masking. It provides links to documentation and demos for these features. It also suggests what may be included in future CTP releases and lists resources for learning more about SQL Server 2016.
SQL Server 2016: Just a Few of Our DBA's Favorite Things - Hostway|HOSTING
Join Rodney Landrum, Senior DBA Consultant for Ntirety, a division of HOSTING, as he demonstrates his favorite new features of the latest Microsoft SQL Server 2016 Service Pack 1.
During the accompanying webinar and slides, Rodney will touch on the following:
• A demo of his favorite new features in SQL Server 2016 and SP1 including:
o Query Store
o Database Cloning
o Dynamic Data Masking
o Create or Alter
• A review of Enterprise features that are now available in standard edition
• New information in Dynamic Management Views and the SQL Error Log that will make your DBA's job easier.
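The Database Cloning and Create or Alter items above can be sketched in T-SQL (the database and procedure names are hypothetical):

```sql
-- CREATE OR ALTER (new in 2016 SP1): deploy a procedure idempotently,
-- with no need for a preceding IF EXISTS / DROP dance.
CREATE OR ALTER PROCEDURE dbo.GetOrderCount
AS
    SELECT COUNT(*) AS OrderCount FROM dbo.Orders;
GO

-- Database cloning: copies schema and statistics but no data, which is
-- handy for reproducing optimizer behavior without the production database.
DBCC CLONEDATABASE (AdventureWorks2016, AdventureWorks2016_Clone);
```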
SQL Server 2016 introduces new editions that provide varying levels of capabilities for different workloads. The key editions are Express, Standard, and Enterprise. Express is free and ideal for small applications. Standard provides core data management and business intelligence. Enterprise delivers comprehensive datacenter capabilities for mission critical workloads and advanced analytics. All editions now support new security features and hybrid cloud capabilities like stretch database.
Azure SQL Database is a relational database-as-a-service hosted in the Azure cloud that reduces costs by eliminating the need to manage virtual machines, operating systems, or database software. It provides automatic backups, high availability through geo-replication, and the ability to scale performance by changing service tiers. Azure Cosmos DB is a globally distributed, multi-model database that supports automatic indexing, multiple data models via different APIs, and configurable consistency levels with strong performance guarantees. Azure Redis Cache uses the open-source Redis data structure store with managed caching instances in Azure for improved application performance.
This document discusses Live Query Statistics and the Query Store in Microsoft SQL Server 2016 for troubleshooting query performance issues. Live Query Statistics allows viewing execution plans and metrics of in-flight queries. The Query Store provides a dedicated store for query performance data, capturing plan histories and metrics to help identify regressed queries and other issues. Enabling these tools helps DBAs monitor performance and address issues like slow queries and plans impacted by upgrades or data changes.
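Enabling the Query Store and reading back its captured metrics might look like this; the top-duration query below is an illustrative sketch, not taken from the presentation:

```sql
-- Turn on the Query Store for the current database.
ALTER DATABASE CURRENT SET QUERY_STORE = ON
    (OPERATION_MODE = READ_WRITE, QUERY_CAPTURE_MODE = AUTO);
GO

-- Top 5 statements by average duration in the captured history.
SELECT TOP (5)
       qt.query_sql_text,
       rs.avg_duration
FROM sys.query_store_query_text      AS qt
JOIN sys.query_store_query           AS q  ON q.query_text_id = qt.query_text_id
JOIN sys.query_store_plan            AS p  ON p.query_id      = q.query_id
JOIN sys.query_store_runtime_stats   AS rs ON rs.plan_id      = p.plan_id
ORDER BY rs.avg_duration DESC;
```

Because plan histories persist across restarts, the same catalog views can be used to spot a query whose plan regressed after an upgrade or statistics change.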
Confoo 202 - MySQL Group Replication and ReplicaSet - Dave Stokes
MySQL Group Replication, ReplicaSet, & Architectures outlines MySQL's general product direction for high availability and replication. It provides an overview of Group Replication, ReplicaSet, and related components like MySQL Shell and MySQL Router. Key capabilities discussed include automated setup and management, integrated load balancing, and both asynchronous and synchronous replication options. Limitations noted include the requirement for manual failover in ReplicaSet deployments.
The document discusses MySQL Enterprise Edition and provides an agenda that includes: Why MySQL Enterprise?, Management Tools, Advanced Features, Technical Support & Certifications, and Case Studies. It then goes into further detail on each of these sections, providing information on the various tools, features, support offerings, and customer examples of MySQL Enterprise Edition.
Azure SQL Database Introduction by Tim Radney - Hasan Savran
Have you been hearing about Azure Managed Instances and want to know what all the fuss is about? Come see how Managed Instances is changing how we think about cloud databases. Managed Instances can be considered a hybrid of Azure SQL Database and on-premises SQL Server with all the awesome benefits of Platform as a Service. You’ll get to see first-hand how easy it is to migrate databases from on-premises to a Managed Instance. We’ll explore the differences between Azure SQL Database, Managed Instances, and SQL Server on an Azure VM to help you determine what is the best fit for your organization. If you’ve been considering Azure for your organization, this session is for you!
The document discusses SQL Server monitoring and troubleshooting. It provides an overview of SQL Server monitoring, including why it is important and common monitoring tools. It also describes the SQL Server threading model, including threads, schedulers, states, the waiter list, and runnable queue. Methods for using wait statistics like the DMVs sys.dm_os_waiting_tasks and sys.dm_os_wait_stats are presented. Extended Events are introduced as an alternative to SQL Trace. The importance of establishing a performance baseline is also noted.
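A typical wait-statistics query against the two DMVs named above; the excluded wait types are an illustrative, not exhaustive, list of benign idle waits:

```sql
-- Cumulative waits since the last service restart: where does time go?
SELECT TOP (10)
       wait_type,
       wait_time_ms,
       waiting_tasks_count
FROM sys.dm_os_wait_stats
WHERE wait_type NOT IN (N'SLEEP_TASK', N'BROKER_TO_FLUSH',
                        N'LAZYWRITER_SLEEP', N'XE_TIMER_EVENT')
ORDER BY wait_time_ms DESC;

-- Who is on the waiter list right now, and is anyone blocking them?
SELECT session_id, wait_type, wait_duration_ms, blocking_session_id
FROM sys.dm_os_waiting_tasks
WHERE session_id > 50;  -- skip most system sessions
```

Comparing snapshots of these numbers against an established baseline is what makes the raw counters actionable.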
The document provides an overview and summary of new features in Microsoft SQL Server 2016. It discusses enhancements to the database engine, in-memory OLTP, columnstore indexes, R services, high availability, security, and Reporting Services. Key highlights include support for up to 2TB of durable memory-optimized tables, increased index key size limits, temporal data support, row-level security, and improved integration with Azure and Power BI capabilities. The presentation aims to help users understand and leverage the new and improved features in SQL Server 2016.
This document provides an overview of Always Encrypted in Microsoft SQL Server 2016, which allows customers to securely store sensitive data outside of their trust boundary while protecting data from highly privileged users. Key capabilities of Always Encrypted include client-side encryption of sensitive data using keys never provided to the database system and support for queries on encrypted data, with minimal application changes required.
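A sketch of what an Always Encrypted column definition looks like; the column encryption key name is hypothetical, and the key itself would normally be provisioned beforehand through the SSMS wizard or PowerShell:

```sql
-- Assumes a column encryption key CEK_Auto1 already exists in this database.
-- Deterministic encryption allows equality lookups; randomized does not,
-- but leaks less information. Deterministic char columns need a BIN2 collation.
CREATE TABLE dbo.Patients
(
    PatientId int IDENTITY PRIMARY KEY,
    SSN char(11) COLLATE Latin1_General_BIN2
        ENCRYPTED WITH (COLUMN_ENCRYPTION_KEY = CEK_Auto1,
                        ENCRYPTION_TYPE = DETERMINISTIC,
                        ALGORITHM = 'AEAD_AES_256_CBC_HMAC_SHA_256'),
    BirthDate date
        ENCRYPTED WITH (COLUMN_ENCRYPTION_KEY = CEK_Auto1,
                        ENCRYPTION_TYPE = RANDOMIZED,
                        ALGORITHM = 'AEAD_AES_256_CBC_HMAC_SHA_256')
);
```

Encryption and decryption happen in the client driver, so the plaintext and the keys never reach the database engine, which is what keeps the data out of reach of privileged server-side users.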
SQL or NoSQL, is this the question? - George Grammatikos
This document provides an overview and comparison of SQL and NoSQL databases. It lists the most popular databases according to a Stack Overflow survey, including SQL databases like Azure SQL and NoSQL databases like Azure Cosmos DB. It then defines RDBMS and NoSQL databases and provides examples of relational and non-relational data models. The document compares features of SQL and NoSQL databases such as scalability, performance, data modeling flexibility and pricing. It also includes live demo instructions for provisioning Azure SQL and Cosmos DB databases.
SQL Server 2016 New Features and Enhancements - John Martin
SQL Server 2016 new features session that I delivered at SQL Relay 2015 in Reading, London, Cardiff, and Birmingham.
Looking at some of the new features currently slated for inclusion in the next version of Microsoft SQL Server 2016.
Demo Code can be found at: http://1drv.ms/1PC5smY
This document provides an overview of MySQL, including its architecture, components, features, and functionalities. MySQL is an open-source relational database management system (RDBMS) that is flexible, scalable, and free to download. It follows a client-server model and can handle large amounts of data. Key features include security, compatibility with many operating systems, and high performance even under demanding workloads.
A walkthrough on implementing Always Encrypted Encryption on sensitive information to reduce your attack surface area and develop an active data security posture.
Microsoft Cloud GDPR Compliance Options (SUGUK) - Andy Talbot
The presentation provides an overview of GDPR and how organizations can accelerate compliance using Microsoft cloud services. It discusses the key changes introduced by GDPR including enhanced personal privacy rights, increased duty to protect data, mandatory breach reporting, and significant penalties for non-compliance. It then outlines how Microsoft can help organizations discover, manage, protect, and report personal data through solutions like Azure, Office 365, and Enterprise Mobility + Security.
The document discusses strategies for complying with the EU's General Data Protection Regulation (GDPR). It outlines five critical strategies: 1) Know all personal data stored, 2) Carefully manage access to personal data, 3) Encrypt as much data as possible, 4) Monitor changes affecting sensitive data and prevent critical changes, and 5) Investigate potential breaches. It also discusses how the software company Quest can help customers strengthen data protection, ensure compliance, and avoid fines through solutions that secure and manage data, modernize infrastructure, and provide insights.
Modern Data Security for the Enterprises – SQL Server & Azure SQL Database - WinWire Technologies Inc
The webinar talked about the layers of data protection, important security features, potential scenarios in which these features can be applied to limit exposure to security threats, and best practices for securing business applications and data. We covered the following topics on SQL Server 2016 and Azure SQL Database security features:
• Access Level Control
• Data Encryption
• Monitoring
The document discusses Microsoft's offerings and expertise to help organizations achieve compliance with the General Data Protection Regulation (GDPR). The GDPR imposes new rules for handling personal data and increases penalties for non-compliance. Microsoft is committed to GDPR compliance across its cloud services and helping customers meet requirements related to privacy controls, security, and transparency. It provides solutions to help organizations discover, manage, protect, and report on personal data throughout the compliance process.
Breakdown of Microsoft Purview Solutions - Drew Madelung
Drew Madelung presented on Microsoft Purview solutions at 365EduCon Seattle 2023. Purview is a set of solutions that help organizations govern and protect data across multi-cloud environments while meeting compliance requirements. It brings together solutions for understanding data, safeguarding it wherever it lives, and improving risk and compliance posture. Madelung demonstrated Purview's capabilities for classification, information protection, insider risk management, data loss prevention, records management, eDiscovery, auditing, and more. He advocated adopting Purview to comprehensively govern data using an incremental crawl-walk-run strategy.
This document provides information on database security. It discusses how database security protects confidentiality, integrity and availability of databases. It also discusses the importance of database security to prevent data loss or compromise. Some of the largest data breaches in 2018 are summarized, including breaches of Aadhaar and Facebook that exposed over 1 billion and 87 million records respectively. Common attack vectors and frameworks for implementing database security are referenced. Finally, the document outlines a methodology for implementing proven database security practices around inventory, testing, compliance, eliminating vulnerabilities, enforcing least privileges, monitoring for anomalies, data protection, backup plans, and responding to incidents.
This document discusses Microsoft Cloud Deutschland and how it aims to provide a secure cloud solution for German customers that complies with German data protection laws. It begins with an introduction and overview of current privacy and security issues. It then discusses Microsoft Cloud Deutschland in more detail, describing its security features and certifications. It also discusses how Microsoft is preparing customers for the upcoming GDPR regulations through solutions in Azure, Azure AD, and Enterprise Mobility + Security.
The data services marketplace is enabled by a data abstraction layer that supports rapid development of operational applications and single-data-view portals. In this presentation you will learn about the services-based reference architecture and the modality and latency of data access.
- Reference architecture for enterprise data services marketplace
- Modality and latency of data access
- Customer use cases and demo
This presentation is part of the Denodo Educational Seminar, and you can watch the video at goo.gl/vycYmZ.
The document discusses adopting a Zero Trust approach to IT security. It outlines some of the key principles of Zero Trust, including explicitly verifying identities rather than assuming trust, treating identities as the new perimeter, and basing access decisions on attributes like user, device, app, location, and risk. The document provides an overview of Microsoft's Zero Trust framework and reference architecture. It also shares a maturity model to help organizations assess their Zero Trust progress and prioritize next steps.
Dr. Wei Chen discusses database security. The three components of database security are confidentiality, integrity, and availability (CIA). Confidentiality involves protecting data from unauthorized disclosure through encryption and access controls. Integrity ensures data is not tampered with using hashing and signing. Availability ensures authorized users can access data when needed through backups and DDoS protection. Mobile database security poses additional challenges due to devices leaving secure networks. Encrypting sensitive data and using device authentication can help. Content providers allow sharing data between apps if necessary but increase security risks. Auditing, access controls, and input validation are important defenses against threats like SQL injection.
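As an example of the parameterization defense against SQL injection mentioned above (table and variable names are hypothetical):

```sql
-- Vulnerable pattern: concatenating user input straight into dynamic SQL
-- lets an attacker inject their own statements.
-- DECLARE @sql nvarchar(max) =
--     N'SELECT * FROM dbo.Users WHERE Name = ''' + @input + N'''';

-- Safer: pass the value as a parameter, so it can only ever be data.
DECLARE @input nvarchar(50) = N'O''Brien';
EXEC sys.sp_executesql
     N'SELECT * FROM dbo.Users WHERE Name = @name',
     N'@name nvarchar(50)',
     @name = @input;
```

Parameterization also lets the engine reuse a single cached plan for all input values instead of compiling a new one per string.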
A robust and verifiable threshold multi-authority access control system in pu... - IJARIIT
Attribute-based Encryption (ABE) is regarded as a promising cryptographic tool for guaranteeing data owners' direct control over their data in public cloud storage. Earlier ABE schemes rely on a single authority to maintain the whole attribute set, which creates a single-point bottleneck for both security and performance. Several multi-authority schemes have since been proposed, in which multiple authorities separately maintain disjoint attribute subsets; however, the single-point bottleneck problem remains unsolved. In this survey paper, we approach the problem from another perspective and present a threshold multi-authority CP-ABE access control scheme for public cloud storage, named TMACS, in which multiple authorities jointly manage a uniform attribute set. In TMACS, by taking advantage of (t, n) threshold secret sharing, the master key can be shared among multiple authorities, and a legal user can generate his or her secret key by interacting with any t of them. Security and performance analysis shows that TMACS is not only verifiably secure when fewer than t authorities are compromised, but also robust as long as at least t authorities remain alive in the system. Furthermore, by efficiently combining the traditional multi-authority scheme with TMACS, we construct a hybrid scheme that supports attributes coming from different authorities while achieving both security and system-level robustness.
The document provides an overview of AWS security presented by Max Ramsay. It discusses AWS security capabilities that are available to all customers regardless of business type. It focuses on case studies of how Serasa Experian and Trend Micro use AWS, highlighting benefits like agility, flexibility and cost reduction. The document also covers shared security responsibilities on AWS, compliance controls, network security features, and resources for learning more about AWS security best practices.
This document discusses security risks associated with cloud computing and databases. The main security risks are data breaches, data loss, and service hijacking that can occur when sensitive data is stored in cloud databases. Two examples of past data breaches at large companies, Home Depot and Target, are described along with the steps they took to strengthen security and regain customer trust. Methods to overcome security challenges in cloud computing discussed are encrypting data, implementing strong key management practices, and giving users control over their encryption keys.
In early 2019, Microsoft created the AZ-900 Microsoft Azure Fundamentals certification. This is a certification for all individuals, from IT or non-IT backgrounds, who want to further their careers and learn how to navigate the Azure cloud platform.
Learn about AZ-900 exam concepts and how to prepare and pass the exam
MongoDB.local Sydney: The Changing Face of Data Privacy & Ethics, and How Mon... - MongoDB
Public concern for the safety of data is growing – not just in how criminals might use stolen data to commit fraud, but also in how personal data is used by the organisations we engage with. This is limiting growth in digital services, and damaging trust in government and enterprises.
The EU's General Data Protection Regulation (GDPR) came into force in May 2018. Now it is influencing new privacy regulations around the world, governing how organisations collect, store, process, retain, and share the personal data of citizens.
In this session, we explore the specific data management requirements demanded by new privacy regulations, digital ethics, and everyone's role in being conscientious stewards of customer data. We discuss how MongoDB can provide the core technology foundations to help you accelerate your path to compliance with new privacy demands.
The cloud computing market is marked by many myths and many competing definitions. This session gives an overview of the main elements from Microsoft's perspective: how Microsoft handles cybercrime, protection of privacy, compliance with applicable legislation, international standards, and so on. It also offers advice on what to consider when assessing how the cloud might help your organization, and not least how, in many cases, a markedly higher level of security can be achieved by implementing pure or hybrid cloud solutions.
The document summarizes key points from a presentation on cloud security standards. It discusses the benefits of standards in promoting interoperability and regulatory compliance. It analyzes the current landscape of standards, including specifications, advisory standards, and security frameworks. It also provides recommendations for 10 steps customers can take to evaluate a cloud provider's security, including ensuring governance and compliance, auditing processes, managing access controls, and assessing physical infrastructure security. The document recommends cloud security standards and certifications customers should expect providers to support.
This document provides an overview of using Polybase for data virtualization in SQL Server. It discusses installing and configuring Polybase, connecting external data sources like Azure Blob Storage and SQL Server, using Polybase DMVs for monitoring and troubleshooting, and techniques for optimizing performance like predicate pushdown and creating statistics on external tables. The presentation aims to explain how Polybase can be leveraged to virtually access and query external data using T-SQL without needing to know the physical data locations or move the data.
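A minimal PolyBase external-table sketch along the lines described above; the data source name, credential, storage account, and paths are placeholders:

```sql
-- External data source pointing at Azure Blob Storage.
CREATE EXTERNAL DATA SOURCE AzureBlob
WITH (TYPE = HADOOP,
      LOCATION = 'wasbs://data@myaccount.blob.core.windows.net',
      CREDENTIAL = AzureStorageCredential);

CREATE EXTERNAL FILE FORMAT CsvFormat
WITH (FORMAT_TYPE = DELIMITEDTEXT,
      FORMAT_OPTIONS (FIELD_TERMINATOR = ','));

-- The external table exposes the files as a relational object; plain T-SQL
-- queries against it make PolyBase fetch the data behind the scenes.
CREATE EXTERNAL TABLE dbo.ExtSales
(
    SaleId int,
    Amount decimal(10,2)
)
WITH (LOCATION = '/sales/', DATA_SOURCE = AzureBlob, FILE_FORMAT = CsvFormat);

SELECT COUNT(*) FROM dbo.ExtSales;
```

Statistics created on the external table and predicate pushdown, both mentioned in the summary, are what keep such queries from scanning every file remotely.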
Antonios Chatzipavlis presented on SQL Server backup and restore. The presentation covered database architecture basics including data files, transaction log files, and the buffer cache. It also discussed backup types like full, differential, transaction log, copy only and partial backups. Backup strategies and restore processes were explained, including restoring to a point in time and restoring system databases. The internals of how SQL Server performs backups using buffers and I/O threads was also summarized.
Antonios Chatzipavlis presented on migrating SQL workloads to Azure. He discussed modernizing data platforms by discovering, assessing, planning, transforming, optimizing, testing and remediating. Key migration considerations include remaining, rehosting, refactoring, rearchitecting, rebuilding or replacing workloads. Tools for migrating data include Microsoft Assessment and Planning Toolkit, Data Migration Assistant, Database Experimentation Assistant, SQL Server Migration Assistant, and Azure Database Migration Service. Workloads can be migrated to Azure VMs, Azure SQL Databases or Azure SQL Managed Instances.
This document summarizes a webinar presentation about workload management in SQL Server 2019. It discusses how SQL Server's Resource Governor feature can be used to provide multitenancy, predictable performance, and isolation for multiple workloads running on a single SQL Server instance. Key concepts covered include resource pools, workload groups, and classification functions to assign sessions to different pools and groups. The presentation also reviews best practices for using lookup tables in classification functions and shows some DMVs for monitoring Resource Governor configuration and statistics.
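The pool/group/classifier chain described above might be sketched like this; the names and the APP_NAME-based routing rule are illustrative, and the classifier function must live in master:

```sql
-- Cap reporting workloads at 30% CPU; OLTP sessions stay in the default pool.
CREATE RESOURCE POOL ReportPool WITH (MAX_CPU_PERCENT = 30);
CREATE WORKLOAD GROUP ReportGroup USING ReportPool;
GO

-- Classifier: runs at login and returns the workload group name.
CREATE FUNCTION dbo.fn_classifier() RETURNS sysname
WITH SCHEMABINDING
AS
BEGIN
    RETURN (CASE WHEN APP_NAME() = N'ReportApp'
                 THEN N'ReportGroup' ELSE N'default' END);
END;
GO

ALTER RESOURCE GOVERNOR WITH (CLASSIFIER_FUNCTION = dbo.fn_classifier);
ALTER RESOURCE GOVERNOR RECONFIGURE;
```

Because the classifier runs on every login, keeping it fast (and any lookup tables it reads small) is exactly the best-practice concern the presentation raises.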
This document provides an overview of loading data into Azure SQL DW (Synapse Analytics). It discusses extracting source data into text files, landing the data into Azure Data Lake Store Gen2, preparing the data for loading into staging tables using PolyBase or COPY commands, transforming the data, and inserting it into production tables. It also compares ETL vs ELT approaches and SSIS vs Azure Data Factory for data integration. The presenter then demonstrates loading data in Synapse SQL pool and invites any questions.
The document provides an overview of the DAX language. It discusses that DAX is the programming language used in Power BI, Power Pivot, and Analysis Services for data modeling, reporting, and analytics. It describes the basic components of a DAX data model including tables, columns, relationships, measures, and hierarchies. It also covers DAX syntax, functions, operators, and how context and filter context work in DAX calculations and queries.
The document introduces Diagnostic Management Views (DMVs) and Dynamic Management Functions (DMFs) in SQL Server. It discusses that DMVs and DMFs return server state information and can be used to monitor server health, diagnose problems, and tune performance. It provides examples of common DMVs and DMFs used for query execution and the query plan cache. Finally, it notes that the presentation will demonstrate troubleshooting with DMVs and DMFs.
This document summarizes common T-SQL anti-patterns that can negatively impact query performance, including using SELECT *, functions in predicates, OR operators, implicit conversions, unnecessary sorts, correlated subqueries, and dynamic SQL execution. The presentation provides explanations of why each anti-pattern hurts performance and recommendations for more optimized alternatives such as using indexes, temporary tables, parameterization, and execution plan analysis.
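One of the anti-patterns above, a function applied to a predicate column, and its sargable rewrite (table and column names are hypothetical):

```sql
-- Anti-pattern: wrapping the column in a function forces a scan,
-- because the optimizer cannot seek an index on YEAR(OrderDate).
SELECT OrderId FROM dbo.Orders WHERE YEAR(OrderDate) = 2016;

-- Rewrite as a range predicate on the bare column so an index
-- on OrderDate can be used for a seek.
SELECT OrderId
FROM dbo.Orders
WHERE OrderDate >= '20160101' AND OrderDate < '20170101';
```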
This document discusses designing a modern data warehouse in Azure. It provides an overview of traditional vs. self-service data warehouses and their limitations. It also outlines challenges with current data warehouses around timeliness, flexibility, quality and findability. The document then discusses why organizations need a modern data warehouse based on criteria like customer experience, quality assurance and operational efficiency. It covers various approaches to ingesting, storing, preparing and modeling data in Azure. Finally, it discusses architectures like the lambda architecture and common data models.
Modernizing Your Database with SQL Server 2019 discusses SQL Server 2019 features that can help modernize a database, including:
- The Hybrid Buffer Pool which supports persistent memory to improve performance on read-heavy workloads.
- Memory-Optimized TempDB Metadata which stores TempDB metadata in memory-optimized tables to avoid certain blocking issues.
- Intelligent Query Processing features like Adaptive Query Processing, Batch Mode processing on rowstores, and Scalar UDF Inlining which improve query performance.
- Approximate Count Distinct, a new function that provides an estimated count of distinct values in a column faster than a precise count.
- Lightweight profiling, enabled by default, which provides query plan and runtime statistics with low overhead.
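Two of the features above can be sketched in T-SQL (the table and column names are hypothetical):

```sql
-- Memory-optimized tempdb metadata: an instance-level setting that
-- requires a service restart to take effect.
ALTER SERVER CONFIGURATION SET MEMORY_OPTIMIZED TEMPDB_METADATA = ON;

-- Approximate distinct count: trades a small error margin for much
-- lower memory use and faster runtime on very large sets.
SELECT APPROX_COUNT_DISTINCT(CustomerId) AS ApproxCustomers
FROM dbo.Orders;
```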
This document discusses designing a modern data warehouse in Azure. It provides an overview of traditional vs. self-service data warehouses and their limitations. It also outlines challenges with current data warehouses around timeliness, flexibility, quality and findability. The document then discusses why organizations need a modern data warehouse based on criteria like customer experience, quality assurance and operational efficiency. It covers various approaches to ingesting, storing, preparing, modeling and serving data on Azure. Finally, it discusses architectures like the lambda architecture and common data models.
The document provides details about an SQL expert's background and certifications. It summarizes the expert's career starting in 1982 working with computers and 1988 starting in the computer industry. In 1996, they started working with SQL Server 6.0 and have since earned multiple Microsoft certifications. The expert now provides training and consultation services, and created an online school called SQL School Greece to teach SQL Server.
The document provides biographical information about Antonios Chatzipavlis, a SQL Server expert and evangelist. It then summarizes his presentation on statistics and index internals in SQL Server, which covers topics like cardinality estimation, inspecting and updating statistics, index structure and types, and identifying missing indexes. The presentation includes demonstrations of analyzing cardinality estimation and picking the right index key.
This document provides an introduction and overview of Azure Data Lake. It describes Azure Data Lake as a single store of all data ranging from raw to processed that can be used for reporting, analytics and machine learning. It discusses key Azure Data Lake components like Data Lake Store, Data Lake Analytics, HDInsight and the U-SQL language. It compares Data Lakes to data warehouses and explains how Azure Data Lake Store, Analytics and U-SQL process and transform data at scale.
This document provides an overview of Azure SQL Data Warehouse. It discusses what Azure SQL Data Warehouse is, how it is provisioned and scaled, best practices for designing tables in Azure SQL DW including distribution keys and data types, and methods for loading and querying data including PolyBase and labeling queries for monitoring. The presentation also covers tuning aspects like statistics, indexing, and resource classes.
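The distribution-key and query-labeling points above might look like this in a dedicated SQL pool (the table names are illustrative):

```sql
-- Hash-distribute a large fact table on its join key to avoid data
-- movement when joining; replicate a small dimension to every node.
CREATE TABLE dbo.FactSales
(
    CustomerKey int   NOT NULL,
    SaleAmount  money NOT NULL
)
WITH (DISTRIBUTION = HASH(CustomerKey),
      CLUSTERED COLUMNSTORE INDEX);

CREATE TABLE dbo.DimRegion
(
    RegionKey  int          NOT NULL,
    RegionName nvarchar(50) NOT NULL
)
WITH (DISTRIBUTION = REPLICATE);

-- Label a query so it is easy to find later in the monitoring DMVs.
SELECT COUNT(*) FROM dbo.FactSales
OPTION (LABEL = 'daily-load-check');
```

Choosing a distribution key that is frequently joined on and evenly distributed is the main design decision the presentation highlights.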
This document provides an introduction and overview of Azure DocumentDB. It discusses how DocumentDB is a fully managed NoSQL database service that provides fast and predictable performance for JSON data through SQL querying capabilities. It also describes how DocumentDB offers features like elastic scaling, high availability, global distribution and ease of development. The document then provides information on starting with DocumentDB, writing queries, and programming capabilities within DocumentDB like stored procedures and triggers.
This document provides an overview of auditing data access in SQL Server. It discusses various methods for auditing such as using common criteria, SQL Trace, DML triggers, temporal tables, and implementing SQL Server Audit. SQL Server Audit is described as the primary auditing tool in SQL Server that can track both server and database level events. Considerations for implementing and managing SQL Server Audit are also covered.
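A minimal SQL Server Audit sketch along the lines described above; the audit name, file path, and audited table are hypothetical:

```sql
-- Server-level audit object writing events to files on disk.
CREATE SERVER AUDIT GdprAudit
TO FILE (FILEPATH = N'C:\SqlAudit\');
ALTER SERVER AUDIT GdprAudit WITH (STATE = ON);
GO

-- Database-level specification: record every SELECT against a
-- sensitive table, regardless of who runs it.
CREATE DATABASE AUDIT SPECIFICATION PersonalDataAccess
FOR SERVER AUDIT GdprAudit
ADD (SELECT ON OBJECT::dbo.Customers BY public)
WITH (STATE = ON);
```

The captured events can then be read back with the built-in fn_get_audit_file function for review or archiving.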
SQL Server 2016 introduces new features for business intelligence and reporting. PolyBase allows querying data across SQL Server and Hadoop using T-SQL. Integration Services has improved support for AlwaysOn availability groups and incremental package deployment. Reporting Services adds HTML5 rendering, PowerPoint export, and the ability to pin report items to Power BI dashboards. Mobile Report Publisher enables developing and publishing mobile reports.
Introduction to Data Science
1.1 What is Data Science? Importance of data science
1.2 Big data and data science: the current scenario
1.3 Industry perspective. Types of data: structured vs. unstructured data
1.4 Quantitative vs. categorical data
1.5 Big data vs. little data; the data science process
1.6 Role of the data scientist
Big Data and Analytics Shaping the Future of Payments - RuchiRathor2
The payments industry is experiencing a data-driven revolution powered by big data and analytics.
Here's a glimpse into 5 ways this dynamic duo is transforming how we pay.
In essence, big data and analytics are playing a pivotal role in building a future filled with faster, more secure, and convenient payment methods for everyone.
DESIGN AND DEVELOPMENT OF AUTO OXYGEN CONCENTRATOR WITH SOS ALERT FOR HIKING ... - JeevanKp7
Long-term oxygen therapy (LTOT) and novel techniques of evaluating treatment efficacy have enhanced the quality of life and decreased healthcare expenses for COPD patients.
Since the cost of a pulmonary blood gas test is comparable to the cost of two days of oxygen therapy, and the cost of a hospital stay is equivalent to the cost of one month of oxygen therapy, long-term oxygen therapy (LTOT) is a cost-effective technique for treating this disease.
A small number of clinical investigations on LTOT have shown that it improves the quality of life of COPD patients by reducing the loss of their respiratory capacity. A study of 8487 Danish patients found that LTOT for 15 to 24 hours per day extended life expectancy from 1.07 to 1.40 years.
Combined supervised and unsupervised neural networks for pulse shape discrimi... - Samuel Jackson
Our methodology for pulse shape discrimination is split into two steps. Firstly, we learn a model to discriminate between pulses using "clean" low-rate examples by removing pile-up and saturated events. In addition to traditional tail-sum discrimination, we investigate three different approaches to discriminating between γ-pulses, fast neutrons, and thermal neutrons. We consider clustering the pulses directly using Gaussian Mixture Modelling (GMM), using variational autoencoders to learn a representation of the pulses and then clustering the learned representation (VAE+GMM), and using density ratio estimation to discriminate between a mixed (γ + neutron) and pure (γ only) source with a multi-layer perceptron (MLP), framed as a supervised learning problem.
Secondly, we aim to classify and recover pile-up events in the < 150 ns regime by training a single unified multi-label MLP. To frame the problem as a multi-label supervised learning method, we first simulate pile-up events with known components. Then, using the simulated data and combining it with single event data, we train a final multi-label MLP to output a binary code indicating both how many and which type of events are present within an event window.
Towards an Analysis-Ready, Cloud-Optimised service for FAIR fusion dataSamuel Jackson
We present our work to improve data accessibility and performance for data-intensive tasks within the fusion research community. Our primary goal is to develop services that facilitate efficient access for data-intensive applications while ensuring compliance with FAIR principles [1], as well as adoption of interoperable tools, methods and standards.
The major outcome of our work is the successful creation and deployment of a data service for the MAST (Mega Ampere Spherical Tokamak) experiment [2], leading to substantial enhancements in data discoverability, accessibility, and overall data retrieval performance, particularly in scenarios involving large-scale data access. Our work follows the principles of Analysis-Ready, Cloud Optimised (ARCO) data [3] by using cloud optimised data formats for fusion data.
Our system consists of a query-able metadata catalogue, complemented with an object storage system for publicly serving data from the MAST experiment. We will show how our solution integrates with the Pandata stack [4] to enable data analysis and processing at scales that would have previously been intractable, paving the way for data-intensive workflows running routinely with minimal pre-processing on the part of the researcher. By using a cloud-optimised file format such as zarr [5] we can enable interactive data analysis and visualisation while avoiding large data transfers. Our solution integrates with common python data analysis libraries for large, complex scientific data such as xarray [6] for complex data structures and dask [7] for parallel computation and lazily working with larger that memory datasets.
The incorporation of these technologies is vital for advancing simulation, design, and enabling emerging technologies like machine learning and foundation models, all of which rely on efficient access to extensive repositories of high-quality data. Relying on the FAIR guiding principles for data stewardship not only enhances data findability, accessibility, and reusability, but also fosters international cooperation on the interoperability of data and tools, driving fusion research into new realms and ensuring its relevance in an era characterised by advanced technologies in data science.
[1] Wilkinson, M., Dumontier, M., Aalbersberg, I. et al. The FAIR Guiding Principles for scientific data management and stewardship. Sci Data 3, 160018 (2016) https://doi.org/10.1038/sdata.2016.18
[2] M Cox, The Mega Amp Spherical Tokamak, Fusion Engineering and Design, Volume 46, Issues 2–4, 1999, Pages 397-404, ISSN 0920-3796, https://doi.org/10.1016/S0920-3796(99)00031-9
[3] Stern, Charles, et al. "Pangeo forge: crowdsourcing analysis-ready, cloud optimized data production." Frontiers in Climate 3 (2022): 782909.
[4] Bednar, James A., and Martin Durant. "The Pandata Scalable Open-Source Analysis Stack." (2023).
[5] Alistair Miles (2024) ‘zarr-developers/zarr-python: v2.17.1’. Zenodo. doi: 10.5281/zenodo.10790679
[6] Hoyer, S. & Hamman, J., (20
The Rise of Python in Finance,Automating Trading Strategies: _.pdfRiya Sen
In the dynamic realm of finance, where every second counts, the integration of technology has become indispensable. Aspiring traders and seasoned investors alike are turning to coding as a powerful tool to unlock new avenues of financial success. In this blog, we delve into the world of Python live trading strategies, exploring how coding can be the key to navigating the complexities of the market and securing your path to prosperity.
4. Presenter Info
1982 I started working with computers
1988 I started my professional career in computers industry
1996 I started working with SQL Server 6.0
1998 I earned my first Microsoft certification as a
Microsoft Certified Solution Developer (3rd in Greece)
1999 I started my career as a Microsoft Certified Trainer (MCT) with
more than 30,000 hours of training to date!
2010 I became a Microsoft MVP on Data Platform for the first time
I created the SQL School Greece www.sqlschool.gr
2012 I became MCT Regional Lead by Microsoft Learning Program.
2013 I was certified as MCSE : Data Platform
I was certified as MCSE : Business Intelligence
2016 I was certified as MCSE: Data Management & Analytics
Antonios
Chatzipavlis
SQL Server Expert
SQL Server Evangelist
Data Platform MVP
MCT, MCSE, MCITP, MCPD, MCSD, MCDBA,
MCSA, MCTS, MCAD, MCP, OCA, ITIL-F
5. SQLschool.gr
A source of information about Microsoft SQL Server for Greek IT
Professionals, DBAs, Developers, and Information Workers, as well as
hobbyists who simply like SQL Server.
Help line: help@sqlschool.gr
• Articles about SQL Server
• SQL Server News
• SQL Nights
• Webcasts
• Downloads
• Resources
What we are doing here / Follow us on social media
fb/sqlschoolgr
fb/groups/sqlschool
@antoniosch
@sqlschool
yt/c/SqlschoolGr
SQL School Greece group
SELECT KNOWLEDGE FROM SQL SERVER
6. ▪ Sign up for a free membership today at sqlpass.org.
▪ Linked In: http://www.sqlpass.org/linkedin
▪ Facebook: http://www.sqlpass.org/facebook
▪ Twitter: @SQLPASS
▪ PASS: http://www.sqlpass.org
8. PRESENTATION CONTENT
• Introduction
• Data protection in Microsoft SQL family to help address
GDPR requirements
• Built-in Microsoft SQL technologies that can help address
GDPR compliance
• Summary
10. • In this age of digital transformation, protecting privacy and enhancing security
have become top of mind.
• The upcoming EU General Data Protection Regulation (GDPR) sets a new bar for
privacy rights, security, and compliance.
• The GDPR mandates many requirements and obligations on organizations across
the globe.
• Complying with this regulation will necessitate significant investments in data
handling and data protection for a very large number of organizations.
• Microsoft SQL customers who are subject to the GDPR, whether managing cloud-
based or on-premises databases or both, will need to ensure that qualifying data in
their database systems is aptly handled and protected according to GDPR
principles.
ABSTRACT
11. • On May 25, 2018, a European privacy law is due to take effect that sets a new
global bar for privacy rights, security, and compliance.
- The General Data Protection Regulation, or GDPR, is fundamentally about protecting
and enabling the privacy rights of individuals.
• The GDPR establishes
- strict global privacy requirements
- governing how personal data is managed and protected,
- while respecting individual choice.
THE GDPR AND ITS IMPLICATIONS
12. • To achieve its objectives, the GDPR introduces several specific requirements related
to the rights of individuals, such as
- the right to access their personal data,
- correct inaccuracies,
- erase data,
- object to processing of their data,
- and the right to obtain a copy of their data.
• The GDPR also seeks to ensure
- personal data is protected no matter where it is sent,
- processed,
- or stored.
THE GDPR AND ITS IMPLICATIONS
13. OBLIGATIONS SPECIFICALLY INTRODUCED BY THE GDPR
Article 25 - Data protection by design and default
• Control exposure to personal data.
• Control who is accessing data and how.
• Minimize data being processed in terms of amount of data collected, extent of processing, storage period, and accessibility.
• Include safeguards for control management integrated into processing.

Article 30 - Records of processing activities
• Log and monitor operations.
• Maintain an audit record of processing activities on personal data.
• Monitor access to processing systems.

Article 32 - Security of processing
• Employ security mechanisms to protect personal data, including pseudonymization and encryption.
• Restore availability and access in the event of an incident.
• Provide a process for regularly testing and assessing the effectiveness of security measures.

Article 33 - Notification of a personal data breach to the supervisory authority
• Detect breaches and notify of them in a timely manner (72 hours).
• Assess the impact on, and identify, the personal data records concerned.
• Describe measures to address the breach.

Article 35 - Data protection impact assessment
• Document risks and security measures.
• Describe processing operations, including their necessity and proportionality.
• Assess risks associated with processing.
• Apply measures to address risks and protect personal data, and demonstrate compliance with the GDPR.
14. PERSONAL DATA IN SCOPE OF THE REGULATION
Personal data in scope of the regulation can include, but is not limited to, the following:
• Name
• Identification number
• Email address
• Online user identifier
• Social media posts
• Physical information
• Physiological information
• Genetic information
• Medical information
• Location
• Bank details
• IP address
• Cookies
16. Microsoft SQL offers many built-in security capabilities that can help
reduce risks to data and improve the protection and manageability of
data at the database level and beyond. This includes:
• Microsoft SQL Server
- whether on-premises or hosted in a public cloud platform
• Microsoft Analytics Platform System
• Azure SQL Database
• Azure SQL Data Warehouse
MICROSOFT DATA PLATFORM AS A HUB OF PRIVATE DATA AND SENSITIVE INFORMATION
17. • Discover
- Identify what personal data is being managed
and where it resides.
• Manage
- Govern how personal data is used and
accessed.
• Protect
- Establish security controls to prevent, detect,
and respond to vulnerabilities and data
breaches.
• Report
- Keep required documentation, manage data
requests, and provide breach notifications.
FOUR KEY STEPS TO ACHIEVE GDPR COMPLIANCE
19. • Metadata Queries
- such as analyzing all column names via querying sys.columns, to identify column names which
potentially contain personal data such as “Name”, “Birthdate”, “ID number”, etc.
• Advanced Discovery
- it is possible to use Full-Text Search in Microsoft SQL to search for keywords located within
freeform text.
- additionally, sensitive data can be tagged using Extended Properties to add sensitivity labels to
relevant columns.
• Identify and understand the current data access policies
• Disable all features that are not in use to reduce the attack surface area
• Disable network protocols that are not in use
• Turn off the SQL Server browser service
• Uninstall sample databases.
DISCOVERING AND CLASSIFYING PERSONAL DATA AND ITS ACCESS VECTORS
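The metadata-query and tagging approaches above can be sketched in T-SQL. The column-name patterns, schema, and label values below are illustrative examples, not an exhaustive or prescribed list:

```sql
-- Find columns whose names suggest personal data (patterns are examples only)
SELECT s.name AS schema_name, t.name AS table_name, c.name AS column_name
FROM sys.columns AS c
JOIN sys.tables  AS t ON c.object_id = t.object_id
JOIN sys.schemas AS s ON t.schema_id = s.schema_id
WHERE c.name LIKE '%name%'
   OR c.name LIKE '%birth%'
   OR c.name LIKE '%email%';

-- Tag a column with a sensitivity label using an extended property
-- (table and label names are hypothetical)
EXEC sp_addextendedproperty
    @name = N'sensitivity', @value = N'PII',
    @level0type = N'SCHEMA', @level0name = N'dbo',
    @level1type = N'TABLE',  @level1name = N'Customers',
    @level2type = N'COLUMN', @level2name = N'Email';
```

Extended properties tagged this way can later be queried via sys.extended_properties to inventory labeled columns.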
20. MANAGING ACCESS AND CONTROLLING HOW DATA IS USED
AND ACCESSED
• Authentication in SQL Server
• Authorization
• Azure SQL Database Firewall
• Authentication in Azure SQL Database using Azure Active Directory
• Dynamic Data Masking
• Row-Level Security
21. SQL Server supports two
authentication modes:
- Windows authentication mode
- Mixed mode
Ensure that only authorized users
with valid credentials can access
the database server.
AUTHENTICATION IN SQL SERVER
Why is this important for the GDPR?
The GDPR talks about ensuring the
security of personal data, “including for
preventing unauthorized access to or use
of personal data and the equipment used
for the processing.”
GDPR Recital 39
Recommendation: Use Windows authentication
• Enables centralized management of SQL Server principals via Active Directory.
• Uses the Kerberos security protocol to authenticate users.
• Supports built-in password policy enforcement, including complexity validation for strong passwords, password expiration, and account lockout.

Recommendation: Use separate accounts to authenticate users and applications
• Enables limiting the permissions granted to users and applications.
• Reduces the risks of malicious activity such as SQL injection attacks.

Recommendation: Use contained database users
• Isolates the user or application account to a single database.
• Improves performance, as contained database users authenticate directly to the database without an extra network hop to the master database.
• Supported by SQL Server, Azure SQL Database, and Azure SQL Data Warehouse.
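The contained-database-user recommendation can be sketched as follows; the user name and password are placeholders, and on SQL Server (unlike Azure SQL Database) the database must first be set to partial containment:

```sql
-- SQL Server only: enable containment first (not needed on Azure SQL Database)
-- ALTER DATABASE SalesDb SET CONTAINMENT = PARTIAL;

-- A contained database user authenticates at the database, not the server,
-- so no server-level login is created
CREATE USER AppReportingUser WITH PASSWORD = 'str0ng-P@ssw0rd!';
```

Because the identity lives entirely inside the database, it moves with the database during failover or restore, which is one reason contained users are recommended for Azure SQL Database.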
22. • Define a proper authorization policy
• Database role memberships and object-level
permissions
• Implement a proper separation of duties model
• Create accounts/roles for high-privileged
operations
• Create account/roles for applications or users to
perform day-to-day tasks.
• Access policies should abide by the principle of
least privilege
• Microsoft SQL-based technologies provide
mechanisms to define granular object-level
permissions, and simplify the process by
implementing role-based security
• Granting permissions to roles rather than users
simplifies security administration
AUTHORIZATION
Why is this important for the GDPR?
The GDPR specifically addresses the need
for mechanisms which limit access to
data, requiring “measures [that] shall
ensure that by default personal data are
not made accessible without the
individual's intervention to an indefinite
number of natural persons.”
GDPR Article 25(2)—“Data protection
by design and by default.”
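Granting permissions to roles rather than users can be sketched as below; the role, table, and user names are illustrative:

```sql
-- Create a role, grant it least-privilege access, then add members to it
CREATE ROLE ClaimsReaders;
GRANT SELECT ON dbo.Claims TO ClaimsReaders;      -- read-only, no write access
ALTER ROLE ClaimsReaders ADD MEMBER AppReportingUser;
```

When a person changes jobs, membership is simply moved between roles; the object-level permissions themselves never need to be re-granted.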
23. • A firewall is automatically set up to help protect the data
• The firewall initially prevents all access to the database server until explicit access permissions are specified,
based on the originating IP address of each request
• By default, there is an exception to allow all Azure services to connect to the database, to ensure full
functionality across Azure services (However, there is an option to disable this exception in the firewall settings)
AZURE SQL DATABASE FIREWALL
Guidelines to properly configure firewall settings:
• In the “allowed IP addresses” entry field, enter the IP
address range from which trusted users and
services will connect.
• Avoid using broad IP ranges as this defeats much of
the protection provided by firewall rules.
• Follow the least privilege principle by restricting
firewall settings for the IP address range to trusted IP
addresses.
Why is this important for the GDPR?
The GDPR specifically addresses the need for
mechanisms which limit access to data,
requiring “measures [that] shall ensure that by
default personal data are not made accessible
without the individual's intervention to an
indefinite number of natural persons.”
GDPR Article 25(2)—“Data protection by
design and by default.”
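A server-level firewall rule can also be created in T-SQL, as a sketch; the rule name and IP range below are placeholders (the range is from a reserved documentation block):

```sql
-- Run in the master database of the Azure SQL logical server;
-- keep the range as narrow as possible (least privilege)
EXECUTE sp_set_firewall_rule
    @name = N'OfficeNetwork',
    @start_ip_address = '203.0.113.10',
    @end_ip_address   = '203.0.113.20';
```

For tighter scoping, sp_set_database_firewall_rule creates the equivalent rule at the level of a single database rather than the whole server.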
Benefits of using Azure AD authentication:
• Reduces the proliferation of user identities across database servers.
• Allows password rotation in a single place.
• Enables managing database permissions using external (Azure AD) groups.
• Enables integrated Windows authentication and other forms of authentication
supported by Azure Active Directory.
• Azure AD authentication uses contained database users to authenticate identities
at the database level (considered more secure).
• Supports token-based authentication for applications connecting to SQL Database.
• Supports Active Directory Federation Services (ADFS) or native user/password
authentication for a local Azure Active Directory without domain synchronization.
AUTHENTICATION IN AZURE SQL DATABASE USING AZURE ACTIVE DIRECTORY
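Mapping an Azure AD identity to a contained database user can be sketched as below; the account name is a placeholder, and the statement must be run while connected as an Azure AD administrator of the server:

```sql
-- Create a contained user backed by an Azure AD user or group,
-- then grant access through a built-in database role
CREATE USER [dataops@contoso.com] FROM EXTERNAL PROVIDER;
ALTER ROLE db_datareader ADD MEMBER [dataops@contoso.com];
```

Using an Azure AD group name instead of an individual account lets group membership changes in the directory drive database access without any further T-SQL.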
25. • It’s a built-in feature
• Supported starting with SQL Server 2016 (all editions) and in Azure
SQL Database.
• DDM limits sensitive data exposure by masking the data to non-
privileged users or applications
• DDM allows the DBA
- to select and mask a particular table column that contains sensitive data,
- to designate which DB users are privileged and should have access to the
real data.
• Once configured, any query on that table column will return masked
results, except for queries run by privileged users.
• DDM works by masking the data on the fly with minimal impact on the
application layer.
• The data in the database remains intact and can be accessed by users
with appropriate privileges.
• Applying DDM does not explicitly require any application changes, so it
is easy to use with existing applications.
DYNAMIC DATA MASKING
Why is this important for the GDPR?
The GDPR calls for implementing “appropriate
technical and organizational measures, such as
pseudonymization, which are designed to
implement data-protection principles, such as
data minimization, in an effective manner…”
GDPR Article 25(2)—“Data protection by
design and by default.”
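Configuring a mask can be sketched as follows; the table, column, masking function choice, and principal name are illustrative:

```sql
-- Mask the email column for non-privileged users
ALTER TABLE dbo.Customers
ALTER COLUMN Email ADD MASKED WITH (FUNCTION = 'email()');

-- Principals holding UNMASK see the real data; everyone else sees e.g. aXX@XXXX.com
GRANT UNMASK TO ComplianceAuditor;
```

Other built-in masking functions include default(), random() for numeric ranges, and partial() for custom prefix/suffix exposure.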
26. • It’s a built-in feature
• Supported starting with SQL Server 2016
(all editions) and in Azure SQL Database.
• Restricts access according to specific user
entitlements
• The access restriction logic lives in the
database tier, close to the data, rather than
in a separate application tier.
• The restrictions are applied every time that
data access is attempted from any tier.
• This makes the security system more reliable
and robust by reducing the system’s surface
area.
ROW-LEVEL SECURITY
Why is this important for the GDPR?
Users have access only to the data that is
pertinent to them, thus reducing the risk of
“unauthorized disclosure of, or access to
personal data.”
GDPR Article 32(2)—“Security of processing.”
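A minimal Row-Level Security setup consists of an inline predicate function plus a security policy binding it to a table; the table, column, and object names below are hypothetical:

```sql
-- Predicate: a row is visible only to the user named in its SalesRep column
CREATE FUNCTION dbo.fn_RowFilter(@SalesRep AS sysname)
RETURNS TABLE
WITH SCHEMABINDING
AS
RETURN SELECT 1 AS allowed WHERE @SalesRep = USER_NAME();
GO

-- Bind the predicate to the table; the filter applies to every query, from any tier
CREATE SECURITY POLICY dbo.SalesFilter
ADD FILTER PREDICATE dbo.fn_RowFilter(SalesRep) ON dbo.Orders
WITH (STATE = ON);
```

FILTER predicates silently restrict reads; BLOCK predicates can additionally prevent writes that would violate the entitlement rule.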
27. PROTECTING PERSONAL DATA AGAINST SECURITY THREATS
• Transport Layer Security
• Transparent Data Encryption
• Always Encrypted
• Auditing for Azure SQL Database
• SQL Threat Detection
• SQL Server Audit
• Business continuity—SQL Server Always On
• Business continuity in Azure SQL technologies
28. Protecting personal data against security threats is specifically declared as a core requirement of the GDPR.
The GDPR requires that organizations implement “Data protection by design and by default” GDPR Article 25.
GDPR Article 32(1)—“Security of processing”
States that an organization must implement appropriate technical and organizational measures to ensure a level of
security appropriate to the risk, including inter alia as appropriate:
• The pseudonymization and encryption of personal data.
• The ability to ensure the ongoing confidentiality, integrity, availability, and resilience of processing systems and
services.
• The ability to restore the availability and access to personal data in a timely manner in the event of a physical or
technical incident.
• A process for regularly testing, assessing, and evaluating the effectiveness of technical and organizational measures
for ensuring the security of the processing.
There is also a requirement of “notification of a personal data breach” within a specified window of 72 hours, where
feasible, after the organization becomes aware of it.
GDPR Article 33(1)—“Notification of a personal data breach to the supervisory authority.”
WHY IS THIS IMPORTANT FOR THE GDPR?
29. • It is a best practice to always use connections secured with
Transport Layer Security (TLS).
- This ensures that data is encrypted in transit to and from the database,
and reduces susceptibility to “man-in-the-middle” attacks.
- SQL Server and Azure SQL Database support TLS 1.2, which
is the recommended protocol for highly secure
communication.
• To enable encrypted connections in SQL Server:
- Provision a certificate on the server,
- Configure the server to accept encrypted connections,
- Configure the client to request encrypted connections.
- To enforce that all client connections must be encrypted, set the ForceEncryption flag
to Yes.
• For Azure SQL Database and Azure SQL Data Warehouse, all
connections require encryption at all times.
- To securely connect with a client, specify connection string parameters
(for ADO.NET driver) Encrypt=True and TrustServerCertificate=False.
TRANSPORT LAYER SECURITY
Why is this important for the GDPR?
GDPR talks about integrating the
“necessary safeguards into the
processing” (GDPR Article 25(1),
“Data protection by design and by
default”), and accounting for risks
presented by processing, “in
particular from accidental or
unlawful destruction, loss,
alteration, unauthorized disclosure
of, or access to personal data
transmitted, stored or otherwise
processed” (GDPR Article 32(2),
“Security of processing”).
Protecting data during
transmission is explicitly called out
in this context—to avoid possible
leakage and minimize these risks.
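Whether a given session is actually encrypted can be checked from the server side with a standard dynamic management view:

```sql
-- Verify that the current session's connection is using TLS;
-- encrypt_option reports TRUE for encrypted connections
SELECT encrypt_option
FROM sys.dm_exec_connections
WHERE session_id = @@SPID;
```

Running this from an application connection is a quick way to confirm that Encrypt=True (or the server-side ForceEncryption setting) is in effect.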
30. • TDE provides encryption of data at rest
• Is a built-in Microsoft SQL feature
• TDE addresses the scenario of protecting the data at the
physical storage layer.
• TDE performs real-time encryption and decryption of the
database, associated backups, and transaction log files without
requiring changes to the application
• TDE is often required by compliance regulations and various
industry guidelines as a core requirement, as it ensures the full
protection of data at the physical layer.
- Note that with TDE, data tables are not directly encrypted.
• TDE protects the physical data files and transaction log files.
- If these are moved to another server, for instance, they cannot be opened
and viewed on that server.
• TDE is straightforward to enable—and applies to the entire
database.
TRANSPARENT DATA ENCRYPTION
Why is this important for the GDPR?
This relates to the GDPR obligation
to take into account “risks that are
presented by processing, in
particular from accidental or
unlawful destruction, loss,
alteration, [or] unauthorized
disclosure of” that data. In this
case, the protection is at the level
of the physical device—and
prevents the risk of compromising
the storage itself, for example via
copying the physical data out to
another server.
GDPR Article 32(2)—“Security of
processing.”
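Enabling TDE on SQL Server follows a fixed key hierarchy: master key, certificate, then a database encryption key. The database, certificate name, and password below are placeholders:

```sql
-- One-time key setup in master
USE master;
CREATE MASTER KEY ENCRYPTION BY PASSWORD = 'str0ng-P@ssw0rd!';
CREATE CERTIFICATE TdeCert WITH SUBJECT = 'TDE protector certificate';
-- Back up TdeCert and its private key immediately: encrypted databases and
-- their backups cannot be restored elsewhere without it.

-- Encrypt the user database
USE SalesDb;
CREATE DATABASE ENCRYPTION KEY
    WITH ALGORITHM = AES_256
    ENCRYPTION BY SERVER CERTIFICATE TdeCert;
ALTER DATABASE SalesDb SET ENCRYPTION ON;
```

In Azure SQL Database the same protection is available as a per-database switch, with key management handled by the service by default.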
31. • Microsoft SQL databases (Azure SQL Database
and SQL Server 2016) offer an industry-first
security feature called Always Encrypted, which is
designed to protect highly sensitive data
• Always Encrypted allows customers to encrypt
sensitive data inside client applications and never
reveal the encryption keys to the database engine
(SQL Database or SQL Server)
• Always Encrypted provides a separation between
those who own the data (and can view it) and
those who manage the data (but should have no
access).
• An advantage of Always Encrypted is that it makes
encryption transparent to applications
• There are two encryption types supported:
deterministic encryption and randomized
encryption
ALWAYS ENCRYPTED
Why is this important for the GDPR?
The use of this powerful encryption feature which
better ensures that highly sensitive data is
encrypted on the server side, even in-memory, can
significantly help in meeting the GDPR requirements
around “Security of processing.” This particularly
applies to the requirement to “implement
appropriate technical and organizational measures
to ensure a level of security appropriate to the risk”
including, as appropriate, “the pseudonymization
and encryption of personal data”.
GDPR Article 32(2) – “Security of processing.”
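Declaring an Always Encrypted column can be sketched as below; the table, column, and column encryption key name are hypothetical, and the key itself must already exist (typically provisioned from SSMS, where the key material never touches the server):

```sql
-- SSN is encrypted client-side; the database engine only ever sees ciphertext.
-- Deterministic encryption (which permits equality lookups) requires a BIN2 collation.
CREATE TABLE dbo.Patients (
    PatientId INT IDENTITY PRIMARY KEY,
    SSN CHAR(11) COLLATE Latin1_General_BIN2
        ENCRYPTED WITH (
            COLUMN_ENCRYPTION_KEY = CEK_Auto1,
            ENCRYPTION_TYPE = DETERMINISTIC,
            ALGORITHM = 'AEAD_AES_256_CBC_HMAC_SHA_256')
);
```

Randomized encryption is stronger but rules out equality comparisons, which is the practical trade-off between the two supported encryption types.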
32. • Auditing for Azure SQL Database tracks database activities by
writing events to an audit log.
• It enables the customer to understand ongoing database activities,
as well as analyze and investigate historical activity to identify
potential threats or suspected abuse and security violations.
• Auditing can be enabled in the Auditing pane in the Azure portal
• A retention period can be enabled for audit logs according to
specific requirements
• The audit data can be analyzed to investigate specific issues using
- The fn_get_audit_file function,
- The Azure portal Audit records blade,
- SQL Server Management Studio (SSMS),
- Azure Storage Explorer,
- Microsoft Operations Management Suite (OMS) Log Analytics.
AUDITING FOR AZURE SQL DATABASE
Why is this important for the GDPR?
The GDPR, as part of the data
protection requirement, stipulates
a requirement that “Each
controller… shall maintain a record
of processing activities under its
responsibility.”
GDPR Article 30(1)—“Records of
processing activities.”
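Querying audit records with fn_get_audit_file can be sketched as follows; the storage account path is a placeholder for wherever the audit logs are written:

```sql
-- Read audit records directly from blob storage and inspect who ran what
SELECT event_time, server_principal_name, action_id, statement
FROM sys.fn_get_audit_file(
    'https://auditstore.blob.core.windows.net/sqldbauditlogs/',
    DEFAULT, DEFAULT)
ORDER BY event_time DESC;
```

Filtering this result set on statement text or principal name is a simple way to reconstruct the record of processing activities on specific personal-data tables.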
33. • SQL Threat Detection is a built-in feature
• Detects anomalous database activities indicating
potential security threats to the database.
• The essence of this service is to proactively notify
the Azure database administrator (or subscription
owners) of any suspicious activity that could
indicate a possible malicious intent to access,
breach, or exploit data in the database.
• Threat Detection operates by continuously
profiling and monitoring application behavior, and
employing machine learning and behavioral
analytics methodologies to detect anomalies and
unusual behavior
SQL THREAT DETECTION
Why is this important for the GDPR?
The GDPR has a clear requirement
regarding data breaches: “In the case
of a personal data breach, the
controller shall without undue delay
and, where feasible, not later than 72
hours after having become aware of
it, notify the personal data breach to
the supervisory authority.”
GDPR Article 33(1)—“Notification of
a personal data breach to the
supervisory authority.”
34. • SQL Server also contains an auditing capability
that enables tracking activities on an on-premises
database.
• SQL Server Audit enables the customer to
understand ongoing database activities, and
analyze and investigate historical activity to
identify potential threats or suspected abuse and
security violations.
• SQL Server Audit enables the creation of server
audits, which can contain
- server audit specifications for server level events,
- database audit specifications for database level events.
• Audited events can be written to the event logs or
to audit files.
- Granular control is available to specify exactly what events
to audit, aligning with specific needs.
SQL SERVER AUDIT
Why is this important for the GDPR?
The GDPR requires that “Each
controller … shall maintain a record of
processing activities under its
responsibility.”
GDPR Article 30(1)—“Records of
processing activities.”
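A server audit plus a database audit specification can be sketched as below; the audit name, file path, and audited table are illustrative:

```sql
-- Server audit writing events to files on disk
USE master;
CREATE SERVER AUDIT GdprAudit
    TO FILE (FILEPATH = 'D:\SqlAudit\');
ALTER SERVER AUDIT GdprAudit WITH (STATE = ON);

-- Database audit specification: record reads and changes on a personal-data table
USE SalesDb;
CREATE DATABASE AUDIT SPECIFICATION PersonalDataAccess
    FOR SERVER AUDIT GdprAudit
    ADD (SELECT, UPDATE, DELETE ON dbo.Customers BY public)
    WITH (STATE = ON);
```

The BY clause scopes auditing to specific principals; auditing BY public, as here, records the actions of every user.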
35. • One important element of protecting data
concerns ensuring its resiliency and
availability in the event of an adverse
incident
• Always On Availability Groups
- supports a failover environment for a discrete set
of user databases
- supports a set of read-write primary databases and
one to eight sets of corresponding secondary
databases
• Always On Failover Cluster Instances
BUSINESS CONTINUITY - SQL SERVER ALWAYS ON
Why is this important for the GDPR?
The GDPR explicitly refers to this
important aspect of protecting data,
by requiring that the organization
“implement appropriate technical and
organizational measures” that take
into consideration, as appropriate,
“the ability to restore the availability
and access to personal data in a
timely manner in the event of a
physical or technical incident”.
GDPR Article 32(1)—“Security of
processing.”
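A minimal availability group definition can be sketched as follows; the group, database, node names, and endpoint URLs are placeholders, and the Windows Server Failover Cluster and mirroring endpoints must already be configured:

```sql
-- Two-replica availability group with synchronous commit and automatic failover
CREATE AVAILABILITY GROUP SalesAG
FOR DATABASE SalesDb
REPLICA ON
    N'SQLNODE1' WITH (
        ENDPOINT_URL = N'TCP://sqlnode1.contoso.com:5022',
        AVAILABILITY_MODE = SYNCHRONOUS_COMMIT,
        FAILOVER_MODE = AUTOMATIC),
    N'SQLNODE2' WITH (
        ENDPOINT_URL = N'TCP://sqlnode2.contoso.com:5022',
        AVAILABILITY_MODE = SYNCHRONOUS_COMMIT,
        FAILOVER_MODE = AUTOMATIC);
```

Synchronous commit trades some write latency for zero data loss on failover; asynchronous replicas (up to eight secondaries) suit remote disaster-recovery sites.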
36. • The Azure SQL Database service performs automated database backups
periodically, and the service guarantees Point-in-Time Restore from these backups
for a certain scope of recovery.
• Long-term retention for backups is also available, by storing Azure SQL Database
backups in an Azure Recovery Services vault for up to ten years.
• Azure SQL Database also offers an Active Geo-Replication feature, which provides a
database-level recovery solution with low recovery time.
• Active Geo-Replication enables the configuration of up to four readable secondary
databases in the same or different regions.
• Beyond offering a complete business continuity and disaster recovery solution,
Active Geo-Replication also enables load-balancing using the secondary database,
and offloading read-only workloads.
• It also supports user-controlled failover and failback and configurable performance
levels for the secondary databases.
BUSINESS CONTINUITY IN AZURE SQL TECHNOLOGIES
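Configuring an Active Geo-Replication secondary can be sketched in T-SQL; the database and partner server names below are placeholders:

```sql
-- Run in the master database of the primary logical server:
-- adds a readable geo-secondary on another server
ALTER DATABASE SalesDb
    ADD SECONDARY ON SERVER [sales-server-west]
    WITH (ALLOW_CONNECTIONS = ALL);
```

ALLOW_CONNECTIONS = ALL makes the secondary readable, which is what enables the load-balancing and read-only offloading described above in addition to disaster recovery.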
38. • The final phase of the methodology for
GDPR compliance addresses the need to
maintain a record of personal data
processing activities under the controller’s
responsibility, and to make the record
available to the supervisory authority upon
request.
• This phase also deals with the continuous
process of reviewing the controls and
security of the system, to better ensure
ongoing compliance with GDPR principles.
• Microsoft SQL Auditing capabilities
• Temporal Tables
REPORTING ON DATA PROTECTION POLICIES AND REVIEWING REGULARLY
Why is this important for the GDPR?
The GDPR requires an assessment of
the impact of processing operations
on the protection of personal data
where such processing is likely to
result in high risk. This includes an
assessment on “the measures
envisaged to address the risks,
including safeguards, security
measures and mechanisms to ensure
the protection of personal data.”
GDPR Article 35(7)— “Data
protection impact assessment.”
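The Temporal Tables capability mentioned above keeps a full change history automatically; a sketch, with hypothetical table and column names:

```sql
-- System-versioned temporal table: every change is retained in a history table
CREATE TABLE dbo.Consent (
    SubjectId INT NOT NULL PRIMARY KEY,
    ConsentGiven BIT NOT NULL,
    ValidFrom DATETIME2 GENERATED ALWAYS AS ROW START NOT NULL,
    ValidTo   DATETIME2 GENERATED ALWAYS AS ROW END   NOT NULL,
    PERIOD FOR SYSTEM_TIME (ValidFrom, ValidTo)
)
WITH (SYSTEM_VERSIONING = ON (HISTORY_TABLE = dbo.ConsentHistory));

-- Reconstruct the state of consent records as of a given moment
SELECT * FROM dbo.Consent
FOR SYSTEM_TIME AS OF '2018-05-25T00:00:00';
```

Point-in-time queries like this make it possible to document exactly what personal data was held, and when, without any application-side change tracking.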
39. Operations Management Suite SQL Assessment.
This is primarily used for monitoring purposes.
It also performs a limited set of security checks to review the current security
status of the database environment.
REGULARLY REVIEW THE SECURITY STATE OF DATA AND SYSTEMS
41. Microsoft SQL Product Team
Helping discover, classify, and protect sensitive data.
Tracking personal or sensitive data throughout the database system and
beyond.
Assessing database vulnerabilities and overall security state.
Helping meet security best practices with controls and hardening
recommendations.
Organizations
Need to invest significantly to ensure the GDPR principles are effectively
implemented and sustained in their environments.
SUMMARY