We're #hiring a new SQL Architect in Plano, Texas. Apply today or share this post with your network.
ConnectedX Inc.’s Post
-
Hey Connections! We are hiring for the below positions, requiring 3 to 5 years of experience. Let me know if anyone is looking for a new opportunity, and send your resumes to spandana.r@biztronsoftech.com. SQL Developer: In-depth knowledge of SQL syntax, database concepts, and query optimization techniques is crucial for effective data retrieval and manipulation. Skills in optimizing query performance, indexing strategies, and database tuning help ensure fast and efficient data retrieval. Additionally, understanding how to design and implement database schemas, normalization techniques, and relational database concepts is vital for efficient data organization. #sqldeveloper #sql #sqlserver #sqldatabase #database #developer #mysql
-
Dive into the world of database magic with our latest blog! Discover the key trends in SQL development and learn how hiring an SQL developer can supercharge your projects. 🚀 Explore the future of data management! Also read: https://bit.ly/3M5ZzSD #Xcoder #Database #SQLSkills #SQLProgrammer #DatabaseManagement #TechBlog #CodingLife #DataScience #SQLServer #DataManagement #WebDevelopment #TechnologyTrends #SoftwareDevelopment #DatabaseDesign
-
Team Manager | IT & DevOps Enthusiast | Telecom IT Infrastructure Automation & Administration | CKA, ITIL, PMP, RHCE | Delivering Scalable Solutions
1. How many backup admins does it take to screw in a lightbulb? None; they'd just restore from a previous save point.
2. Why don't we let database administrators join our band? Because they always want to do a "join" on every "table"! 😄
3. What do you call a big data analyst who's lost their job? Unemployed, with a huge dataset of skills.
4. Why did the DBA cross the road? To get to the non-nullable side!
5. Why did the storage admin get promoted? Because they could finally explain the difference between a terabyte and a thunderbolt!
6. What's a manager's favorite motivational quote? "The buck stops... one desk over."
7. Why was the computer cold at the office? Because it left its Windows open!
8. Why did the software developer go broke? Because he used up all his cache!
#DebuggingLife #CodingHumor #CoffeeAndCode #ProgrammersBeLike #JustITThings #TechPuns #OfficeAntics #DataDrama #humor #IThumor #Jokes
-
We are hiring for the below position; please share your CV. Candidate must be local to McLean, VA. Experience: 3+ years minimum.
Job Description
Title: Snowflake Admin
Location: McLean, VA (3 days/week onsite)
Duration: 6-12+ months, with possible extension
Supplier Vetting Questions:
1) Please describe your most recent development experience with Snowflake administration. What was the project? What was your contribution? What tools did you use?
2) Please describe your most recent experience with SQL. What was the project? What was your contribution?
Required Skills:
- Experience with Snowflake administration.
- Able to identify the various aspects of compute and storage management in Snowflake.
- Illustrate administrative tasks within the detailed architecture of Snowflake.
- Employ recovery methods and agile development with Time Travel and Cloning.
- Describe the DDL operations that work with fundamental database objects.
- Use account-level replication for failover scenarios.
Email: nataraj@talnq.com
-
Hello LinkedIn Fam, I am looking for a Senior Database Administrator/Architect.
Duration: 6 months
Location: Remote (EST/CST)
Mode of interview: Video
What is driving the need? We need someone to help re-architect and redesign the way this database is organized, to eventually support a new reporting platform, and to optimize the database and queries for reporting functionality. This is critical to fix and implement.
Top skills required:
1. 8-12 years of experience
2. Reporting and performance tuning
3. PL/SQL, MySQL
4. AWS: how to scale and set up clusters
Client's formal job description:
Short term:
- Review the schemas, the database server, and the queries being executed; propose solutions for long-running SQL queries to improve performance.
- Build monthly report solutions for all equipment at the customer level.
- Design the database structure for planned projects such as File Management and Smart Tag.
- Optimize queries: ensure that queries are properly optimized, and that tables are indexed on columns frequently used in WHERE clauses, JOIN conditions, and ORDER BY clauses.
- Adjust database configuration parameters (such as memory allocation, cache sizes, and connection limits) to match the hardware resources and workload of the system.
- Add partitioning and sharding to distribute load if required; for large tables, consider partitioning based on criteria such as range or hash partitioning.
- Evaluate the database schema and consider normalizing or de-normalizing it based on the current KC requirements.
Long term:
- Reporting strategy: propose a reporting tool for automated reports and dashboards.
- Establish a reporting cadence that provides timely insights without overwhelming stakeholders, balancing frequency with the availability of relevant data and the need for real-time decision-making.
- Explore various reporting tools available in the market, considering factors such as functionality, ease of use, scalability, cost, integration with existing systems, and support for data visualization.
Please reach out to me at Alisha@kayconnect.net. Hangout: Alisharek18@gmail.com
Yashu Srivastava Himanshu Verma Hariom Narayan Vibhuti Baluni #database #architect #remote #sqldba
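The indexing advice in the post above (index the columns used in WHERE, JOIN, and ORDER BY clauses) can be sketched with a minimal, self-contained example. This is an illustration only: SQLite stands in for the client's MySQL/PL-SQL stack, and the `equipment`/`customer_id` names are invented, not the actual schema.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE equipment (id INTEGER PRIMARY KEY, customer_id INTEGER, status TEXT)")
conn.executemany(
    "INSERT INTO equipment (customer_id, status) VALUES (?, ?)",
    [(i % 100, "active" if i % 2 else "retired") for i in range(1000)],
)

# Without an index, filtering on customer_id forces a full table scan.
plan_before = conn.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM equipment WHERE customer_id = 42"
).fetchall()

# Index the column used in the WHERE clause, as the post recommends.
conn.execute("CREATE INDEX idx_equipment_customer ON equipment(customer_id)")
plan_after = conn.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM equipment WHERE customer_id = 42"
).fetchall()

# plan_before reports a table SCAN; plan_after searches via the new index.
print(plan_before[0][-1])
print(plan_after[0][-1])
```

The same idea carries over to MySQL with `EXPLAIN SELECT ...`; the point is to check the plan before and after adding the index rather than indexing blindly.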
-
Share your CV: 7017990795.
5 years of relevant experience. Familiarity with data lake, data migration, and data integration concepts. Knowledge of OLAP and OLTP systems.
Technical skills:
- Understand and translate business requirements into design.
- Proficient in AWS infrastructure components such as S3, IAM, VPC, EC2, and Redshift.
- Experience in creating ETL jobs using Python/PySpark.
- Proficiency in creating AWS Lambda functions for event-based jobs.
- Knowledge of automating ETL processes using AWS Step Functions.
- Competence in building data warehouses and loading data into them.
Responsibilities:
- Understand business requirements and translate them into design.
- Assess AWS infrastructure needs for development work.
- Develop ETL jobs using Python/PySpark to meet requirements.
- Implement AWS Lambda for event-based tasks.
- Automate ETL processes using AWS Step Functions.
- Build data warehouses and manage data loading.
- Engage with customers and stakeholders to articulate the benefits of proposed solutions and frameworks.
- Communicate project work and document accordingly.
- Act as a team player and provide technical assistance to the team and customers.
- Adhere to customer release management processes for release activities.
#dataengineer
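The extract/transform/load pattern this role centers on can be sketched locally without any AWS services. In this toy sketch, an inline CSV stands in for an S3 landing file, plain Python for a Glue/PySpark job, and SQLite for the Redshift warehouse; all table and column names are made up for illustration.

```python
import csv
import io
import sqlite3

# Extract: raw CSV as it might arrive in a landing bucket (inlined here).
raw = io.StringIO("order_id,amount\n1,10.50\n2,abc\n3,7.25\n")

# Transform: keep only rows whose amount parses as a number.
rows = []
for rec in csv.DictReader(raw):
    try:
        rows.append((int(rec["order_id"]), float(rec["amount"])))
    except ValueError:
        continue  # reject malformed records, as an ETL job's validation step would

# Load: write the clean rows into a warehouse table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (order_id INTEGER, amount REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?)", rows)
count, total = conn.execute("SELECT COUNT(*), SUM(amount) FROM orders").fetchone()
print(count, total)  # 2 valid rows loaded; the malformed row was rejected
```

In the real stack described above, the transform step would run in PySpark, Lambda would trigger it on new S3 objects, and Step Functions would sequence the stages, but the extract/validate/load shape is the same.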
-
Blending Warrior and Love Mindsets to Uncover Purpose, Direction, and Motivation for Your Happiest Life.
Opportunity with JSO
#hotjobalert #imhiring for a Database Developer! #makeadifference in #jacksonvillefl #jacksonvillejobs #databasedeveloper #itjobopportunity #applytoday https://lnkd.in/gW5XtPqG
-
Hello Connections!!! 🔍 Just completed an insightful project on Data Cleaning titled "World Layoffs - MySQL Tutorial" inspired by Alex the Analyst's YouTube channel! Here are some key takeaways from the journey:

🧹 **Essential Steps in Data Cleaning:**
- Importing raw data with careful consideration of data format.
- Copying data to a staging table for manipulation.

📅 **Dealing with Date Columns:**
- Imported data with date columns, ensuring proper formatting.
- Utilized functions to adjust date column formats.

🔍 **Identifying and Handling Duplicates:**
- Leveraged row numbers and partitioning to identify and manage duplicates.
- Explored strategies specific to MySQL for duplicate removal.

🛠️ **Creating and Managing Tables:**
- Created new tables and managed data types efficiently.

🔄 **Updating and Cleaning Data:**
- Adjusted settings for data updates and standardized labels for accurate analysis.

🌍 **Country-Specific Data Cleaning:**
- Applied specific cleaning techniques tailored to country-specific data.

⚙️ **Advanced Operations:**
- Explored converting data types, updating based on conditions, and joining tables.

🚮 **Removing Unwanted Data:**
- Removed unnecessary columns and rows to streamline data.

💡 **Final Thoughts:**
- Data cleaning is not just about tidying up; it's about discovering new insights.
- This project is a valuable addition to any portfolio due to its depth and quality.

Excited to apply these learnings to future projects! Thanks, Alex the Analyst, for the invaluable lessons. 🌟 #DataCleaning #MySQLTutorial #DataAnalysis #LearningJourney
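The duplicate-handling step described above (row numbers with partitioning) can be sketched in a few lines. This runs the same `ROW_NUMBER() OVER (PARTITION BY ...)` window function the MySQL 8 tutorial uses, here against SQLite for a self-contained demo; the tiny `layoffs` table is a made-up stand-in for the tutorial's dataset.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE layoffs (company TEXT, location TEXT, total_laid_off INTEGER)")
conn.executemany("INSERT INTO layoffs VALUES (?, ?, ?)", [
    ("Acme", "NYC", 100),
    ("Acme", "NYC", 100),   # exact duplicate of the row above
    ("Beta", "SF", 50),
])

# Number the rows within each group of identical values;
# any row with row_num > 1 is a duplicate and can be deleted.
dupes = conn.execute("""
    SELECT company, location, total_laid_off FROM (
        SELECT *,
               ROW_NUMBER() OVER (
                   PARTITION BY company, location, total_laid_off
               ) AS row_num
        FROM layoffs
    ) WHERE row_num > 1
""").fetchall()
print(dupes)  # only the second Acme row is flagged
```

Since MySQL does not allow deleting directly from a CTE, the tutorial's trick of copying flagged rows into a second staging table and deleting `row_num > 1` there follows naturally from this query.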
-
Position: Technical Systems Analyst (SQL Developer + Snowflake)
Location: Remote (PST time zone candidates only)
Visa: US Citizens/GC only
Duration: Long-term contract
Please share your resume at hr@ygtechllc.com
Key note: Very strong SQL and data management experience; Snowflake experience is required.
Job skills:
1. Advanced SQL proficiency: a strong command of SQL to query, manipulate, and join datasets efficiently. This skill is crucial for unifying the current contact data with third-party vendor data.
2. Data validation: the ability to clean and standardize data, identify duplicates, and resolve inconsistencies is crucial for maintaining data quality.
3. Problem-solving skills: strong problem-solving skills are necessary for optimizing processes and resolving technical challenges encountered during the project.
4. Data skills: investigate data problems and perform deep-dive analyses.
5. Ability to identify trends and interpret data, and to summarize findings in a simple yet consumable structure (via slides/tables).
Additional skills:
1. Database management: understanding of database management principles, including indexing, performance optimization, and data security, is essential for managing the unified dataset efficiently.
2. Data integration and ETL (Extract, Transform, Load): experience with ETL processes is necessary for merging and transforming datasets from various sources. The consultant should be capable of designing and implementing data integration pipelines to enrich existing contact data (most relevant for TAU expansion).
3. Python programming: proficiency in Python is essential for developing scripts and automation tools to streamline the data enrichment process.
4. Knowledge of regular expressions (regex) to identify and manipulate text strings such as job titles, names, and email domains is advantageous.
#systemanalyst #technicalsystemanalyst #SQL #snowflake #phoenix #USC #GC #python #TAUExpansion
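The regex skill called out above (pulling structure out of strings such as job titles, names, and email domains) can be sketched in a few lines of Python. The sample contact strings and patterns here are invented for illustration, not the client's actual data format.

```python
import re

contacts = [
    "Jane Doe <jane.doe@example.com>, Sr. SQL Developer",
    "John Roe <j.roe@data.example.org>, Technical Systems Analyst",
]

# Capture the domain part of each angle-bracketed email address.
domain_re = re.compile(r"@([\w.-]+)>")
domains = [domain_re.search(c).group(1) for c in contacts]

# Standardize a seniority abbreviation in the job-title field,
# the kind of cleanup a contact-enrichment pipeline would do.
titles = [re.sub(r"\bSr\.\s*", "Senior ", c.split(", ", 1)[1]) for c in contacts]

print(domains)  # ['example.com', 'data.example.org']
print(titles)   # ['Senior SQL Developer', 'Technical Systems Analyst']
```

In the role above, patterns like these would feed the dedupe and standardization passes over the unified Snowflake contact dataset.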