Arun Pattipaka
Austin, Texas Metropolitan Area
1K followers
500+ connections
Activity
-
What an amazing week! I attended Dominican University’s Faculty and Staff Recognition event. I was recognized for my 20 years of dedicated…
Liked by Arun Pattipaka
-
Had a great time at #NRF2024, showcasing SAP’s cutting-edge #BusinessAI strategy tailored for the retail industry through our immersive experiences…
Liked by Arun Pattipaka
Experience & Education
-
KeyBank
** **** *******/ **** ********
-
*********
** **** ********* **********/ **** ********/ *** **** ***
-
****** *********
** **** *******/ *** **** ** *********
-
********* **********
******'* ****** ******** *********** ******* *+
-
-
********** ***** ************* **********
********'* ****** ******** *******
Other similar profiles
- Suman P.
Director, Project Management
Dallas, TX
- suma podila
San Francisco Bay Area
- Panos Kazantzis
London
- Rob Cao
United Kingdom
- Srilatha Santhanagopalan
DBA at Coventry
Glen Allen, VA
- Supriya Chandra
Analyst at Xerox Services
Glen Allen, VA
- Alex Mattera
Senior Research Analyst
Richmond, VA
- Charles Lee
New York, NY
- Haribabu Mandadi
Software Engineer 2 at Microsoft
Hyderabad
- Frank Naylor
BPC Support Consultant at SAP
Jamison, PA
- terry cates
Founder
Amherst, NH
- Trishul Avinash Punyapu
Student at Texas A&M University
Liberty Hill, TX
- Mahadevan Rengasamy
Systems Director
Glen Allen, VA
- David Morrison PhD
Data Scientist
Glasgow
- Maciej Prostak
Warsaw
- Judith Victor CNP, NCICS
Owner at Proper Billing & More, LLC
Port St Lucie, FL
- John Adzovie
Akron, OH
- Anees U Rana
Newington, CT
- Noufal E.
Greater Seattle Area
- Rachel Zhao
Dallas-Fort Worth Metroplex
Explore more posts
-
Shalini S.
Hi, Business Systems Analyst (Remote). Experience: 9 years. No H1B, OPT, or CPT; GC candidates need to provide a passport number. Please share a suitable resume to john@vstconsulting.com.
Needs heavy Operations experience and a solid understanding of what operations is doing, in order to build out the process flows. The client is finding that when they integrate folks from the M&A space, their process is much different than Celera’s, and Celera’s process is broken. Responsibilities: identify pain points and friction to fix issues in the process; build out workflows (a technical background is required to build out workflows); build roadmaps; find out where the gaps are; ticketing.
-
Kanchan Meena
Hello #Everyone, we are #hiring for the roles below. If you know anyone who might be interested, please drop your resume to my email at Kanchan.m@globalitcon.com.
Role: Sr. Data Scientist
Location: St. Louis, MO
Experience: 10+ years
Skills: AWS SageMaker, Python (scikit-learn, pandas, NumPy, etc.), SQL, MLOps, dashboards/reports for business consumption, ad-hoc analysis
Job Type: both C2C & W2
Candidates must hold a PhD.
#datascientist #data #DataScienceOpportunity #SeniorDataScientist #StLouisJobs #MLOps #DataAnalysis #BusinessIntelligence #DataInsights #Python #Sagemaker #Classification #Regression #RecommendationEngine #AdHocAnalysis #c2c #w2 #DataVisualization #DashboardBuilding #BusinessCommunication
-
Jitendra Dash
#Immediate Opening: Data Architect, Remote (USA) (any visa fine)
Hello Folks, hope you are doing well! Please go through the JD below and reply with a suitable resume.
Job Title: Data Architect
Location: New Jersey preferred, but remote
Duration: long-term contract on C2C
Job Description:
- Determine the scope of data migration.
- Document the data migration strategy.
- Document the Data Migration Protocol to go along with the data migration strategy.
- Produce the Data Migration Summary report based on the data migration strategy and protocol documents.
- Participate in finalizing the code migration strategy in conjunction with the data migration strategy.
- Coordinate moving of various transports from a WMS perspective.
- Support mapping and construction of data from legacy systems.
- Establish and support pre-load and post-load business signoff.
- Support internal and external quality-control and audit processes.
- Lead data defect triage and remediation; scope/identify/guide any remediation of incomplete or incorrect data in legacy systems.
- Validate all data once it is moved across the environments (DEV → PQA → QA → Prod).
- Cutover planning and mock-cutover coordination, plus fall-over and fall-out reconciliation, working closely with the Systems and Performance teams. Points to consider for cutover:
  - Inclusion of reversion back to the PROD original state after mock-cutover exercises for the first site
  - Plan for mock cutover for the 2nd site, as FDC will already be in PROD
- Post-go-live monitoring of all data-related activities and the whole cutover plan.
Other specific responsibilities:
1. Verification of data integrity between platforms: ERP, WMS, Serialization.
2. Coordination of delivery/creation of BY test data in each environment: DEV, PQA, QA, and Prod.
3. Oversight of the environment-to-environment data migration strategy (configuration, master, and transactions as applicable):
  a. Reconciliation of accelerator tool results for uplifts across platforms
  b. RCA for fill-out and follow-on corrective actions
  c. Updated approach review to avoid repeating history for each component load to higher environments
4. Data conversion strategy from lower to higher environments, including BY configuration preservation where applicable and legacy MARC data conversion:
  a. LPNs with serialized hierarchies
  b. Inventory (LPN) by location
  c. Orders by status (i. inbound, ii. outbound)
  d. Orders not only by status, including progress quantity details at header and line-item level
  e. Inventory by location with LPN details regarding status and committed (all of the sub-tabs)
  f. Inventory reconciliation with ERP and Serialization
  g. Serialization data
  h. Clinical data
Please share your resume to jiten@sharpitco.con
-
Sachin Singh
Hello Connections, one of our clients is hiring a Business Analyst; this is an onsite opportunity located in Phoenix/Scottsdale, AZ. Please find the job description below and share your updated resume at recruiter@omsyst.com. (Please send only local profiles.)
Job Title: Business Analyst
Work location: Phoenix, AZ
Job Description: Onsite work in Phoenix/Scottsdale, AZ is required. Relevant industry experience is HIGHLY DESIRED, specifically Manufacturing or Engineering-to-Order; prior experience with companies like Applied Materials, ASML, Lam Research, or Horiba would be the most relevant. The client is involved in the manufacturing of semiconductors, and the resource will be meeting with key stakeholders in that location.
The Business Analyst is expected to engage end users of ASM and create design documents (components of which are listed under Sample Deliverables) that will enable developers to build on low-code application platforms (LCAPs) like Mendix or Salesforce. The LCAP is being evaluated at this juncture.
Key Responsibilities:
- Facilitate conversations with Business Unit end users to elicit problem statements and co-create use cases.
- Provide guidance and thought leadership on solutions for use cases.
- Produce low-fidelity wireframes to validate with end users and to inform developers.
- Test prototypes with end users to elicit feedback and drive iterations.
- Establish and maintain the product backlog with the ASM Product Owner.
Key Skillset: Previous relevant industry experience (e.g., Manufacturing domain; Engineering-to-Order preferred); Design Thinking, Business Analysis, Design Principles, Agile Principles.
Sample Deliverables: Product Backlog comprising User Stories, User Flow, and Wireframes; Entity Relationship Diagram + Architecture Design (working with the Technical Lead).
If you are interested in exploring this further, we will be happy to provide more details about the role and discuss the next steps.
#c2c #Benchsales #BusinessAnalysis #BusinessAnalyst #BA #BSA #Manufacturing #Engineering OMSYST
-
Manoj Darla
### Immediate Requirement ###
Job Title: #Data Analyst
Location: Bentonville, AR (onsite)
Duration: 7 months
Rate: $32/hr on C2C
Job Description: The Sr. Data Analyst will provide and support the implementation of business solutions by building relationships, identifying business needs, and carrying out processes and practices. The role is in the client's Home Office and reports to the Senior Manager, Item Catalog.
• #Data Source Identification • #Data Strategy • Understanding Business Context • Tech. Problem Formulation • #Data Visualization • #Data Quality Management • Exploratory Data Analysis
What you'll do:
Data Collection and Analysis:
• Address business problems by understanding the data source's location, integrate the data, perform subsequent analysis using #SQL and #Python, and visualize the findings on a Tableau dashboard.
• Monitor the dashboard daily and flag any outliers that the data represents.
• Conduct exploratory analysis and inform the team and counterparts of the findings.
Business Partnerships:
• Prepare first drafts of executive presentations to communicate strategic recommendations to management based on research and data analysis.
• Review the findings based on feedback received from partners and provide insights tailored to the audience.
Model Development:
• Build and manage parts of algorithmic products/models and ensure each segment aligns with overall product goals after verification with the existing tools.
• Use the existing client tools to verify the findings and run simulations to test the thesis.
Proof of Concept and Prototype:
• Generate new ideas as an output of research, apply knowledge to come up with new information, and contribute to proofs-of-concept for projects.
Collaboration:
• Collaborate with other associates from the team and work with managers/specialists cross-functionally as needed.
Project Scope:
• Own small projects end-to-end to support the business area or category; own a defined part of a product, ensure the product is working and functioning, and identify new information to feed back up to the product owner.
Technical Skillsets (must): Python, SQL, GCP BQ, Tableau, and Excel.
What you'll bring:
Minimum education and experience required: Bachelor's degree or equivalent in Business, Engineering (any), Statistics, Economics, Analytics, Mathematics, Finance, or a related field and 5+ years of experience in data analysis, data science, statistics, or a related field; OR a Master's degree or equivalent in those fields and 3+ years of such experience.
Skills required:
• #Python, #SQL, #GCP BQ, and #Tableau expertise
• Desire to learn the client's item catalog business
• Presentation and verbal communication skills
• Previous #Supply Chain and item catalog experience
Anyone interested, please share your updated resume at manojd@argyllinfotech.com
-
Prem Kumar
Hello Bench Sales, we have a position in Risk Management in San Antonio, TX (on-site). Client: TCS.
Job Description: We are seeking a highly skilled and experienced Senior Risk Manager to join our team at TCS in Dallas, TX, with responsibilities extending to our San Antonio office. The ideal candidate will utilize established risk management practices to address execution challenges related to quality, schedule, and costs, ensuring adherence to our risk management framework. This role demands a strategic mindset, strong program management capabilities, and proficiency in Agile DevOps for senior management.
Key Responsibilities:
- Risk Management: Utilize established risk management practices to identify, assess, and mitigate risks associated with quality, schedule, and costs.
- Strategic Implementation: Drive the implementation of moderately to highly complex projects that require confidentiality and have enterprise-level visibility, to achieve strategic business goals and operational objectives.
- Program Management: Lead and manage multiple projects and programs, ensuring they are completed on time, within budget, and to the highest quality standards.
- Cross-Functional Collaboration: Work closely with cross-functional teams and resources to achieve program milestones within established timeframes.
- Agile DevOps: Apply Agile DevOps practices to senior management processes to improve efficiency and effectiveness in project delivery.
- Stakeholder Communication: Maintain effective communication with stakeholders at all levels, providing updates on risk status and project progress.
- Continuous Improvement: Promote a culture of continuous improvement by identifying opportunities to enhance risk management practices and project outcomes.
Essential Skills and Qualifications:
- Program Management: Demonstrated experience in managing complex programs and projects, with a strong track record of achieving strategic business goals.
- Agile DevOps for Senior Management: Proficiency in Agile DevOps methodologies and their application in senior management settings.
- Confidentiality and Enterprise-Level Visibility: Ability to handle confidential information with discretion and manage projects with significant visibility within the organization.
- Cross-Functional Collaboration: Proven ability to work effectively with cross-functional teams to achieve project milestones.
- Risk Management Framework: In-depth knowledge of risk management frameworks and practices.
Desirable Skills:
- Strategic Goal Achievement: Experience in driving the implementation of projects that achieve strategic business goals and operational objectives.
- Operational Objectives: Ability to manage projects that require high levels of confidentiality and enterprise-level visibility.
- Communication and Influence: Strong communication skills, with the ability to influence stakeholders and manage expectations effectively.
Kindly share resumes to: prem.k@twsol.com
#Programmanager #program #risk #riskmanager #bench
-
Ankit Kalyan
Immediate-closure roles. Please share relevant profiles at AnkitK1@sysmind.com.
Type: W2 ONLY
Position: MS Dynamics 365 Consultant
Location: Augusta, ME (Remote)
Required Experience:
- Code enhancements, development programs, and/or required fixes to production problems using functional and technical programming standards; test enhancement and development programs.
- Participate in structured code reviews/walkthroughs; execute all required process steps; follow quality standards.
- Create and provide content for operational documentation to the Team Lead.
- Support installation of application releases into production as directed; communicate accurate and useful status updates.
- Ability to work in a team environment.
- Analyze and design enhancements, development programs, and/or required fixes to production problems; design applications to functional and technical programming standards; conduct structured walkthroughs.
- Work with Systems Analysts and the Team Lead to gather and interpret user requirements into design specifications; develop system specifications and interfaces.
- Determine time estimates and schedules for work; assist in managing and directing Application Team processes; assist the Team Lead or Test Team Lead in monitoring estimated-time-to-complete (ETC) and actuals for assigned tasks.
- Develop application designs in support of the system specifications and interfaces, perhaps in conjunction with application or technical architects.
- Operating-system expertise sufficient to perform performance and tuning diagnostics.
- Work with users to ensure that solutions meet business requirements; execute all responsibilities with little direct supervision from the Team Lead.
- Generally aware of new developments in industry and process, with the ability to apply them to work as appropriate; anticipate and resolve issues specific to the team.
- Review and understand the Application Team's workplan; anticipate, identify, track, and resolve issues and risks affecting your own work and the work of the Application Team; develop contingency plans as necessary; engage in ongoing process improvement.
- Detailed functional and process knowledge; deep modeling, design, and coding skills.
- Convert scientific, engineering, and other technical problem formulations to formats that can be processed by computer.
- Confer with other business and technical personnel to resolve problems of intent, inaccuracy, or feasibility of computer processing; work with interested personnel to determine the necessity for modifications or enhancements.
- Leverage excellent written and verbal communication skills to develop new business-process and programming solutions as directed by business and technical stakeholders.
- Experience with model-driven apps.
#W2 #CONTRACT #hiring #DYNAMICS365 #email #AUGUSTA #CRM #365 #STATECLIENT #POWERAPPS #MSD365 #MSDYNAMICS365
-
Solumtochukwu Ekpe
Join me as I navigate the fascinating world of advanced SQL! Today, I delved into the magic of window functions. 🌟 Exploring #Lead and #Lag was like taking a time machine through our data—zipping forward to peek at future values and stepping back to review past ones, all through the clever use of arguments in our queries. It's a thrilling ride through time and numbers! 🚀🕒 #SQLJourney #DataTimeTravel #SQL
This was the exercise I worked on 5/8/2024. Udemy class: https://lnkd.in/gDmvV-Qt

-- LEAD and LAG - Exercises
-- Exercise 1: Create a query with the following columns:
--   "PurchaseOrderID", "OrderDate", and "TotalDue" from the Purchasing.PurchaseOrderHeader table
--   "Name" from the Purchasing.Vendor table, aliased as "VendorName"
--   (Join Purchasing.Vendor to Purchasing.PurchaseOrderHeader on BusinessEntityID = VendorID)
-- Criteria: the order must have taken place on or after 2013, and TotalDue must be greater than $500.

SELECT a.PurchaseOrderID,
       a.OrderDate,
       a.TotalDue,
       b.Name AS VendorName
FROM Purchasing.PurchaseOrderHeader AS a
JOIN Purchasing.Vendor AS b
  ON a.VendorID = b.BusinessEntityID
WHERE YEAR(a.OrderDate) >= 2013
  AND a.TotalDue > 500;

-- Exercise 2: Modify the query from Exercise 1 by adding a derived column called
-- "PrevOrderFromVendorAmt" that returns the "previous" TotalDue value (relative to the
-- current row) within the group of all orders with the same vendor ID, where "previous"
-- is defined by order date.

SELECT a.PurchaseOrderID,
       a.OrderDate,
       a.TotalDue,
       b.Name AS VendorName,
       LAG(a.TotalDue, 1) OVER (PARTITION BY a.VendorID ORDER BY a.OrderDate) AS PrevOrderFromVendorAmt
FROM Purchasing.PurchaseOrderHeader AS a
JOIN Purchasing.Vendor AS b
  ON a.VendorID = b.BusinessEntityID
WHERE YEAR(a.OrderDate) >= 2013
  AND a.TotalDue > 500;

-- Exercise 3: Modify the query from Exercise 2 by adding a derived column called
-- "NextOrderByEmployeeVendor" that returns the "next" vendor name (the "Name" field from
-- Purchasing.Vendor) within the group of all orders that share the same EmployeeID in
-- Purchasing.PurchaseOrderHeader, again with "next" defined by order date.

SELECT a.PurchaseOrderID,
       a.OrderDate,
       a.TotalDue,
       b.Name AS VendorName,
       LAG(a.TotalDue, 1) OVER (PARTITION BY a.VendorID ORDER BY a.OrderDate) AS PrevOrderFromVendorAmt,
       LEAD(b.Name, 1) OVER (PARTITION BY a.EmployeeID ORDER BY a.OrderDate) AS NextOrderByEmployeeVendor
FROM Purchasing.PurchaseOrderHeader AS a
JOIN Purchasing.Vendor AS b
  ON a.VendorID = b.BusinessEntityID
WHERE YEAR(a.OrderDate) >= 2013
  AND a.TotalDue > 500;
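The LAG/LEAD behavior in the exercises above can be checked quickly outside SQL Server too. Here is a minimal, self-contained sketch using SQLite's window functions (available in Python's bundled sqlite3 module when SQLite is 3.25+), with a hypothetical toy orders table standing in for the AdventureWorks Purchasing tables:

```python
import sqlite3

# Toy stand-in for the Purchasing tables in the exercises (hypothetical data).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (vendor TEXT, order_date TEXT, total_due REAL)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?, ?)",
    [
        ("Acme", "2013-01-01", 600.0),
        ("Acme", "2013-02-01", 900.0),
        ("Acme", "2013-03-01", 750.0),
        ("Bolt", "2013-01-15", 520.0),
    ],
)

# LAG peeks at the previous row and LEAD at the next row within each vendor's
# date-ordered partition; rows at the partition edges get NULL (None in Python).
rows = conn.execute(
    """
    SELECT vendor, order_date, total_due,
           LAG(total_due, 1)  OVER (PARTITION BY vendor ORDER BY order_date) AS prev_amt,
           LEAD(total_due, 1) OVER (PARTITION BY vendor ORDER BY order_date) AS next_amt
    FROM orders
    ORDER BY vendor, order_date
    """
).fetchall()

for row in rows:
    print(row)
# ('Acme', '2013-01-01', 600.0, None, 900.0)   <- first order: no previous value
# ('Acme', '2013-02-01', 900.0, 600.0, 750.0)
# ('Acme', '2013-03-01', 750.0, 900.0, None)   <- last order: no next value
# ('Bolt', '2013-01-15', 520.0, None, None)    <- lone order in its partition
```

Note that partitioning restarts the time machine per vendor: Bolt's single order sees neither a previous nor a next value, which mirrors how PrevOrderFromVendorAmt behaves per VendorID in the exercises.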
-
Aryan Rajesh
INCORPORAN INC is looking for a Senior Data Modeler for our client; the location is REMOTE. Please send your resume and contact details to rajeshkumar@incorporaninc.com or call Rajesh at 609-474-4722.
Job Description:
- The Senior Data Modeler is responsible for converting an understanding of key objects and processes in the business into a coherent digital structure, a "data model". This data model serves as the information-architecture blueprint for systems that create, store, manipulate, retrieve, and display information for various business functions.
- Expert in Entity-Relationship modeling, with specific expertise in Star/Snowflake Schema modeling.
- Expert in one or more enterprise-level modeling tools: IDA, ERwin, ER/Studio, etc.
- Advanced understanding of data normalization and denormalization techniques.
- Expert in metadata management and tools.
- Advanced experience with Master Data management/systems and Reference Data management/systems.
- Advanced experience with modeling-process creation and implementation.
- Expert in understanding business needs and translating those needs into an effective model.
- Skilled in analyzing existing business-system data, data profiling, and source analysis.
- Advanced experience with ETL architecture and warehouse implementations, and experience with enterprise-level ETL tools.
- Expert in designing and implementing data-quality-assurance processes to ensure ETL is correctly sourcing, translating, and populating data.
- Advanced understanding of physical database/data-storage principles, including Google Cloud Platform as well as the RDBMSs used by COTS products.
- Advanced industry experience with and understanding of key healthcare data from EHRs, ERPs, LIS, patient portals, practice management, MPIs, etc. (preferred but not mandatory).
- Expert oral and written communication skills.
- Strong interpersonal skills, including the ability to lead and facilitate discussions with business and technical stakeholders, from executive-level leadership to daily hands-on contributors.
- Expert in time management and self-directed activity.
- Working Hours: primarily Central Time, 8 am to 5 pm, with flexibility depending on workload and deliverables.
- Interview Process: conducted through Google Meet; two rounds, with a possible third round if there is a lot of competition. Each interview will last a minimum of one hour.
- Training: Environmental training will be provided, which is Ascension-based.
- Experience: a minimum of 5 to 7 years of experience is required for this role. These kinds of skills take almost a decade to build.
- Interaction: the role involves interaction with multiple teams within the organization.
-
Bharat Mandora
Hiring Sr. Data Engineers for an IT consulting company that offers customized software solutions for businesses based on data, machine learning, analytics, and security.
Role: Senior Data Engineer (with any cloud platform, AWS/GCP)
Job Requirements:
- Strong experience as a data engineer (Databricks preferred).
- Expert proficiency in Spark Scala, advanced SQL, and Python (mandatory); PySpark is a plus.
- Must have data-migration experience from on-prem to cloud.
- Hands-on experience with Kinesis to process and analyze streaming data, and with cloud DynamoDB.
- In-depth understanding of cloud, data lake, and analytics solutions.
- Expert-level hands-on development in designing and developing applications on Databricks; experience with Databricks Workflows, cloud-managed Airflow, and Apache Airflow is required.
- Hands-on experience with the industry technology stack for data management, ingestion, capture, processing, and curation: Kafka, StreamSets, Attunity, GoldenGate, MapReduce, Hadoop, Hive, HBase, Cassandra, Spark, Flume, Impala, etc.
- Knowledge of different programming and scripting languages.
- Good working knowledge of code-versioning tools such as Git, Bitbucket, or SVN.
- Hands-on experience using Spark SQL with various data sources like JSON, Parquet, and key-value pairs.
- Experience preparing data for use in SageMaker and Databricks.
- Demonstrated experience preparing data and automating and building data pipelines for AI use cases (text, voice, image, IoT data, etc.).
- Experience creating tables, partitioning, bucketing, loading, and aggregating data using Spark Scala and Spark SQL/PySpark.
- Strong understanding of data modeling and defining conceptual, logical, and physical data models.
Responsibilities:
- Work closely with team members to lead and drive enterprise solutions, advising on key decision points, trade-offs, best practices, and risk mitigation.
- Manage data-related requests, analyze issues, and provide efficient resolution; design all program specifications and perform required tests.
- Design and develop data ingestion using Glue, any cloud-managed Airflow, or Apache Airflow, and the processing layer using Databricks.
- Work with the SMEs to implement data strategies, build data flows, and prepare code for all modules according to the required specifications.
- Monitor all production issues and inquiries and provide efficient resolution.
- Evaluate all functional requirements, map documents, and troubleshoot all development processes.
#Dataengineers #etldeveloper #databricksdeveloper #databricks #Python #sql #pyspark #Dataextraction #datamanipulation #hackerrankcertified #OOPS #advancesql #objectorientedprogramming #awsdataengineer #gcpdataengineer #pythoncoding #pythonprogramming #sqlquerries
-
anil anasuri
WE ARE HIRING: DATABASE ADMINISTRATOR
LOCATION: LANSING, MI
HYBRID: 2 DAYS/WEEK
DURATION: 12+ MONTHS
NOTE: LOCAL CANDIDATES ONLY
JOB DESCRIPTION
We are looking for a DBA to join our team. They will need experience with formal change-control environments and strong troubleshooting skills. Our environment is predominantly Sybase with SQL Server, and we are moving to pure SQL Server in approximately 3 years. The Sybase environment is 15.7 on Solaris, on x86 hardware within our local network. The SQL Server environment is mixed versions of SQL Server on mixed versions of Windows Server, with an upcoming project to update versions; it includes servers both in our network and in Azure. A project to move off Sybase is underway, hence the need for the selected candidate to have some experience with both Sybase and SQL Server.
Our primary goal is to have an additional individual assist with Sybase 15.7. The candidate will need a Unix background, including running and developing cron jobs from the command line, and should be familiar with Sybase ASE 15.7 (this can be at the level of a junior DBA). The desire is a DBA who can assist with tasks, particularly during after-hours implementations, that the senior DBAs need completed, reducing the overall time required per DBA. This position will also help with other tasks during the week, such as the weekly data fixes on Thursday and Friday.
The secondary goal is to take work off the SQL Server DBAs. This requires some experience with Windows Server and SQL Server. Additionally, the person will work with clusters and with disaster recovery in our development, test, and production environments. Migration experience from Sybase to SQL Server is strongly desired.
Individual tasks related to the duty:
- Research, draft, and recommend database standards and policies.
- Establish standards and guidelines for database space allocation based on best practices, implementation considerations, business requirements, and anticipated workload.
- Review the results of database integrity checks and resolve identified issues.
- Design an appropriate strategy to reorganize database objects to release unused space or repair fragmentation.
- Review, install, analyze, and implement security patches to remain PCI-, vendor-, and OES-compliant.
- Design the security model based on set standards; create database user accounts and schemas in production environments based on defined and approved forms and procedures.
- Define/design roles and profiles.
- Recommend standard password security policies.
- Implement data encryption.
PLEASE REACH ME AT anil@khpsolutions.com or +1 469-277-7933
#database #admin #c2c #w2 #DBA #sybase #sql #azure #2024jobs
Shalini S.
Hi, Capital One – Full Stack Python Dev. Only local to VA; please don't share nonlocal profiles.
Details:
- Hybrid (1680 Capital One Dr, McLean, VA, 22102-3407, USA)
- 12-month contract to hire
Please share suitable resumes to shalini@vstconsulting.com or contact 732-268-5768.

Top Skills:
- 2-4 years Python
- 2-4 years React JS
- 2-4 years AWS: EC2, EMR, ECS, S3, SNS, Lambda, RDS, VPC
- Knowledgeable with SDLC best practices: CI/CD, unit testing, functional testing
- **Need experience moving workloads from legacy to serverless

Job Description
The Enterprise Detection group at Capital One is looking for strong full-stack Python developers (2 Senior, 1 Principal) with robust experience in ReactJS, AWS, and CI/CD, as well as unit and functional testing. Must be able to quickly adapt and contribute to other initiatives.

Additional Skills & Qualifications
- 2-4 years data warehousing and relational database experience (Snowflake, Oracle, or PostgreSQL)
- 2-4 years of experience with container services (Amazon EC2, AWS Fargate)

ENTERPRISE DETECTION
You will be working on the MetaBot platform, a key piece in Capital One's journey to get the most value out of data: it will make data easy to govern, easy to find, and easy to use. MetaBot communicates with Catalog, where all datasets need to be registered. MetaBot solves the problem of "How do we get all the different data sets in the company into the catalog and get them registered?" Its main goal is to find all unregistered data sets and bring them to the data owners so they can take action: register fully in Catalog, or clean up / create exclusion or deletion rules. There are still millions of unregistered data sets, and it is unclear who to present them to. Right now, users are presented with a UI showing millions of results; the process is manual, error-prone, and tedious, and can't keep up with the creation of new data sets. The goal for MetaBot is to autoregister: take out all manual intervention, so that whatever MetaBot detects goes into the autoregistration service, which fills in the data and registers it in the catalog.
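The autoregistration flow described above can be sketched roughly as follows. The post names MetaBot and Catalog but not their APIs, so every type, field, and rule here is a hypothetical stand-in for illustration only.

```python
"""Rough sketch of the MetaBot autoregistration flow described in the post.
All types and calls are hypothetical; only the flow itself comes from the post:
detect unregistered data sets, apply owner exclusion rules, register the rest."""
from dataclasses import dataclass

@dataclass
class Dataset:
    name: str
    owner: str

def autoregister(discovered, catalog, exclusion_rules):
    """Register every discovered dataset that is not already in the catalog
    and not covered by an exclusion rule; return the datasets registered."""
    registered = []
    for ds in discovered:
        if ds.name in catalog:                   # already registered: skip
            continue
        if any(rule(ds) for rule in exclusion_rules):
            continue                             # owner opted this dataset out
        catalog[ds.name] = ds.owner              # "fills in data, and registers it"
        registered.append(ds)
    return registered

# Example: three detected datasets; one is already registered,
# one is excluded by a (hypothetical) temp-table rule.
catalog = {"sales_2023": "team-a"}
found = [Dataset("sales_2023", "team-a"),
         Dataset("tmp_scratch", "team-b"),
         Dataset("clicks_raw", "team-c")]
rules = [lambda ds: ds.name.startswith("tmp_")]
new = autoregister(found, catalog, rules)
print([ds.name for ds in new])  # -> ['clicks_raw']
```

The exclusion-rule hook is the interesting design point: it is what turns the "millions of results in a UI" problem into a set of reusable owner decisions.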
Randhir kumar
#HiringImmediate #c2c // #w2 // Contract // USA. Please share resumes to Randhir.Kumar@quantumworldit.com
Role: Payments Business Analyst
Location: Dallas, TX

Mandatory Skills Required:
- Payments, corporate banking product management, and ISO 20022
- Payment expertise in Fedwire, RTP, and SWIFT
- Detailed understanding of payment hubs, with in-depth knowledge of ISO 20022 compliance requirements (MT-to-MX message mapping, the new Fedwire tax payment ISO requirement, interfaces with OFAC and AML compliance systems)

The business team is looking for BAs for a specific initiative related to CAD (Cash Against Document), our tool to support adjustment processing for various payment systems and processes. The operations team needs support to evaluate the current adjustment scenarios, better understand the root causes, and define the target state, so they need BAs with strong business analysis skills and an understanding of accounting and banking money movements. The business team is hoping for candidates with business expertise and some back-office/operations understanding, more than technical expertise. They need to understand the reasons we are using CAD (impact to accounting, issues with exception processing, issues with coexistence) and help develop plans for solutions. The work will start with an analysis of when and why CAD is used across multiple work streams from a business standpoint, and then tie into the technical reasons why the work has to be done in CAD versus the core system.

Planned activities:
- Assess a broad range of CAD system adjustment scenarios, bifurcating appropriate and inappropriate usage of the tool for BAU activities.
- Prioritize and identify high-impact/priority CAD adjustment scenarios, document their root causes, and define target-state solutions to resolve CAD usage in the future for the in-scope components.
- Build a repeatable framework to propagate this current-state analysis and target-state resolution approach to additional CAD adjustment scenarios not in scope for this phase (artifacts, steps, etc.).
- Identify additional quick wins across CAD and/or other stakeholder groups, if available.

Deliverables:
- Prioritization matrix of in-scope CAD components
- Automation and process enhancement assessment
- Framework for CAD adjustment assessment
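To make the "MT to MX message mapping" requirement above concrete, here is a toy illustration of mapping a few MT103 fields onto pacs.008-style element paths. The element names follow ISO 20022 pacs.008 conventions, but the field subset and correspondences are heavily simplified; real Fedwire/SWIFT translation covers far more fields and validation.

```python
"""Toy MT103 -> pacs.008 mapping of the kind the posting's ISO 20022 work
involves. Simplified for illustration; not a complete or authoritative map."""

# A few MT103 tags and the pacs.008 elements they roughly correspond to.
MT103_TO_PACS008 = {
    "20":  "PmtId/InstrId",      # sender's reference
    "32A": "IntrBkSttlmAmt",     # value date / currency / amount
    "50K": "Dbtr/Nm",            # ordering customer
    "59":  "Cdtr/Nm",            # beneficiary customer
}

def map_mt103(fields: dict) -> dict:
    """Map a parsed MT103 field dict onto pacs.008-style element paths,
    collecting any tags with no mapping so they can be flagged for review."""
    mapped, unmapped = {}, []
    for tag, value in fields.items():
        path = MT103_TO_PACS008.get(tag)
        if path is None:
            unmapped.append(tag)   # a BA would document these gap cases
        else:
            mapped[path] = value
    return {"mapped": mapped, "unmapped": unmapped}

msg = {"20": "REF20240101", "32A": "240101USD1000,00",
       "59": "ACME CORP", "71A": "OUR"}
result = map_mt103(msg)
print(result["mapped"]["Cdtr/Nm"])  # -> ACME CORP
print(result["unmapped"])           # -> ['71A']
```

The "unmapped" bucket mirrors the analysis work the posting describes: cataloging cases the straightforward mapping cannot handle and defining a target state for them.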
Ravikumar Danda
Share resumes to d.ravikumar@canopyone.com
NOTE: NEED CANDIDATES #LOCAL TO #DENVER, CO ONLY
Position: #PowerBIDeveloper
Location: #Denver, #Colorado (#Hybrid; candidate must be within 50 miles)

Requirements Gathering: Collaborates with the business and with data analysts to understand requirements and business objectives for the relevant reports and visualizations, as well as the underlying data model. Capable of identifying relevant data sources and attributes for use in the model.

#DataModeling: Designing and documenting efficient conceptual, logical, and physical data models to support business requirements. Understands how to deliver row- and column-level security within the Power BI semantic model. Able to establish reusable semantic data models within Power BI. Familiar with running Tabular Editor's Best Practice Analyzer to adhere to modeling and DAX best practices. Creating relationships between tables and optimizing data structures.

#DataSourceConnectivitySetup: Connecting to various data sources (APIs, BigQuery, ArcGIS map and feature layers, SQL databases, Excel files, etc.), including streaming datasets. Ensuring seamless data extraction and transformation. Collaborating with members of the OIT Integrations team to ascertain available integration patterns.

#DAXFormulas: Proficiency in writing #DAX (Data Analysis Expressions) formulas: calculated columns, measures, and custom calculations.

Report Creation: Designing and developing interactive and visually appealing reports and dashboards. Incorporating slicers, visuals, and drill-through #functionality, including streaming and #ArcGIS visuals. Must be cognizant of and adhere to accessibility standards (Section 508). Capable of optimizing Power BI reports, or suggesting changes to the underlying data model, for performance, including the use of #DAXStudio and built-in #PowerBI performance tools.

The developer is expected to follow organizational patterns for source and version control of #PowerBI files and to use #AzureDevOps as appropriate within that process.
Md Arfin
Hi Vendors, I'm updating my vendor list to share daily C2C requirements from direct clients and implementation partners. If you'd like to receive these opportunities, please drop your email ID in the comments below, and I'll add you to my distribution list for regular updates. #C2C #DailyRequirements #Vendorlist