Sharing the updated list of #CONSULTANTS who have been on projects in the past and are now actively looking for new projects. Please reach out to any of the contacts below with your requirements:

Dilip - dilip@nam-it.com - Phone +1 732-743-8373
Lisha - lisha@nam-it.com - Phone +1 732-993-7485
John - john@nam-it.com - Phone +1 732-993-5322

#c2crequirements #usrecruitment #jobsearch #javajobs #hadoop #bigdata #automationengineer #uideveloper #datanalytics #cloudcomputing #saphybris #saphana #oracleebs #oraclefinancials #dotnetdevelopers #awssolutionarchitect #saphcm #sharepointdeveloper #informatica #dataengineer #javafullstack #angularjobs #scrummaster #frontenddeveloper #qamanager #recruitment #sapbi #datasciencecareers #devopsjobs #devops #hotlist #benchlist

Vinay Mahajan, B.J. Venkatesh, Abhimanyu Diwaker, Shyam Valloornatt, Balaji Krishnamoorthy, Srikanth M, Srini V S, Venki R, Lisha, Akshay, Smital Dhavane, Divya Shetty, Pooja K, Praveen R, Gowda B J, Lokesh, Thirupathi S, Jawaharlal Nehru G, Arvind Sharma K, Manjunath MS, Chandra M, Shilpa Pradeep, Abishai C, Aravinth C, Benjamin Samuel, ITServe Alliance, Punjabi Chamber of Commerce
NAM Info Inc’s Post
-
Hello #everyone, Greetings! I'm #hiring for our client's project; get in touch if you are interested in this project in #Madrid, #Spain.

#Languages: #Spanish and #English. #Contract duration: 12+ #months, #hybrid, based in #Madrid, Spain.

#Role 1: #Data Architect
- Experience in designing, defining and architecting large data migrations
- A comprehensive understanding of data warehousing and data transformation (extract, transform and load) processes and the supporting technologies such as #Amazon Glue, #EMR, #Azure #Data #Factory (#ADF), Data Lake and other analytics products
- Proven experience in #architecting and #implementing Business Intelligence and Data Warehouse platforms, Master Data Management, data integration and OLTP database solutions
- Designing architecture solutions that are in line with long-term business objectives
- Excellent problem-solving and data modelling skills

#Role 2: #Informatica #Developers
- Analyse a #business's database storage and #warehousing capabilities and assess the company's data requirements
- Build Informatica mappings: collections of source and target objects linked together by a set of transformations (ETL)
- 5-7+ years of hands-on experience with design and development of Informatica applications
- Good hands-on experience in the following Informatica concepts: performance tuning, data modeling and data #warehousing concepts, reusable transformations, partition techniques
- Strong database knowledge: #Oracle PL/SQL and tuning
- Expert in writing Oracle procedures, functions and packages and debugging the code
- Provide technical support for issues, including hands-on #troubleshooting
- Informatica: #Mappings, #Workflows, Sessions, Reusable components, Parameters and Variables, #Performance tuning

#Role 3: #DataMigration Specialist (#Functional understanding)
- Migration strategy, IFF specifications, data migration scope, workstream plan, data migration #mappings
- Load data into the deployed solution using tools built on #SQL Server Integration Services
- Excellent skill in understanding #data models and objects and mapping them to the target #schema
- Excellent #ETL skills with knowledge of #Informatica and other tools such as Talend
- Converting database objects, applying code to the #PostgreSQL schema, creating DMS users and modifying DMS tasks
- #Migrating data and fixing defects
- #AWS DMS, SQL, Oracle DB, PostgreSQL DB
- Informatica: Mappings, Workflows, Sessions, Reusable components, Parameters and Variables, Performance #tuning

#Share your updated CV with me at sonu.kumar@staffinprime.com
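The extract-transform-load flow that all three roles revolve around can be sketched in a few lines. This is a minimal, purely illustrative example in Python with SQLite (the table and column names are my own, not from the post): extract rows from a staging table, apply a transformation, and load the result into a dimension table.

```python
import sqlite3

# Hypothetical staging and target tables; names are illustrative only.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE staging_customers (id INTEGER, name TEXT, country TEXT)")
cur.executemany("INSERT INTO staging_customers VALUES (?, ?, ?)",
                [(1, "Ana", "es"), (2, "Luis", "ES"), (3, "Marta", "Es")])
cur.execute("CREATE TABLE dim_customer (id INTEGER PRIMARY KEY, name TEXT, country_code TEXT)")

# Extract: pull the raw rows out of staging
rows = cur.execute("SELECT id, name, country FROM staging_customers").fetchall()
# Transform: normalize the country code to upper case
transformed = [(i, n, c.upper()) for i, n, c in rows]
# Load: write the cleaned rows into the target dimension
cur.executemany("INSERT INTO dim_customer VALUES (?, ?, ?)", transformed)
conn.commit()

codes = sorted({c for (c,) in cur.execute("SELECT DISTINCT country_code FROM dim_customer")})
print(codes)  # ['ES']
```

In a real Informatica or Glue pipeline the transform step would be a mapping or job rather than a list comprehension, but the staging-to-target shape is the same.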
-
ETL Developer at UOB || Ex-Zensar Technologies || SQL || UNIX || Informatica || Oracle || Control-M
Creating an iDoc Mapping in Informatica Cloud Data Integration: #etldeveloper #informaticadeveloper #informaticapowercenter #iics #cloud #clouddataintegration #mapping #idoc #informatica #scdtype2 #etltools #sqldeveloper #oracle #database #datastage #infosys #teradata #hcl #hcltech #capgemini #cgi #techmahindra #wipro #tech #massmutual #snowflake #etltesting #etljobs #Deloite #data
-
SQL Transformation in Informatica Cloud is used to call a stored procedure or function, or to execute SQL queries midstream in a mapping pipeline. It can be used either as a connected or an unconnected transformation. 🌟 Connected SQL Transformation: A connected SQL transformation is an inline transformation that sits in the flow of the mapping. It can be used to: ✅ Run stored procedures as data flows through the transformation. ✅ Pass parameters to the stored procedure and receive single or multiple output parameters. 🌟 Unconnected SQL Transformation: An unconnected SQL transformation is not connected to any transformation in a mapping and can be called using a :SP expression. An unconnected SQL transformation provides additional capabilities: ✅ Execute stored procedures before or after a mapping. ✅ Run a stored procedure once during a mapping. ✅ Conditionally run stored procedures using IIF statements. ✅ Call stored procedures multiple times within a single mapping. 📝 Check out the complete article on SQL Transformation: https://lnkd.in/g6RVWG-P 🔴 Here is the link to the YouTube video on SQL Transformation: https://lnkd.in/giaZekjE #ThinkETL #Informatica #InformaticaCloud #SQLTransformation #DataIntegration #StoredProcedures #ETL #IICS
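The "conditionally run a stored procedure with IIF" pattern above can be illustrated outside Informatica. The sketch below is a conceptual Python analogue, not Informatica syntax: `audit_proc` stands in for a stored procedure invoked via a :SP-style expression, and `iif` mimics IIF's lazy then-branch so the procedure only runs when the condition holds. All names are hypothetical.

```python
audit_log = []

def audit_proc(row_id):
    """Stand-in for a stored procedure called outside the main data flow."""
    audit_log.append(row_id)
    return "AUDITED"

def iif(condition, then_fn, else_value="IGNORED"):
    # The 'then' branch is a callable, so the procedure runs only
    # when the condition is true -- mirroring IIF(cond, :SP.proc(...)).
    return then_fn() if condition else else_value

rows = [{"id": 1, "amount": 500}, {"id": 2, "amount": 50}]
results = [iif(r["amount"] > 100, lambda r=r: audit_proc(r["id"])) for r in rows]
print(results)    # ['AUDITED', 'IGNORED']
print(audit_log)  # [1]
```

Only the high-value row triggers the procedure call, which is exactly the behaviour you get from a conditional unconnected SQL transformation.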
-
Job Description:
- Minimum 4 years of experience on Oracle's cloud-based analytics platforms, including OAC/ADW/ODI and/or FAW
- Strong hands-on expertise in OAC, including Analytics, Data Visualization and Semantic Model Development
- Very good development experience in OAC reports and dashboards using measures, filters, calculated measures, calculated items, etc.
- Must be able to carry out the report-testing process
- Experience migrating from OBIEE to OAC and between OAC instances
- Very good understanding of data warehousing concepts and data warehouse modeling
- Thorough hands-on experience with SQL (on any RDBMS source)
- Able to troubleshoot report errors and issues on OBIEE/OAC and understand the tool limitations of OAC
- Hands-on knowledge of building analyses and visualizations based on datasets created using SQL or Excel data sources
- Good knowledge of RPD modeling and usage of data modelers on OAC
- Experience in performance tuning OAC analyses: analyzing the Explain Plan of the query, tuning the data model, and making modifications to the tables such as indexing
- Good knowledge of coding, debugging, design and documentation
- Understanding of the flow of data between ERP and the Data Warehouse
- Preferably able to model and build BI Publisher reports
- Knowledge of PL/SQL, ODI or any ETL tool is preferable
- Work on multidimensional sources (such as Essbase) is a plus; any work on OTBI is a plus
- Expertise in the Oracle Analytics Cloud tool; knowledge of BIApps concepts is preferable
- Familiarity with upgrade activities and issues encountered during an OBIEE-to-OAC upgrade
- Knowledge of FAW (ERP and SCM), ADW, OAC (Classic, Data Visualization, Semantic Model Development) or ODI is a plus
- Knowledge of IoT and Blockchain is preferable
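The tuning loop the JD describes, read the query plan, then add an index and confirm the plan improves, can be demonstrated on any RDBMS. The sketch below uses SQLite from Python purely for illustration (the JD's context is OAC/Oracle, where you would read the Explain Plan instead); the table and index names are my own.

```python
import sqlite3

# Illustrative only: SQLite stands in for the warehouse database.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE sales (region TEXT, amount REAL)")
cur.executemany("INSERT INTO sales VALUES (?, ?)",
                [("EMEA", 10.0), ("APAC", 20.0), ("EMEA", 5.0)])

def plan(sql):
    # The last column of each EXPLAIN QUERY PLAN row is the detail text.
    return " ".join(d for *_, d in cur.execute("EXPLAIN QUERY PLAN " + sql))

query = "SELECT SUM(amount) FROM sales WHERE region = 'EMEA'"
before = plan(query)   # full table scan before the index exists
cur.execute("CREATE INDEX idx_sales_region ON sales (region)")
after = plan(query)    # the plan now searches via the new index

print(before)
print(after)
```

The same inspect-index-inspect cycle applies when tuning the tables behind an OAC analysis, just with Oracle's `EXPLAIN PLAN` and its optimizer output.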
Responsibilities:
- Use feedback and reflection to develop self-awareness and personal strengths, and to address development areas
- Proven track record as an SME in the chosen domain
- Ability to produce client POCs/POVs for integrating and increasing adoption of emerging tech, such as Blockchain and AI, with the associated product platform
- Mentor junior resources within the team; conduct KSS and lessons-learnt sessions
- Flexible to work on stretch opportunities/assignments
- Demonstrate critical thinking and the ability to bring order to unstructured problems
- Ticket quality and deliverables review; status reporting for the project; escalation/risk management
- Adherence to SLAs; experience in incident management, change management and problem management
- Review your work and that of others for quality, accuracy and relevance

Additional Information:
Mandatory skills: FAW with BI experience
Nice-to-have skills: ODI

How to Apply: Interested candidates are invited to submit their resume and cover letter to uma@vividtechnosolutions.com.
-
#Integrating PL/SQL and Informatica for #Seamless #Data #Processing and #Transformation

In the world of #data management, achieving seamless integration between various tools is key to building robust and efficient data solutions. One such powerful combination is #integrating #PLSQL with #Informatica. By leveraging the strengths of both, we can create a seamless data processing and transformation pipeline: PL/SQL's robust database management capabilities, paired with Informatica's powerful ETL processes, allow for the efficient handling of complex data workflows. Here's why this synergy is so impactful:

#Efficiency: Combining PL/SQL's direct database interactions with Informatica's streamlined ETL processes reduces data handling time and improves overall efficiency.
#Scalability: This integration supports large-scale data operations, ensuring that as data grows, the system remains performant and reliable.
#Flexibility: Informatica's flexible mapping and transformation tools, combined with PL/SQL's powerful querying and scripting capabilities, provide a versatile solution for various data scenarios.
#DataQuality: Enhanced data validation and transformation techniques ensure high data quality, crucial for accurate analytics and reporting.

In a recent project, integrating these tools allowed us to streamline our data workflows, significantly reducing ETL runtime while ensuring data accuracy and consistency. This approach not only enhanced performance but also provided a scalable solution for future data growth. Looking forward to exploring more integration techniques and hearing about your experiences with these tools!
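The core idea, push the transformation down into the database while the pipeline only orchestrates, can be sketched outside Oracle. In this hedged illustration, SQLite has no PL/SQL, so a Python UDF stands in for a database-side function; every name here is hypothetical.

```python
import sqlite3

# The UDF 'clean_phone' plays the role of a PL/SQL function: the
# transformation logic lives with the database, not in the pipeline.
conn = sqlite3.connect(":memory:")
conn.create_function("clean_phone", 1,
                     lambda s: "".join(ch for ch in s if ch.isdigit()))

cur = conn.cursor()
cur.execute("CREATE TABLE raw_contacts (name TEXT, phone TEXT)")
cur.executemany("INSERT INTO raw_contacts VALUES (?, ?)",
                [("Ava", "(732) 743-8373"), ("Raj", "732-993-7485")])

# The 'ETL' step only orchestrates: the heavy lifting happens in-database.
cur.execute("CREATE TABLE contacts AS "
            "SELECT name, clean_phone(phone) AS phone FROM raw_contacts")
phones = [p for (p,) in cur.execute("SELECT phone FROM contacts ORDER BY name")]
print(phones)  # ['7327438373', '7329937485']
```

With Oracle and Informatica, the split is the same shape: the mapping calls a PL/SQL procedure or function, and the database does the set-based work close to the data.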
#PLSQL #Informatica #DataIntegration #DataProcessing #ETL #DataManagement #TechiePosts #humanresources #hr #jobinterviews #hiringandpromotion #jobalert #nowhiring #job #gethired #jobopening #jobfair #recruiting #hiring #joinourteam #jobs #jobhiring #remotework #jobsearch #jobsearching #jobseekers #workingathome #hire #opentowork #hireme #jobhunt #jobseeker #recruitment #technology #deeplearning #homeoffice #culture #plsql #database #oracle #sqlserver #mssqlserver
-
Hello folks, hope you are doing well 😊! I would like to share one of my solid resources, a strong Oracle PL/SQL Developer who is looking for projects only in Southern California. She can work onsite from day 1 and is available immediately for her next venture.

- Over 7+ years of experience in application software design, development and management on Oracle 19c, 12c, 11g, 10g, SQL Server, Snowflake, DBT, WebLogic, JEUS, WebtoB, Airflow, Informatica, SharePoint, C#, Crystal Reports/Crystal Enterprise 9.0/10.0/XI, SSIS, SSRS, Business Objects, MicroStrategy, Tableau and Power BI.
- Involved in the design phases of projects and well experienced in leading database and application teams.
- Efficient in designing and developing conceptual, logical and physical data models and ERD diagrams following data integrity, redundancy, transparency, auditability, accountability, standardization, MDM, CDM, risk management and other data governance principles.
- Proficient in implementing normalization and de-normalization techniques on relational (OLTP) and dimensional (OLAP) data models.
- Well experienced in designing and developing data warehouses with Star Schema, Snowflake Schema, Multi-Star Schema, Conformed Dimensions, Slowly Changing Dimensions and data marts.
- Strong RDBMS concepts and well experienced in creating database objects such as tables, views, sequences, triggers and collections, taking performance and reusability into consideration.
- Very proficient in writing stored procedures, standalone functions, nested functions, building packages and developing public and private sub-programs using PL/SQL and T-SQL, and providing documentation.
- Proficient in database tuning: reducing the cost and cardinality of SQL queries by using various hints and optimizers based on the Explain Plan.
- Improving the performance of SQL queries, functions, procedures, packages and database objects by creating normal, unique and bitmap indexes.
- Experienced with WebLogic 8.x, 9.x, 10.x and 11, JEUS 7.0, WebtoB, Apache HTTP Server and Apache Tomcat as an administrator: server installation, Node Manager configuration, application component deployment, cluster management, load balancing and performance tuning, including troubleshooting and maintenance.
- Migrated data from on-prem Oracle to the AWS cloud via S3 buckets using Airflow and DBT on Snowflake.
- Proficient in designing metadata-based ETL processes using Airflow and DBT for generic data movement.
- Designed end-to-end pipelines on Airflow integrating DBT models with a centralized notification process.
- Designed and developed SQL-based incremental and snapshot models on DBT to ingest data into the data warehouse.
- Involved in configuring multi-cluster scalable warehouse sizes on Snowflake.

Please let me know if you have any C2C roles. You can reach me at 732-666-0061 or email me at jagadish@aekyainc.com.
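Slowly Changing Dimensions, mentioned in the profile above, are worth a concrete sketch. This is a minimal, hypothetical Type 2 merge in plain Python (column names and dates are illustrative, not from the post): when a tracked attribute changes, the current row is expired and a new versioned row is appended, so history is preserved.

```python
from datetime import date

# Current dimension rows; 'to' is None and 'current' is True for the
# active version of each natural key. All values are made up.
dim = [
    {"id": 1, "city": "LA", "from": date(2024, 1, 1), "to": None, "current": True},
]

def scd2_merge(dim, incoming, as_of):
    """Apply an SCD Type 2 merge of incoming records as of a given date."""
    for rec in incoming:
        cur = next((r for r in dim if r["id"] == rec["id"] and r["current"]), None)
        if cur is None:
            # Brand-new key: insert as the first version
            dim.append({**rec, "from": as_of, "to": None, "current": True})
        elif cur["city"] != rec["city"]:
            # Attribute changed: expire the old version, append a new one
            cur["to"], cur["current"] = as_of, False
            dim.append({**rec, "from": as_of, "to": None, "current": True})
    return dim

scd2_merge(dim, [{"id": 1, "city": "SD"}, {"id": 2, "city": "SF"}], date(2024, 6, 1))
print(len(dim))  # 3 rows: expired LA, current SD, current SF
```

A DBT snapshot or an Informatica SCD2 mapping implements this same expire-and-append logic in SQL, keyed on the natural key and a change-detection column.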