4.0 - 9.0 years
8 - 12 Lacs
Hyderabad, Pune
Work from Office
Sr MuleSoft Developer
- Design and implement MuleSoft solutions using Anypoint Studio, Mule ESB, and related technologies.
- Collaborate with cross-functional teams to gather requirements and deliver high-quality ETL processes.
- Develop and maintain APIs using RAML and other industry standards.
- Strong understanding of RAML (RESTful API Modeling Language) and its use in API design.
- Develop complex integrations between various systems, including cloud-based applications such as Snowflake.
- Ensure seamless data flow by troubleshooting issues and optimizing existing integrations.
- Provide technical guidance on best practices for data warehousing, ETL development, and PL/SQL programming.
- Strong understanding of SQL concepts, including database schema design, query optimization, and performance tuning.
- Proficiency in developing complex ETL processes using cloud platforms and data warehousing tools (Snowflake).
- Experience working with multiple databases and the ability to write efficient PL/SQL code.
Posted 1 month ago
15.0 - 20.0 years
9 - 13 Lacs
Bengaluru
Work from Office
Project Role: Data Platform Engineer
Project Role Description: Assists with the data platform blueprint and design, encompassing the relevant data platform components. Collaborates with the Integration Architects and Data Architects to ensure cohesive integration between systems and data models.
Must-have skills: Databricks Unified Data Analytics Platform
Good-to-have skills: NA
Minimum 2 year(s) of experience is required.
Educational Qualification: 15 years full-time education
Summary: As a Data Platform Engineer, you will assist with the data platform blueprint and design, encompassing the relevant data platform components. Your typical day will involve collaborating with Integration Architects and Data Architects to ensure cohesive integration between systems and data models, while also engaging in discussions to enhance the overall data architecture and platform functionality.
Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation/contribution in team discussions is required.
- Contribute to providing solutions to work-related problems.
- Engage in continuous learning to stay updated with the latest trends and technologies in data platforms.
- Assist in the documentation of data architecture and integration processes.
Professional & Technical Skills:
- Must-have: Proficiency in Databricks Unified Data Analytics Platform.
- Strong understanding of data integration techniques and methodologies.
- Experience with cloud-based data solutions and architectures.
- Familiarity with data modeling concepts and practices.
- Ability to troubleshoot and optimize data workflows.
Additional Information:
- The candidate should have a minimum of 2 years of experience in Databricks Unified Data Analytics Platform.
- This position is based at our Bengaluru office.
- A 15 years full-time education is required.
Posted 1 month ago
3.0 - 5.0 years
5 - 7 Lacs
Bengaluru
Hybrid
Shift: (GMT+05:30) Asia/Kolkata (IST)
What do you need for this opportunity? Must-have skills required: Data Engineering, Big Data Technologies, Hadoop, Spark, Hive, Presto, Airflow, Data Modeling, ETL Development, Data Lake Architecture, Python, Scala, GCP (BigQuery, Dataproc, Dataflow, Cloud Composer), AWS Big Data Stack, Azure
Wayfair is looking for:
About the job: The Data Engineering team within the SMART org supports development of large-scale data pipelines for machine learning and analytical solutions related to unstructured and structured data. You'll have the opportunity to gain hands-on experience with all kinds of systems in the data platform ecosystem. Your work will have a direct impact on all applications that our millions of customers interact with every day: search results, homepage content, emails, auto-complete searches, browse pages, and product carousels. You will also build and scale data platforms that measure the effectiveness of Wayfair's ad costs and support media attribution, which informs day-to-day and major marketing spend decisions.
About the Role: As a Data Engineer, you will be part of the Data Engineering team. The role is inherently multi-functional, and the ideal candidate will work with Data Scientists, Analysts, and application teams across the company, as well as all other Data Engineering squads at Wayfair. We are looking for someone with a love for data, the ability to understand requirements clearly, and the ability to iterate quickly. Successful candidates will have strong engineering and communication skills and a belief that data-driven processes lead to phenomenal products.
What you'll do:
- Build and launch data pipelines and data products focused on the SMART org, helping teams push the boundaries of insights, creating new product features using data, and powering machine learning models (a minimal Airflow sketch follows this listing).
- Build cross-functional relationships to understand data needs, build key metrics, and standardize their usage across the organization.
- Utilize current and leading-edge technologies in software engineering, big data, streaming, and cloud infrastructure.
What you'll need:
- Bachelor's/Master's degree in Computer Science or a related technical subject area, or an equivalent combination of education and experience.
- 3+ years of relevant work experience in the Data Engineering field with web-scale data sets.
- Demonstrated strength in data modeling, ETL development, and data lake architecture.
- Data warehousing experience with Big Data technologies (Hadoop, Spark, Hive, Presto, Airflow, etc.).
- Coding proficiency in at least one modern programming language (Python, Scala, etc.).
- Experience building and operating highly available, distributed systems for data extraction, ingestion, and processing, and query performance tuning skills for large data sets.
- Industry experience as a Big Data Engineer working alongside cross-functional teams such as Software Engineering, Analytics, and Data Science, with a track record of manipulating, processing, and extracting value from large datasets.
- Strong business acumen.
- Experience leading large-scale data warehousing and analytics projects, including GCP technologies (BigQuery, Dataproc, GCS, Cloud Composer, Dataflow) or related big data technologies in other cloud platforms such as AWS and Azure.
- Be a team player and introduce/follow best practices in the data engineering space.
- Ability to effectively communicate (both written and verbal) technical information and the results of engineering design at all levels of the organization.
Good to have: Understanding of NoSQL databases and Pub/Sub architecture setup. Familiarity with BI tools like Looker, Tableau, AtScale, Power BI, or similar.
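A minimal sketch of the GCP pipeline work this listing describes, assuming Airflow 2.4+ (e.g., on Cloud Composer) with the Google provider installed; the project, dataset, and bucket names are hypothetical placeholders, not real resources.

```python
# Minimal Airflow DAG: load a daily CSV export from GCS into BigQuery.
# Requires apache-airflow-providers-google; all names below are placeholders.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import GCSToBigQueryOperator

with DAG(
    dag_id="ad_cost_daily_load",              # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",                        # Airflow 2.4+ style schedule argument
    catchup=False,
) as dag:
    load_ad_costs = GCSToBigQueryOperator(
        task_id="load_ad_costs",
        bucket="example-marketing-exports",   # placeholder bucket
        source_objects=["ad_costs/{{ ds }}.csv"],  # one file per run date
        destination_project_dataset_table="example-project.marketing.ad_costs",
        source_format="CSV",
        skip_leading_rows=1,
        write_disposition="WRITE_APPEND",
    )
```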
Posted 1 month ago
5.0 - 10.0 years
7 - 12 Lacs
Mumbai, Ahmedabad
Work from Office
We are looking for a skilled and detail-oriented SAS Developer with 3-5 years of experience, proficient in SAS Visual Analytics (VA), Visual Investigator (VI), and Data Integration (DI). The candidate will work on high-impact projects for international clients, supporting solutions across business domains such as banking, financial services, and insurance. The ideal candidate should be open to working in international time zones when assigned to remote projects.
Key Responsibilities: Develop, enhance, and maintain SAS solutions using SAS VA, SAS VI, and SAS DI. Perform data extraction, transformation, and loading (ETL) processes using SAS DI Studio. Create interactive dashboards and reports using SAS Visual Analytics. Collaborate with business analysts, project managers, and end users to gather requirements and deliver technical solutions. Troubleshoot and optimize existing SAS code and processes for performance and scalability. Ensure data quality and integrity in reporting and analysis tasks. Support deployment, testing, and validation of SAS components. Work independently or as part of a team for global delivery in international client engagements. Follow best practices in documentation, version control, and development standards.
Qualifications: 3 to 5 years of hands-on experience in SAS development. Strong experience in SAS VA (Visual Analytics), SAS VI (Visual Investigator), and SAS DI (Data Integration). Good understanding of data warehousing concepts and ETL development. Familiarity with SQL and database platforms like Oracle, Teradata, or SQL Server. Excellent problem-solving skills and attention to detail. Strong communication and client interaction skills. Ability to work in international time zones (e.g., US, UK, or Middle East) when assigned remote projects. Bachelor's degree in Computer Science, Information Systems, or a related field.
Good to Have: Experience working in banking or credit risk domains. Exposure to cloud-based SAS solutions (e.g., SAS Viya).
Remote: Open for international client projects (must be flexible with working hours)
Joining: Immediate to 30 days preferred
Posted 1 month ago
8.0 - 10.0 years
15 - 19 Lacs
Bengaluru
Work from Office
Position: Solution Architect (ETL) | Location: Bangalore | Experience: 8 Yrs | CTC: As per industry standards | Immediate Joiners
# Job Summary
We are seeking an experienced Solution Architect (ETL) to design and implement data integration solutions using ETL (Extract, Transform, Load) tools. The ideal candidate will have a strong background in data warehousing, ETL, and data architecture.
# Key Responsibilities
1. Design and Implement ETL Solutions: Design and implement ETL solutions using tools such as Informatica PowerCenter, Microsoft SSIS, or Oracle Data Integrator.
2. Data Architecture: Develop and maintain data architectures that meet business requirements and ensure data quality and integrity.
3. Data Warehousing: Design and implement data warehouses that support business intelligence and analytics.
4. Data Integration: Integrate data from various sources, including databases, files, and APIs.
5. Data Quality and Governance: Ensure data quality and governance by implementing data validation, data cleansing, and data standardization processes.
6. Collaboration: Collaborate with cross-functional teams, including business stakeholders, data analysts, and IT teams.
7. Technical Leadership: Provide technical leadership and guidance to junior team members.
# Requirements
1. Education: Bachelor's degree in Computer Science, Information Technology, or a related field.
2. Experience: Minimum 8 years of experience in ETL development, data warehousing, and data architecture.
3. Technical Skills: ETL tools such as Informatica PowerCenter, Microsoft SSIS, or Oracle Data Integrator; data warehousing and business intelligence tools such as Oracle, Microsoft, or SAP; programming languages such as Java, Python, or C#; data modeling and data architecture concepts.
4. Soft Skills: Excellent communication and interpersonal skills; strong problem-solving and analytical skills; ability to work in a team environment and lead junior team members.
# Nice to Have
1. Certifications: Certifications in ETL tools, data warehousing, or data architecture.
2. Cloud Experience: Experience with cloud-based data integration and data warehousing solutions.
3. Big Data Experience: Experience with big data technologies such as Hadoop, Spark, or NoSQL databases.
# What We Offer
1. Competitive Salary: Competitive salary and benefits package.
2. Opportunities for Growth: Opportunities for professional growth and career advancement.
3. Collaborative Work Environment: Collaborative work environment with a team of experienced professionals.
Posted 1 month ago
5.0 - 8.0 years
9 - 13 Lacs
Mumbai
Work from Office
Skill required: Data Management - AWS Architecture
Designation: Data Eng, Mgmt & Governance Sr Analyst
Qualifications: BE/BTech
Years of Experience: 5 to 8 years
About Accenture: Combining unmatched experience and specialized skills across more than 40 industries, we offer Strategy and Consulting, Technology and Operations services, and Accenture Song, all powered by the world's largest network of Advanced Technology and Intelligent Operations centers. Our 699,000 people deliver on the promise of technology and human ingenuity every day, serving clients in more than 120 countries. Visit us at www.accenture.com
What would you do? (Data & AI) In this role, you will be responsible for designing, developing, implementing, and managing distributed applications and systems on the AWS platform. You will be responsible for ETL development, data analysis, technical design, and testing in an AWS environment.
What are we looking for? AWS, Python (programming language), PySpark; adaptable and flexible; ability to work well in a team; strong analytical skills; commitment to quality; agility for quick learning.
Roles and Responsibilities: In this role you are required to analyze and solve increasingly complex problems. Your day-to-day interactions are with peers within Accenture. You are likely to have some interaction with clients and/or Accenture management. You will be given minimal instruction on daily work/tasks and a moderate level of instruction on new assignments. Decisions made by you impact your own work and may impact the work of others. In this role you would be an individual contributor and/or oversee a small work effort and/or team.
Qualification: BE/BTech
Posted 1 month ago
5.0 - 10.0 years
5 - 9 Lacs
Chennai
Work from Office
Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-have skills: Ab Initio
Good-to-have skills: NA
Minimum 5 year(s) of experience is required.
Educational Qualification: 15 years full-time education
Summary: As an Application Developer, you will be responsible for designing, building, and configuring applications to meet business process and application requirements. You will collaborate with teams to ensure successful project delivery and application functionality.
Roles & Responsibilities:
- Expected to be an SME
- Collaborate and manage the team to perform
- Responsible for team decisions
- Engage with multiple teams and contribute on key decisions
- Provide solutions to problems for their immediate team and across multiple teams
- Lead and mentor junior professionals
- Conduct regular team meetings to discuss progress and challenges
- Stay updated on industry trends and best practices
Professional & Technical Skills:
- Must-have: Proficiency in Ab Initio
- Strong understanding of ETL processes
- Experience with data integration and data warehousing
- Knowledge of data quality and data governance principles
- Hands-on experience with Ab Initio GDE and EME tools
Additional Information:
- The candidate should have a minimum of 5 years of experience in Ab Initio
- This position is based at our Chennai office
- A 15 years full-time education is required
Posted 1 month ago
2.0 - 7.0 years
4 - 8 Lacs
Ahmedabad
Work from Office
Travel Designer Group
Founded in 1999, Travel Designer Group has consistently achieved remarkable milestones in a relatively short span of time. While we embody the agility, growth mindset, and entrepreneurial energy typical of start-ups, we bring with us over 24 years of deep-rooted expertise in the travel trade industry. As a leading global travel wholesaler, we serve as a vital bridge connecting hotels, travel service providers, and an expansive network of travel agents worldwide. Our core strength lies in sourcing, curating, and distributing high-quality travel inventory through our award-winning B2B reservation platform, RezLive.com. This enables travel trade professionals to access real-time availability and competitive pricing to meet the diverse needs of travelers globally. Our expanding portfolio includes innovative products such as Rez.Tez, Affiliate.Travel, Designer Voyages, Designer Indya, RezRewards, and RezVault. With a presence in 32+ countries and a growing team of 300+ professionals, we continue to redefine travel distribution through technology, innovation, and a partner-first approach.
Website: https://www.traveldesignergroup.com/
Profile: ETL Developer
ETL Tools (any one): Talend / Apache NiFi / Pentaho / AWS Glue / Azure Data Factory / Google Dataflow
Workflow & Orchestration (any one, good to have, not mandatory): Apache Airflow / dbt (Data Build Tool) / Luigi / Dagster / Prefect / Control-M
Programming & Scripting: SQL (advanced); Python (mandatory); Bash/Shell (mandatory); Java or Scala (optional, for Spark)
Databases & Data Warehousing: MySQL / PostgreSQL / SQL Server / Oracle (mandatory); Snowflake, Amazon Redshift, Google BigQuery, Azure Synapse Analytics, MongoDB / Cassandra (good to have)
Cloud & Data Storage (any 1-2): AWS S3 / Azure Blob Storage / Google Cloud Storage (mandatory); Kafka / Kinesis / Pub/Sub
Interested candidates can also share their resume at shivani.p@rezlive.com
Posted 1 month ago
7.0 - 9.0 years
12 - 15 Lacs
Hyderabad
Work from Office
We are seeking an experienced ETL Developer with a strong background in Python and Airflow to join our dynamic team in Hitech City, Hyderabad. The ideal candidate will have over 7 years of experience in ETL processes and data integration, with a focus on optimizing and enhancing data pipelines. While expertise in Snowflake is not mandatory, a strong understanding of RDBMS and SQL is essential.
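A minimal sketch of one common pipeline optimization this role touches on, an incremental (watermark-based) extract instead of a full reload; the table names, columns, and connection strings below are illustrative placeholders, not part of the listing.

```python
# Incremental ETL step: pull only rows newer than the last loaded watermark.
# Table, column, and connection details are illustrative placeholders.
import pandas as pd
from sqlalchemy import create_engine, text

source = create_engine("postgresql+psycopg2://user:pass@source-host/orders_db")
target = create_engine("postgresql+psycopg2://user:pass@warehouse-host/dw")

def load_orders_incrementally() -> int:
    # Find the newest timestamp already loaded into the staging table.
    with target.connect() as conn:
        watermark = conn.execute(
            text("SELECT COALESCE(MAX(updated_at), '1970-01-01') FROM staging.orders")
        ).scalar()

    # Extract only the rows changed since that watermark.
    df = pd.read_sql(
        text("SELECT * FROM public.orders WHERE updated_at > :wm"),
        source,
        params={"wm": watermark},
    )
    if not df.empty:
        df.to_sql("orders", target, schema="staging", if_exists="append", index=False)
    return len(df)
```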
Posted 1 month ago
5.0 - 8.0 years
25 - 30 Lacs
Pune, Gurugram, Bengaluru
Work from Office
NYU Manager - Owais | UR Delivery Manager - Laxmi
Title: Senior Data Developer with strong MS/Oracle SQL and Python skills and critical thinking
Description: The EDA team seeks a dedicated and detail-oriented Senior Developer I to join our dynamic team. The successful candidate will handle repetitive technical tasks, such as Healthy Planet MS SQL file loads into a data warehouse, monitor Airflow DAGs, manage alerts, and rerun failed processes. Additionally, the role requires monitoring various daily and weekly jobs, which may include generation of revenue cycle reports and data delivery to external vendors. The ideal candidate will have robust experience with MS/Oracle SQL, Python, Epic Health Systems, and other relevant technologies.
Overview: As a Senior Developer I on the NYU EDA team, you will play a vital role in improving the operation of our data load and management processes. Your primary responsibilities will be to ensure the accuracy and timeliness of data loads, maintain the health of data pipelines, and ensure that all scheduled jobs complete successfully. You will collaborate with cross-functional teams to identify and resolve issues, improve processes, and maintain a high standard of data integrity.
Responsibilities: Manage and perform Healthy Planet file loads into a data warehouse. Monitor Airflow DAGs for successful completion, manage alerts, and rerun failed tasks as necessary. Monitor and oversee other daily and weekly jobs, including FGP cash reports and external reports. Collaborate with the data engineering team to streamline data processing workflows. Develop automation scripts using SQL and Python to reduce manual intervention in repetitive tasks. Ensure all data-related tasks are performed accurately and on time. Investigate and resolve data discrepancies and processing issues. Prepare and maintain documentation for processes and workflows. Conduct periodic data audits to ensure data integrity and compliance with defined standards.
Skillset Requirements: MS/Oracle SQL; Python; data warehousing and ETL processes; monitoring tools such as Apache Airflow; data quality and integrity assurance; strong analytical and problem-solving abilities; excellent written and verbal communication.
Additional Skillset: Familiarity with monitoring and managing Apache Airflow DAGs.
Experience: Minimum of 5 years' experience in a similar role, with a focus on data management and process automation. Proven track record of successfully managing complex data processes and meeting deadlines.
Education: Bachelor's degree in Computer Science, Information Technology, Data Science, or a related field.
Certifications: Certifications in Epic Cogito, MS/Oracle SQL, Python, or data management are a plus.
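A minimal sketch of the DAG-monitoring part of this role, assuming Airflow 2.x with the stable REST API and basic auth enabled; the host, credentials, and DAG IDs are placeholders.

```python
# Poll the Airflow stable REST API for failed DAG runs so they can be triaged
# and re-run. Host, credentials, and DAG IDs below are placeholders.
import requests

AIRFLOW_URL = "https://airflow.example.org/api/v1"
AUTH = ("monitor_user", "monitor_password")  # placeholder credentials
WATCHED_DAGS = ["healthy_planet_file_load", "fgp_cash_report"]  # hypothetical DAG ids

def failed_runs(dag_id: str) -> list[dict]:
    resp = requests.get(
        f"{AIRFLOW_URL}/dags/{dag_id}/dagRuns",
        params={"state": "failed", "limit": 20},
        auth=AUTH,
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["dag_runs"]

if __name__ == "__main__":
    for dag in WATCHED_DAGS:
        for run in failed_runs(dag):
            # A failed run would then be cleared and re-run, e.g. via
            # `airflow tasks clear <dag_id> -s <start> -e <end>` or the UI.
            print(f"{dag}: run {run['dag_run_id']} failed at {run.get('end_date')}")
```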
Posted 1 month ago
3.0 - 8.0 years
20 - 30 Lacs
Hyderabad, Pune
Hybrid
Job Summary: Join our team and what we'll accomplish together. As an MDM Developer, you will be responsible for implementing and managing Master Data Management (MDM) projects. The ideal candidate will have extensive experience with Informatica MDM and proficiency in configuring MDM tools and integrating them with cloud environments. You will utilize your expertise in data engineering to build and maintain data pipelines, optimize performance, and ensure data quality. You will be working as part of a friendly, cross-discipline agile team who help each other solve problems across all functions. As a custodian of customer trust, you will employ best practices in development, security, accessibility, and design to achieve the highest quality of service for our customers. Our development team uses a range of technologies to get the job done, including ETL and data quality tools from Informatica, streaming via Apache NiFi, and Google-native tools on GCP (Dataflow, Composer, BigQuery, etc.). We also do some API design and development with Postman and Node.js. You will be part of the team building data pipelines that support our marketing, finance, campaign, and Executive Leadership teams, as well as implementing Informatica Master Data Management (MDM) hosted on Amazon Web Services (AWS). Specifically, you'll be building pipelines that support insights to enable our business partners' analytics and campaigns. You are a fast learner and a highly technical, passionate person looking to work within a team of multidisciplinary experts to improve your craft and contribute to the data development practice.
Here's how: Learn new skills and advance your data development practice. Analyze and profile data. Design, develop, test, deploy, maintain, and improve batch and real-time data pipelines. Assist with design and development of solution prototypes. Support consumers in understanding the data outcomes and technical design. Collaborate closely with multiple teams in an agile environment.
What you bring: You are a senior developer with 3+ years of experience in IT platform implementation in a technical capacity. Bachelor of Computer Science, Engineering, or equivalent. Extensive experience with Informatica MDM (Multi-Domain Edition) version 10. Proficiency in MDM configuration, including the Provisioning Tool, Business Entity Services, Customer 360, data modeling, match rules, cleanse rules, and metadata analysis. Expertise in configuring data models, match and merge rules, database schemas, and trust and validation settings. Understanding of data warehouse/cloud architectures and ETL processes. Working SQL knowledge and experience with relational databases and query authoring (SQL), as well as working familiarity with a variety of databases. Experience with the Google Cloud Platform (GCP) and its related technologies (Kubernetes, CloudSQL, PubSub, Storage, Logging, Dashboards, Airflow, BigQuery, BigTable, Python, BQ SQL, Dataplex, Datastream, etc.). Experience with Informatica IDQ/PowerCenter/IICS, Apache NiFi, and other related ETL tools. Experience with Informatica MDM preferred, but strong skills in other MDM tools are still an asset. Experience working with message queues like JMS, Kafka, and PubSub. A passion for data quality.
Great-to-haves: Experience with Informatica MDM SaaS. Experience with Python and software engineering best practices. API development using Node.js and testing using Postman/SoapUI. Understanding of TMF standards.
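A minimal sketch of the data-profiling step mentioned above, using the BigQuery Python client; the project, dataset, table, and column names are hypothetical.

```python
# Profile a source table before MDM matching: row count, distinct keys, null rate.
# Project, dataset, table, and column names are hypothetical placeholders.
from google.cloud import bigquery

client = bigquery.Client(project="example-mdm-project")

PROFILE_SQL = """
SELECT
  COUNT(*)                          AS row_count,
  COUNT(DISTINCT customer_id)       AS distinct_customer_ids,
  COUNTIF(email IS NULL) / COUNT(*) AS email_null_rate
FROM `example-mdm-project.raw.customers`
"""

row = next(iter(client.query(PROFILE_SQL).result()))
print(
    f"rows={row.row_count}, "
    f"distinct ids={row.distinct_customer_ids}, "
    f"email null rate={row.email_null_rate:.2%}"
)
```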
Posted 1 month ago
3.0 - 5.0 years
22 - 25 Lacs
Bengaluru
Work from Office
Job Description: We are looking for an energetic, self-motivated, and exceptional Data Engineer to work on extraordinary enterprise products based on AI and Big Data engineering, leveraging the AWS/Databricks tech stack. He/she will work with a star team of Architects, Data Scientists/AI Specialists, Data Engineers, and Integration.
Skills and Qualifications: 5+ years of experience in the DWH/ETL domain with the Databricks/AWS tech stack. 2+ years of experience in building data pipelines with Databricks/PySpark/SQL. Experience in writing and interpreting SQL queries, designing data models and data standards. Experience in SQL Server databases, Oracle, and/or cloud databases. Experience in data warehousing and data marts, Star and Snowflake models. Experience in loading data into databases from databases and files. Experience in analyzing and drawing design conclusions from data profiling results. Understanding of business processes and the relationships between systems and applications. Must be comfortable conversing with end-users. Must have the ability to manage multiple projects/clients simultaneously. Excellent analytical, verbal, and communication skills.
Role and Responsibilities: Work with business stakeholders and build data solutions to address analytical and reporting requirements. Work with application developers and business analysts to implement and optimise Databricks/AWS-based implementations meeting data requirements. Design, develop, and optimize data pipelines using Databricks (Delta Lake, Spark SQL, PySpark), AWS Glue, and Apache Airflow. Implement and manage ETL workflows using Databricks notebooks, PySpark, and AWS Glue for efficient data transformation. Develop and optimize SQL scripts, queries, views, and stored procedures to enhance data models and improve query performance on managed databases. Conduct root cause analysis and resolve production problems and data issues. Create and maintain up-to-date documentation of the data model, data flow, and field-level mappings. Provide support for production problems and daily batch processing. Provide ongoing maintenance and optimization of database schemas, data lake structures (Delta tables, Parquet), and views to ensure data integrity and performance.
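A minimal sketch of the kind of Databricks Delta pipeline described above, written as a notebook-style PySpark cell; the paths and table names are placeholders, and the `spark` session is assumed to be provided by the Databricks runtime.

```python
# Notebook-style PySpark cell: land raw JSON, clean it, and write a Delta table.
# The `spark` session is provided by the Databricks runtime; paths are placeholders.
from pyspark.sql import functions as F

raw = (
    spark.read.format("json")
    .load("s3://example-bucket/landing/orders/")   # placeholder landing path
)

cleaned = (
    raw.dropDuplicates(["order_id"])
    .withColumn("order_date", F.to_date("order_ts"))
    .filter(F.col("amount") > 0)
)

(
    cleaned.write.format("delta")
    .mode("overwrite")
    .partitionBy("order_date")
    .saveAsTable("analytics.orders_curated")        # hypothetical target table
)
```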
Posted 1 month ago
4.0 - 8.0 years
0 - 1 Lacs
Bengaluru
Remote
Offshore Senior Developer:
- Performs detailed design of complex applications and complex architecture components
- May lead a small group of developers in configuring, programming, and testing
- Fixes medium to complex defects and resolves performance problems
- Accountable for service commitments at the individual request level for in-scope applications
- Monitors, tracks, and participates in ticket resolution for assigned tickets
- Manages code reviews and mentors other developers
Skill/Experience/Education - Mandatory Skills: Google BigQuery development; ETL; SQL; Linux (preferable); SSIS package building and troubleshooting; advanced data modeling
Posted 1 month ago
5.0 - 8.0 years
10 - 20 Lacs
Chennai, Bengaluru
Work from Office
ODI Developer - Chennai/Bangalore. WFO only. 5-8 years of experience as an ETL Developer, with hands-on expertise in Oracle Data Integrator (ODI). ODI expertise is a must; profiles with ETL experience only in Informatica or other tools, without ODI, will not be considered. Proficiency in Oracle Database and MySQL, with strong skills in SQL and PL/SQL development. Experience in data integration, transformation, and loading from heterogeneous data sources. Strong understanding of data modeling concepts and ETL best practices. Familiarity with performance tuning and troubleshooting of ETL processes. Knowledge of scripting languages (e.g., Python, JavaScript) for automation is a plus. Excellent analytical and problem-solving skills. Strong communication skills to work effectively with cross-functional teams. Please call Varsha at 7200847046 for more information. Regards, Varsha
Posted 1 month ago
5.0 - 10.0 years
7 - 17 Lacs
Chennai, Bengaluru
Work from Office
5+ years of experience as an ETL Developer, with hands-on expertise in Oracle Data Integrator (ODI). Proficiency in Oracle Database and MySQL, with strong skills in SQL and PL/SQL. Experience in data integration, transformation, and loading from heterogeneous data sources.
Posted 1 month ago
5.0 - 10.0 years
15 - 25 Lacs
Pune
Work from Office
Develop and optimize Big Data solutions using Apache Spark. Work extensively with PySpark and Data Engineering tools. Handle real-time data processing using Kafka and Spark Streaming. Design and implement ETL pipelines and migrate workflows to Spark. Required Candidate profile Hands-on experience with Hadoop, HDFS, YARN. Strong programming skills in Scala, Java, and Python. Exposure to CI/CD automation for Big Data workflows.
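A minimal sketch of the Kafka-to-Spark streaming piece this listing mentions, using Structured Streaming and assuming the spark-sql-kafka connector package is available; brokers, topic, schema, and paths are placeholders.

```python
# Structured Streaming: read click events from Kafka and append them to Parquet.
# Broker addresses, topic, schema, and paths below are illustrative placeholders.
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.types import StructType, StructField, StringType, LongType

spark = SparkSession.builder.appName("click-stream-ingest").getOrCreate()

event_schema = StructType([
    StructField("user_id", StringType()),
    StructField("page", StringType()),
    StructField("ts", LongType()),
])

events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker1:9092,broker2:9092")  # placeholders
    .option("subscribe", "click_events")
    .load()
    .select(F.from_json(F.col("value").cast("string"), event_schema).alias("e"))
    .select("e.*")
)

query = (
    events.writeStream.format("parquet")
    .option("path", "hdfs:///data/curated/click_events")        # placeholder path
    .option("checkpointLocation", "hdfs:///checkpoints/click_events")
    .outputMode("append")
    .start()
)
query.awaitTermination()
```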
Posted 1 month ago
6.0 - 11.0 years
30 - 35 Lacs
Hyderabad, Delhi / NCR
Hybrid
Support enhancements to the MDM and Performance platform. Track system performance, troubleshoot issues, and resolve production issues.
Required Candidate profile: 5+ years in Python and advanced SQL, including profiling and refactoring. Experience with REST APIs and hands-on AWS Glue, EMR, etc. Experience with Markit EDM, Semarchy, or another MDM tool will be a plus.
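A minimal sketch of the hands-on AWS Glue side of this role, using boto3 to start and poll a Glue job run; the job name and region are placeholders, and credentials are assumed to come from the environment.

```python
# Start an AWS Glue job run and poll until it reaches a terminal state.
# Job name and region are placeholders; credentials come from the environment.
import time
import boto3

glue = boto3.client("glue", region_name="us-east-1")

run_id = glue.start_job_run(JobName="mdm_refresh_job")["JobRunId"]  # hypothetical job

while True:
    state = glue.get_job_run(JobName="mdm_refresh_job", RunId=run_id)["JobRun"]["JobRunState"]
    if state in ("SUCCEEDED", "FAILED", "STOPPED", "TIMEOUT"):
        break
    time.sleep(30)

print(f"Glue run {run_id} finished with state {state}")
```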
Posted 1 month ago
8.0 - 13.0 years
18 - 27 Lacs
Hyderabad
Hybrid
We are looking for an experienced ETL developer who will be responsible for the whole lifecycle of assigned ETL pipeline creation and maintenance tasks, from concept to PROD release, including: data analysis and requirements elicitation, implementing data pipelines, testing, gathering approvals, and migrating code. The ideal fit is a self-organized, result-oriented person who can work without supervision. The ideal candidate should have 10+ years of experience in the ETL/DB/DWH/Business Intelligence space.
Required skills: Strong SQL skills; ability to read, modify, and create complex queries, combined with good business analysis skills to interpret data, judge data quality, and summarize information accordingly. Experience in developing ETL / building data pipelines with ETL tools or programming languages. Preferably experienced in multiple DW/BI aspects, from ETL, data access control, and data quality to metadata management and data governance. Excellent independent decision-making capabilities and a solution-oriented attitude. Ability to prioritize own tasks and manage dates.
Nice to have: Advanced knowledge of Google Cloud BigQuery. Profound knowledge of Google Cloud Platform. Passing knowledge of Python + Apache Beam/Google Cloud Dataflow will be a plus.
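Since the listing calls out Apache Beam/Dataflow with BigQuery as a plus, here is a minimal hedged sketch of that combination; the project, bucket, table, and schema are placeholders.

```python
# Minimal Apache Beam pipeline: parse CSV lines from GCS and write to BigQuery.
# Runs on DirectRunner by default; DataflowRunner would be set via pipeline options.
# Project, bucket, and table names are placeholders.
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

def parse_line(line: str) -> dict:
    order_id, amount = line.split(",")
    return {"order_id": order_id, "amount": float(amount)}

with beam.Pipeline(options=PipelineOptions()) as p:
    (
        p
        | "Read" >> beam.io.ReadFromText("gs://example-bucket/orders.csv", skip_header_lines=1)
        | "Parse" >> beam.Map(parse_line)
        | "Write" >> beam.io.WriteToBigQuery(
            "example-project:sales.orders",
            schema="order_id:STRING,amount:FLOAT",
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
        )
    )
```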
Posted 1 month ago
4.0 - 6.0 years
10 - 16 Lacs
Kolkata, Pune, Mumbai (All Areas)
Hybrid
JOB TITLE: Software Developer II - Oracle Data Integrator (ODI)
OVERVIEW OF THE ROLE: We are looking for an experienced Oracle Data Integrator (ODI) and Oracle Analytics Cloud (OAC) Consultant to join our dynamic team. You will be responsible for designing, implementing, and optimizing cutting-edge data integration and analytics solutions. Your contributions will be pivotal in enhancing data-driven decision-making and delivering actionable insights across the organization.
Key Responsibilities: Develop robust data integration solutions using Oracle Data Integrator (ODI). Create, optimize, and maintain ETL/ELT workflows and processes. Configure and manage Oracle Analytics Cloud (OAC) to provide interactive dashboards and advanced analytics. Integrate and transform data from various sources to generate meaningful insights using OAC. Monitor and troubleshoot data pipelines and analytics solutions to ensure optimal performance. Ensure data quality, accuracy, and integrity across integration and reporting systems. Provide training and support to end-users for OAC and ODI solutions. Analyze, design, develop, fix, and debug software programs for commercial or end-user applications. Write code, complete programming, and perform testing and debugging of applications.
Technical Skills: Expertise in ODI components such as Topology, Designer, Operator, and Agent. Experience in Java and WebLogic development. Proficiency in developing OAC dashboards, reports, and KPIs. Strong knowledge of SQL and PL/SQL for advanced data manipulation. Familiarity with Oracle databases and Oracle Cloud Infrastructure (OCI). Experience in data modeling and designing data warehouses. Strong analytical and problem-solving abilities. Excellent communication and client-facing skills. Hands-on, end-to-end DWH implementation experience using ODI. Experience in developing ETL processes - ETL control tables, error logging, auditing, data quality, etc. Able to implement reusability, parameterization, workflow design, etc. Expertise in the Oracle ODI tool set and Oracle PL/SQL; knowledge of the ODI Master and Work repositories. Knowledge of data modelling and ETL design. Setting up Topology, building objects in Designer, monitoring Operator, different types of KMs, Agents, etc. Packaging components and database operations like aggregate, pivot, union, etc. using ODI mappings, error handling, automation using ODI, Load Plans, and migration of objects. Design and develop complex mappings, process flows, and ETL scripts. Experience in performance tuning of mappings. Ability to design ETL unit test cases and debug ETL mappings. Expertise in developing Load Plans and scheduling jobs. Ability to design data quality and reconciliation frameworks using ODI. Integrate ODI with multiple sources/targets. Experience in error recycling/management using ODI and PL/SQL. Expertise in database development (SQL/PL-SQL) for PL/SQL-based applications. Experience in creating PL/SQL packages, procedures, functions, triggers, views, materialized views, and exception handling for retrieving, manipulating, checking, and migrating complex datasets in Oracle. Experience in data migration using SQL*Loader and import/export. Experience in SQL tuning and optimization using explain plans and SQL trace files. Strong knowledge of ELT/ETL concepts, design, and coding. Partitioning and indexing strategy for optimal performance.
Should have experience interacting with customers, understanding business requirement documents, and translating them into ETL specifications and high- and low-level design documents. Ability to work with minimal guidance or supervision in a time-critical environment.
Experience: 4-6 years of overall experience in the industry; 3+ years of experience with Oracle Data Integrator (ODI) in data integration projects; 2+ years of hands-on experience with Oracle Analytics Cloud (OAC).
Preferred Skills: Knowledge of Oracle Autonomous Data Warehouse (ADW) and Oracle Integration Cloud (OIC). Familiarity with other analytics tools like Tableau or Power BI. Experience with scripting languages such as Python or shell scripting (a minimal Python sketch follows this listing). Understanding of data governance and security best practices.
Educational Qualifications: Bachelor's degree in Computer Science, Information Technology, Engineering, or a related field.
ABOUT HASHEDIN: We are software engineers who solve business problems with a Product Mindset for leading global organizations. By combining engineering talent with business insight, we build software and products that can create new enterprise value. The secret to our success is a fast-paced learning environment, an extreme ownership spirit, and a fun culture.
WHY SHOULD YOU JOIN US? With the agility of a start-up and the opportunities of an enterprise, every day at HashedIn, your work will make an impact that matters. So, if you are a problem solver looking to thrive in a dynamic fun culture of inclusion, collaboration, and high performance – HashedIn is the place to be! From learning to leadership, this is your chance to take your software engineering career to the next level. So, what impact will you make? Visit us @ https://hashedin.com
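The listing names Python scripting as a preferred skill alongside the PL/SQL and data-migration work; a minimal sketch of that kind of helper is below, using the python-oracledb driver to run a set-based MERGE. The connection details, schema, and table names are placeholders, and this is only an illustration of the load pattern, not the ODI tooling itself.

```python
# Upsert staged rows into a target table with a single MERGE statement,
# the kind of set-based load an ODI mapping would otherwise generate.
# Connection details, schema, and table names are placeholders.
import oracledb

MERGE_SQL = """
MERGE INTO dw.customers t
USING stg.customers s
   ON (t.customer_id = s.customer_id)
 WHEN MATCHED THEN UPDATE SET t.name = s.name, t.updated_at = s.updated_at
 WHEN NOT MATCHED THEN INSERT (customer_id, name, updated_at)
                       VALUES (s.customer_id, s.name, s.updated_at)
"""

with oracledb.connect(user="etl_user", password="secret",
                      dsn="db-host/ORCLPDB1") as conn:
    with conn.cursor() as cur:
        cur.execute(MERGE_SQL)
        print(f"rows merged: {cur.rowcount}")
    conn.commit()
```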
Posted 1 month ago
6.0 - 8.0 years
15 - 18 Lacs
Bengaluru
Work from Office
Role Overview: For this newly created role, we are seeking a technically proficient Online Data Analyst to join our team. In this role, you will be responsible for extracting, transforming, and loading (ETL) data from various online sources using APIs, managing and querying large datasets with SQL, and ensuring the reliability and performance of our data systems through Azure monitoring and alerts. Your analytical skills will be essential in uncovering actionable insights from complex datasets, and you will work closely with cross-functional teams to support data-driven decision-making.
Key Responsibilities
Data Collection and Management: Extract, transform, and load (ETL) data from various sources into our online applications using tools and processes. Utilize APIs to automate data integration from diverse platforms. Maintain and enhance existing data pipelines, ensuring data integrity and consistency.
Data Analysis: Conduct in-depth data analysis to uncover trends, patterns, and actionable insights. Utilize SQL for querying, managing, and manipulating large datasets. Create and maintain interactive dashboards and reports to present data insights to stakeholders.
Monitoring and Alerts: Implement and manage Azure monitoring and alerting systems to ensure data workflows and applications are functioning optimally. Proactively identify and troubleshoot issues in data processes, ensuring minimal downtime and maximum reliability.
Collaboration and Communication: Collaborate with cross-functional teams including marketing, product development, and IT to understand data needs and provide analytical support. Communicate complex data findings and recommendations to both technical and non-technical audiences. Contribute to continuous improvement of data processes, analytical methodologies, and best practices.
Critical Competencies for Success
Educational Background: Bachelor's degree in Data Science, Statistics, Computer Science, or a related field.
Technical Skills: Proficient in SQL for data querying and manipulation. Experience with ETL processes and tools. Strong understanding of API integration and data automation. Hands-on experience with Azure monitoring and alerting tools. Knowledge of programming languages such as HTML and JavaScript is a plus.
Experience: Proven experience in a data analyst role or similar position. Demonstrated experience with online data sources and web analytics. Experience with cloud platforms, particularly Azure, is required.
Analytical Skills: Strong problem-solving skills and attention to detail. Ability to analyze large datasets and generate meaningful insights. Excellent statistical and analytical capabilities.
Soft Skills: Strong communication and presentation skills. Ability to work independently and collaboratively in a team environment. Good organizational skills and ability to manage multiple projects simultaneously.
Preferred candidate profile: Bachelor's degree in Data Science, Statistics, Computer Science, or a related field. Perks and benefits: As per industry standards.
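A minimal sketch of the API-to-database ETL step this role describes, using requests and SQLAlchemy; the endpoint, token, connection string, response shape, and table name are placeholders.

```python
# Pull a page of records from a REST endpoint and append them to a SQL table.
# Endpoint, token, connection string, response shape, and table are placeholders.
import pandas as pd
import requests
from sqlalchemy import create_engine

API_URL = "https://api.example.com/v1/web-metrics"       # placeholder endpoint
HEADERS = {"Authorization": "Bearer <token>"}             # placeholder token

engine = create_engine("mssql+pyodbc://user:pass@analytics-dsn")  # placeholder DSN

resp = requests.get(API_URL, headers=HEADERS, params={"date": "2024-01-01"}, timeout=60)
resp.raise_for_status()
records = resp.json()["results"]                          # assumed response shape

df = pd.DataFrame.from_records(records)
df.to_sql("web_metrics_daily", engine, schema="stg", if_exists="append", index=False)
print(f"loaded {len(df)} rows")
```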
Posted 1 month ago
3.0 - 6.0 years
7 - 15 Lacs
Gurugram
Work from Office
Dear Candidate, Greetings!! Hiring for SSIS Developer - Gurgaon (WFO). Responsibilities: 1. Must have experience building SSIS packages for ETL processes. 2. End-to-end data migration. 3. Must have experience in Oracle Cloud. Share your resume at abhishek@xinoe.com. Regards,
Posted 1 month ago
8.0 - 12.0 years
10 - 14 Lacs
Pune
Work from Office
Job Summary: This position provides input, support, and performs full systems life cycle management activities (e.g., analyses, technical requirements, design, coding, testing, implementation of systems and applications software, etc.). He/She participates in component and data architecture design, technology planning, and testing for Applications Development (AD) initiatives to meet business requirements. This position provides input to applications development project plans and integrations. He/She collaborates with teams and supports emerging technologies to ensure effective communication and achievement of objectives. This position provides knowledge and support for applications development, integration, and maintenance. He/She provides input to department and project teams on decisions supporting projects.
Responsibilities: Full stack developer with Java, Oracle, and Angular. DevOps and Agile project management experience is a plus. Plans, develops, and manages the organization's information software, applications, systems, and networks. Application containerization (Kubernetes, Red Hat OpenShift). Experience with public cloud (e.g., Google, Azure). Performs systems analysis and design. Designs and develops moderate to highly complex applications. Develops application documentation. Produces integration builds. Performs maintenance and support. Supports emerging technologies and products. Ensures UPS's business needs are met through continual upgrades and development of new technical solutions.
Qualifications: 8-12 years of experience. Bachelor's Degree or international equivalent.
Employee Type:
Posted 1 month ago
5.0 - 8.0 years
7 - 12 Lacs
Hyderabad, Pune, Bengaluru
Work from Office
About KPI Partners: KPI Partners is a leading provider of technology consulting and solutions, specializing in delivering high-quality services that enable organizations to optimize their operations and achieve their strategic objectives. We are committed to empowering businesses through innovative solutions and a strong focus on customer satisfaction.
Job Description: We are seeking an experienced and detail-oriented ODI Developer to join our dynamic team. The ideal candidate will have a strong background in Oracle Data Integration and ETL processes, possess excellent problem-solving skills, and demonstrate the ability to work collaboratively within a team environment. As an ODI Developer at KPI Partners, you will play a crucial role in designing, implementing, and maintaining data integration solutions that support our clients' analytics and reporting needs.
Key Responsibilities:
- Design, develop, and implement data integration processes using Oracle Data Integrator (ODI) to extract, transform, and load (ETL) data from various sources.
- Collaborate with business analysts and stakeholders to understand data requirements and translate them into technical specifications.
- Optimize ODI processes and workflows for performance improvements and ensure data quality and accuracy.
- Troubleshoot and resolve technical issues related to ODI and data integration processes.
- Maintain documentation related to data integration processes, including design specifications, integration mappings, and workflows.
- Participate in code reviews and ensure adherence to best practices in ETL development.
- Stay updated with the latest developments in ODI and related technologies to continuously improve solutions.
- Support production deployments and provide maintenance and enhancements as needed.
Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- Proven experience as an ODI Developer or in a similar ETL development role.
- Strong knowledge of Oracle Data Integrator and its components (repositories, models, mappings, etc.).
- Proficient in SQL and PL/SQL for querying and manipulating data.
- Experience with data warehousing concepts and best practices.
- Familiarity with other ETL tools is a plus.
- Excellent analytical and troubleshooting skills.
- Strong communication skills, both verbal and written.
- Ability to work independently and in a team-oriented environment.
Why Join KPI Partners?
- Opportunity to work with a talented and diverse team on cutting-edge projects.
- Competitive salary and comprehensive benefits package.
- Continuous learning and professional development opportunities.
- A culture that values innovative thinking and encourages collaboration.
KPI Partners is an equal opportunity employer. We celebrate diversity and are committed to creating an inclusive environment for all employees.
Posted 1 month ago
4.0 - 6.0 years
8 - 12 Lacs
Bengaluru
Remote
JOB DESCRIPTION
JOB TITLE: Software Engineer I
REPORTS TO: Lead Software Engineer
POSITION SUMMARY: The Software Engineer will be responsible for developing and maintaining ETL integrations with telecom carriers, ensuring the seamless exchange of billing data, service orders, and inventory updates. The ideal candidate will bring knowledge of EDI and other electronic data standards, a strong understanding of telecom billing formats, and experience working with TEM platforms. Working knowledge of Java for integration scripting and backend automation is a plus.
ESSENTIAL FUNCTIONS: Design, develop, and maintain ETL integrations between telecom carriers, TEM platforms, and internal systems. Utilize ETL tools and techniques to extract, transform, and load data from EDI and other electronic-format transactions into internal databases and applications, ensuring data accuracy and consistency across systems. Map and transform different electronic formats to meet both internal and partner requirements. Troubleshoot and resolve electronic transformation issues, ensuring timely and accurate data processing. Troubleshoot and resolve electronic data transmission issues, ensuring timely and accurate data exchange. Collaborate with TEM vendors and telecom providers to onboard new carriers and maintain data quality. Support end-to-end invoice processing workflows, from data ingestion to system reconciliation. Document technical specifications, mapping guidelines, and electronic-format process flows. Monitor EDI and other electronic-format system performance and proactively resolve issues or errors. Work cross-functionally with IT, telecom operations, and other teams to implement and optimize the processes. Identify automation opportunities to improve workflow efficiency and data accuracy. Stay current on telecom industry trends and billing standards.
REQUIREMENTS: 4+ years of experience in SQL, preferably on Oracle, or other database querying skills for data validation and troubleshooting. Familiarity with PL/SQL or T-SQL. Solid experience on Unix, including basic shell scripting. User-level experience on Linux and Microsoft operating systems. Prior experience in telecom billing formats and invoice data structures, or other financial experience. Experience using ticket management systems (e.g., JIRA, ServiceNow). Diagnostic knowledge of FTP/SFTP for secure file transfers and batch job automation. Strong troubleshooting, analytical, and documentation skills. Good organizational skills. Ability to manage complex activities with a high level of detail. A well-organized and self-directed individual who can work with minimal supervision. Must be a quick learner of new technologies and adaptable to change.
Good to Have: Familiarity with Java for backend integrations, data processing, or EDI and other electronic-format middleware enhancements is a plus. Ability to independently identify, research, and resolve issues. Ability to multi-task continuously. Extreme attention to detail and accuracy. Ability to effectively communicate with all levels within the organization.
ADDITIONAL RESPONSIBILITIES: Performs all other related duties as required or directed. Follows all safety rules and regulations. Works assigned hours as required.
Posted 1 month ago
8.0 - 13.0 years
25 - 40 Lacs
Bengaluru
Work from Office
Must-Have Skills:
- Azure Databricks / PySpark (hands-on)
- SQL/PL-SQL (advanced level)
- Snowflake – 2+ years
- Spark / data pipeline development – 2+ years
- Azure Repos / GitHub, Azure DevOps
- Unix shell scripting
- Cloud technology experience
Key Responsibilities:
1. Design, build, and manage data pipelines using Azure Databricks, PySpark, and Snowflake.
2. Analyze and resolve production issues (Tier 2 support with weekend/on-call rotation).
3. Write and optimize complex SQL/PL-SQL queries.
4. Collaborate on low-level and high-level design for data solutions.
5. Document all project deliverables and support deployment.
Good to Have: Knowledge of Oracle, Qlik Replicate, GoldenGate, Hadoop. Job scheduler tools like Control-M or Airflow.
Behavioral: Strong problem-solving and communication skills.
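A minimal sketch of the Databricks-to-Snowflake handoff named in the skills list, using the Snowflake Spark connector available on Databricks; the account, credentials, warehouse, and table names are placeholders, and `spark` is assumed to come from the runtime.

```python
# Notebook-style cell: read a curated Delta table and push it to Snowflake.
# All connection options below are placeholders; in practice the password would
# come from a secret scope. The `spark` session is provided by Databricks.
sf_options = {
    "sfUrl": "example_account.snowflakecomputing.com",
    "sfUser": "etl_user",
    "sfPassword": "secret",
    "sfDatabase": "ANALYTICS",
    "sfSchema": "CURATED",
    "sfWarehouse": "LOAD_WH",
}

orders = spark.read.table("analytics.orders_curated")   # hypothetical Delta table

(
    orders.write.format("snowflake")
    .options(**sf_options)
    .option("dbtable", "ORDERS_CURATED")
    .mode("overwrite")
    .save()
)
```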
Posted 1 month ago