5.0 - 7.0 years
5 - 9 Lacs
Bengaluru
Work from Office
Please carefully review the position requirements before submitting a potential candidate for consideration.

Role Summary: The Qlik Sense developer will build dashboards and reports by consuming data from Snowflake, delivering clear visualizations that help end users understand the data and increase product adoption.

Roles and Responsibilities:
- Requirement gathering
- Managing all Qlik-related activities
- Managing and following up on all Qlik-related access requests
- Managing and operating ServiceNow tickets (Incidents, Requests)
- Supporting Qlik and NPrinting development
- Managing all Qlik production movements
- Monitoring the daily Qlik jobs and NPrinting reports
- Supporting month-start activities
- Providing Qlik training to Qlik end users
- Supporting governance activities every month
- Preparing Qlik-related support documents

Role-Specific Competencies:
- Good data modeling skills
- Dashboard development
- Good visualization skills
- Admin-related activities
- Knowledge of other BI and visualization tools (e.g. Tableau, Power BI) (optional)
- Strong analytical and problem-solving skills
- Data analysis skills

Experience: 6 to 8 years

For additional details regarding submission eligibility and payment terms, please refer to your contract. Only submissions from agencies with current service contracts in place will be considered.
Posted 1 week ago
3.0 - 6.0 years
0 - 0 Lacs
Chennai
Work from Office
Who we are looking for:
We are seeking a skilled and motivated engineer to join our Data Infrastructure team. The Data Infrastructure engineering team is responsible for the tools and backend infrastructure that support our data platform, optimizing for performance, scalability, and reliability. This role requires strong focus and experience in multi-cloud technologies, message bus systems, automated deployments using containerized applications, design, development, database management and performance, SOX compliance requirements, and implementation of infrastructure through Terraform automation, continuous delivery, and batch-oriented workflows.

As a Data Infrastructure Engineer at ACV Auctions, you will work alongside and mentor software and production engineers in the development of solutions to ACV's most complex data and software problems. You will operate in a high-performing team, balancing high-quality deliverables with customer focus, and act as a technical liaison with excellent communication skills, the desire and ability to mentor and guide engineers, and a record of delivering results in a fast-paced environment.

What you will be doing:
- Collaborate with cross-functional teams, including Data Scientists, Software Engineers, Data Engineers, and Data Analysts, to understand data requirements and translate them into technical specifications.
- Influence company-wide engineering standards for databases, tooling, languages, and build systems.
- Design, implement, and maintain scalable and high-performance data infrastructure solutions, with a primary focus on data.
- Design, implement, and maintain tools and best practices for (but not limited to) access control, data versioning, database management, and migration strategies.
- Contribute to, influence, and set standards for all technical aspects of a product or service including, but not limited to, coding, testing, debugging, performance, languages, database selection, management, and deployment.
- Identify and troubleshoot database/system issues and bottlenecks, working closely with the engineering team to implement effective solutions.
- Write clean, maintainable, well-commented code and automation to support our data infrastructure layer.
- Perform code reviews, develop high-quality documentation, and build robust test suites for your products.
- Provide technical support for databases, including troubleshooting, performance tuning, and resolving complex issues.
- Collaborate with software and DevOps engineers to design scalable services, plan feature roll-outs, and ensure high reliability and performance of our products.
- Collaborate with development teams and data science teams to design and optimize database schemas, queries, and stored procedures for maximum efficiency.
- Participate in SOX audits, including creation of standards and reproducible audit evidence through automation.
- Create and maintain documentation for database and system configurations, procedures, and troubleshooting guides.
- Maintain and extend (as required) existing database operations solutions for backups, index defragmentation, data retention, etc.
- Respond to and troubleshoot highly complex problems quickly, efficiently, and effectively.
- Be accountable for the overall performance of products and/or services within a defined area of focus.
- Be part of the on-call rotation.
- Handle multiple competing priorities in an agile, fast-paced environment.
- Perform additional duties as assigned.

What you need:
- Bachelor's degree in Computer Science, Information Technology, or a related field (or equivalent work experience)
- Ability to read, write, speak, and understand English
- Strong communication and collaboration skills, with the ability to work effectively in a fast-paced global team environment
- 1+ years of experience architecting, developing, and delivering software products with an emphasis on the data infrastructure layer
- 1+ years of experience with continuous integration and build tools
- 1+ years of experience programming in Python
- 1+ years of experience with cloud platforms, preferably GCP/AWS
- Knowledge of day-to-day tools and how they work, including deployments, k8s, monitoring systems, and testing tools
- Knowledge of version control systems, including trunk-based development, multiple release planning, cherry-picking, and rebasing
- Hands-on skills and the ability to drill deep into complex system design and implementation

Experience with:
- DevOps practices and tools for database automation and infrastructure provisioning
- Programming in Python and SQL
- GitHub, Jenkins
- Infrastructure-as-code tooling, such as Terraform (preferred)
- Big data technologies and distributed databases

Nice to Have Qualifications:
- Experience with NoSQL data stores
- Airflow, Docker, containers, Kubernetes, Datadog, Fivetran
- Database monitoring and diagnostic tools, preferably Datadog
- Database management/administration with PostgreSQL, MySQL, Dynamo, Mongo
- GCP/BigQuery, Confluent Kafka
- Using and integrating with cloud services, specifically AWS RDS, Aurora, S3, GCP
- Service-Oriented Architecture/microservices and event sourcing in a platform like Kafka (preferred)
- Familiarity with DevOps practices and tools for automation and infrastructure provisioning
- Hands-on experience with SOX compliance requirements
- Knowledge of data warehousing concepts and technologies, including dimensional modeling and ETL frameworks
- Knowledge of database design principles, data modeling, architecture, infrastructure, security principles, best practices, performance tuning, and optimization techniques

#LI-AM1
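As a rough illustration of the backup and reproducible-audit-evidence automation this role describes, the following Python sketch uses boto3 to create a tagged RDS snapshot. The instance identifier, region, and tags are assumed placeholders, not details from the posting.

```python
# Illustrative sketch only: automated, tagged RDS snapshots of the kind the
# posting's database-operations and SOX-audit-evidence duties imply.
import datetime

import boto3

rds = boto3.client("rds", region_name="us-east-1")  # assumed region


def snapshot_instance(instance_id: str) -> str:
    """Create a tagged manual snapshot and return its identifier."""
    stamp = datetime.datetime.utcnow().strftime("%Y%m%d%H%M")
    snapshot_id = f"{instance_id}-audit-{stamp}"
    rds.create_db_snapshot(
        DBSnapshotIdentifier=snapshot_id,
        DBInstanceIdentifier=instance_id,
        Tags=[{"Key": "purpose", "Value": "sox-audit-evidence"}],  # assumed tag
    )
    return snapshot_id


if __name__ == "__main__":
    print(snapshot_instance("example-postgres-prod"))  # hypothetical instance name
```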
Posted 1 week ago
10.0 - 15.0 years
0 - 0 Lacs
Chennai
Work from Office
Who we are looking for:
We are seeking a skilled and motivated engineer to join our Data Infrastructure team. The Data Infrastructure engineering team is responsible for the tools and backend infrastructure that support our data platform, optimizing for performance, scalability, and reliability. This role requires strong focus and experience in multi-cloud technologies, message bus systems, automated deployments using containerized applications, design, development, database management and performance, SOX compliance requirements, and implementation of infrastructure through Terraform automation, continuous delivery, and batch-oriented workflows.

As a Data Infrastructure Engineer at ACV Auctions, you will work alongside and mentor software and production engineers in the development of solutions to ACV's most complex data and software problems. You will operate in a high-performing team, balancing high-quality deliverables with customer focus, and act as a technical liaison with excellent communication skills, the desire and ability to mentor and guide engineers, and a record of delivering results in a fast-paced environment.

What you will be doing:
- Collaborate with cross-functional teams, including Data Scientists, Software Engineers, Data Engineers, and Data Analysts, to understand data requirements and translate them into technical specifications.
- Influence company-wide engineering standards for databases, tooling, languages, and build systems.
- Design, implement, and maintain scalable and high-performance data infrastructure solutions, with a primary focus on data.
- Design, implement, and maintain tools and best practices for (but not limited to) access control, data versioning, database management, and migration strategies.
- Contribute to, influence, and set standards for all technical aspects of a product or service including, but not limited to, coding, testing, debugging, performance, languages, database selection, management, and deployment.
- Identify and troubleshoot database/system issues and bottlenecks, working closely with the engineering team to implement effective solutions.
- Write clean, maintainable, well-commented code and automation to support our data infrastructure layer.
- Perform code reviews, develop high-quality documentation, and build robust test suites for your products.
- Provide technical support for databases, including troubleshooting, performance tuning, and resolving complex issues.
- Collaborate with software and DevOps engineers to design scalable services, plan feature roll-outs, and ensure high reliability and performance of our products.
- Collaborate with development teams and data science teams to design and optimize database schemas, queries, and stored procedures for maximum efficiency.
- Participate in SOX audits, including creation of standards and reproducible audit evidence through automation.
- Create and maintain documentation for database and system configurations, procedures, and troubleshooting guides.
- Maintain and extend (as required) existing database operations solutions for backups, index defragmentation, data retention, etc.
- Respond to and troubleshoot highly complex problems quickly, efficiently, and effectively.
- Be accountable for the overall performance of products and/or services within a defined area of focus.
- Be part of the on-call rotation.
- Handle multiple competing priorities in an agile, fast-paced environment.
- Perform additional duties as assigned.

What you need:
- Bachelor's degree in Computer Science, Information Technology, or a related field (or equivalent work experience)
- Ability to read, write, speak, and understand English
- Strong communication and collaboration skills, with the ability to work effectively in a fast-paced global team environment
- 1+ years of experience architecting, developing, and delivering software products with an emphasis on the data infrastructure layer
- 1+ years of experience with continuous integration and build tools
- 1+ years of experience programming in Python
- 1+ years of experience with cloud platforms, preferably GCP/AWS
- Knowledge of day-to-day tools and how they work, including deployments, k8s, monitoring systems, and testing tools
- Knowledge of version control systems, including trunk-based development, multiple release planning, cherry-picking, and rebasing
- Hands-on skills and the ability to drill deep into complex system design and implementation

Experience with:
- DevOps practices and tools for database automation and infrastructure provisioning
- Programming in Python and SQL
- GitHub, Jenkins
- Infrastructure-as-code tooling, such as Terraform (preferred)
- Big data technologies and distributed databases

Nice to Have Qualifications:
- Experience with NoSQL data stores
- Airflow, Docker, containers, Kubernetes, Datadog, Fivetran
- Database monitoring and diagnostic tools, preferably Datadog
- Database management/administration with PostgreSQL, MySQL, Dynamo, Mongo
- GCP/BigQuery, Confluent Kafka
- Using and integrating with cloud services, specifically AWS RDS, Aurora, S3, GCP
- Service-Oriented Architecture/microservices and event sourcing in a platform like Kafka (preferred)
- Familiarity with DevOps practices and tools for automation and infrastructure provisioning
- Hands-on experience with SOX compliance requirements
- Knowledge of data warehousing concepts and technologies, including dimensional modeling and ETL frameworks
- Knowledge of database design principles, data modeling, architecture, infrastructure, security principles, best practices, performance tuning, and optimization techniques

#LI-AM1
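To give a flavour of the routine database-operations work listed above (backups, retention, daily jobs), here is a minimal Airflow sketch; Airflow is only a nice-to-have in the posting, and the DAG name, schedule, and task body are assumptions rather than anything it specifies. It assumes Airflow 2.x.

```python
# A minimal sketch, assuming Airflow 2.x: a daily DAG that runs a routine
# database maintenance task. The task body is a hypothetical placeholder.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def run_retention_cleanup():
    # Placeholder: delete expired rows / rotate partitions via your DB client.
    print("running retention cleanup")


with DAG(
    dag_id="nightly_db_maintenance",  # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    PythonOperator(
        task_id="retention_cleanup",
        python_callable=run_retention_cleanup,
    )
```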
Posted 1 week ago
8.0 - 12.0 years
0 Lacs
hyderabad, telangana
On-site
As a Data Engineer at our organization, you will be responsible for designing, implementing, and maintaining data pipelines and data integration solutions using Azure Synapse. Your role will involve developing and optimizing data models and data storage solutions on Azure. You will collaborate closely with data scientists and analysts to implement data processing and data transformation tasks. Ensuring data quality and integrity through data validation and cleansing methodologies will be a key aspect of your responsibilities.

Your duties will also include monitoring and troubleshooting data pipelines to identify and resolve performance issues promptly. Collaboration with cross-functional teams to understand and prioritize data requirements will be essential. It is expected that you stay up-to-date with the latest trends and technologies in data engineering and Azure services to contribute effectively to the team.

To be successful in this role, you are required to possess a Bachelor's degree in IT, computer science, computer engineering, or a related field, along with a minimum of 8 years of experience in Data Engineering. Proficiency in Microsoft Azure Synapse Analytics is crucial, including experience with Azure Data Factory, Dedicated SQL Pool, Lake Database, and Azure Storage. Hands-on experience in Spark notebooks (Python or Scala) is mandatory for this position.

Your expertise should also cover end-to-end Data Warehouse experience, including ingestion, ETL, big data pipelines, data architecture, message queuing, BI/Reporting, and data security. Advanced SQL and relational database knowledge, as well as demonstrated experience in designing and delivering data platforms for Business Intelligence and Data Warehouse, are required skills. Strong analytical abilities to handle and analyze complex, high-volume data with attention to detail are essential. Familiarity with data modeling and data warehousing concepts such as Data Vault or 3NF, along with experience in Data Governance (quality, lineage, data dictionary, and security), is preferred. Knowledge of Agile methodology and working environment is beneficial for this role. You should also exhibit the ability to work independently with Product Owners, Business Analysts, and Architects.

Join us at NTT DATA Business Solutions, where we empower you to transform SAP solutions into value. If you have any questions regarding this job opportunity, please reach out to our Recruiter, Pragya Kalra, at Pragya.Kalra@nttdata.com.
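As a rough sketch of the Spark-notebook work this role calls for, the snippet below shows a simple cleanse-and-load step into a lake database table. It assumes a Synapse or Databricks-style Spark environment where storage access is already configured; the storage path, column names, and target table are hypothetical, not details from the posting.

```python
# A minimal sketch, assuming a Synapse/Databricks-style Spark environment.
# The storage path, column names, and target table are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()  # in a notebook, `spark` already exists

raw = spark.read.parquet("abfss://raw@examplelake.dfs.core.windows.net/orders/")

cleaned = (
    raw.dropDuplicates(["order_id"])               # basic cleansing
       .filter(F.col("order_amount").isNotNull())  # simple validation rule
       .withColumn("load_date", F.current_date())  # audit column
)

# Persist to a curated lake database table for downstream BI and analysts.
cleaned.write.mode("overwrite").saveAsTable("curated.orders")
```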
Posted 1 week ago
6.0 - 10.0 years
0 Lacs
pune, maharashtra
On-site
Myers-Holum is expanding operations to India and is actively seeking experienced Principal Consultants with strong data warehousing and business intelligence experience to play a pivotal role in the expansion of the India Practice.

As a Principal Consultant at Myers-Holum, you will be responsible for designing, building, and testing custom data warehouse and BI solutions for clients using the Oracle NetSuite Analytics Warehouse (NSAW) platform. Your role will involve translating business requirements into technical specifications, developing custom data integration solutions to ingest data from multiple sources, and leading the product implementation from start to finish. As a trusted advisor, you will interact with stakeholders and end users, conduct product training, and run technical workshops to ensure successful implementations. In addition to client-facing responsibilities, you will contribute to internal MHI initiatives such as resource mentorship and ongoing education, as well as implementing NSAW Best Practices.

To excel in this role, you should possess 10+ years of relevant professional experience, with at least 6 years in data management using Oracle or other relational databases. You should also have a solid background in data warehouse and data integration projects, proficiency in cloud or on-premise data integration tools, and experience with SQL and BI tools such as Oracle OBIEE, OAC, Looker, Power BI, QlikView, or Tableau. Knowledge of ERP data and business processes, along with strong business analysis and communication skills, are essential for this position.

At Myers-Holum, we value collaboration, innovation, and personal growth. As an MHIer, you will have the opportunity to work with a diverse team of professionals and shape your future while positively influencing change for our customers. With over 40 years of experience, Myers-Holum offers stability and growth opportunities, with a global presence and a strong network of technology partners. Our company culture emphasizes curiosity, humility, and resilience, as we strive for continuous improvement and meaningful growth.

Join us at Myers-Holum and discover a rewarding career path with access to training, certification support, career advancement opportunities, and comprehensive health benefits. We offer a remote working option, a supportive work environment, and a structured interview process designed to showcase your strengths and align your career aspirations with our organization's goals. Expect a collaborative recruitment experience with timely feedback and flexible timelines to accommodate your needs. If you are ready to embark on a fulfilling journey with Myers-Holum, apply now and become part of our dynamic team dedicated to driving innovation and excellence in data warehousing and business intelligence.
Posted 1 week ago
3.0 - 7.0 years
0 Lacs
karnataka
On-site
As a Manager at Autodesk, you will lead the BI and Data Engineering Team to develop and implement business intelligence solutions. Your role is crucial in empowering decision-makers through trusted data assets and scalable self-serve analytics. You will oversee the design, development, and maintenance of data pipelines, databases, and BI tools to support data-driven decision-making across the CTS organization.

Reporting to the leader of the CTS Business Effectiveness department, you will collaborate with stakeholders to define data requirements and objectives. Your responsibilities will include leading and managing a team of data engineers and BI developers, fostering a collaborative team culture, managing data warehouse plans, ensuring data quality, and delivering impactful dashboards and data visualizations. You will also collaborate with stakeholders to translate technical designs into business-appropriate representations, analyze business needs, and create data tools for analytics and BI teams. Staying up to date with data engineering best practices and technologies is essential to ensure the company remains ahead of the industry.

To qualify for this role, you should have 3 to 5 years of experience managing data teams and a BA/BS in Data Science, Computer Science, Statistics, Mathematics, or a related field. Proficiency in Snowflake, Python, SQL, Airflow, Git, and big data environments like Hive, Spark, and Presto is required. Experience with workflow management, data transformation tools, and version control systems is preferred. Additionally, familiarity with Power BI, the AWS environment, Salesforce, and remote team collaboration is advantageous. The ideal candidate is a data ninja and leader who can derive insights from disparate datasets, understand Customer Success, tell compelling stories using data, and engage business leaders effectively.

At Autodesk, we are committed to creating a culture where everyone can thrive and realize their potential. Our values and ways of working help our people succeed, leading to better outcomes for our customers. If you are passionate about shaping the future and making a meaningful impact, join us in our mission to turn innovative ideas into reality.

Autodesk offers a competitive compensation package based on experience and location. In addition to base salaries, we provide discretionary annual cash bonuses, commissions, stock grants, and a comprehensive benefits package. If you are interested in a sales career at Autodesk or want to learn more about our commitment to diversity and belonging, please visit our website for more information.
Posted 1 week ago
10.0 - 16.0 years
0 Lacs
hyderabad, telangana
On-site
As a SAP Data Migration Consultant at Syniti, you will play a crucial role in SAP Implementation projects by managing various data migration activities. Your responsibilities will include data analysis, reporting, conversion, harmonization, and business-process analysis using SAP and other Enterprise Data Migration Tools.

To excel in this role, you must have a strong background in SAP and be an expert in specific business-process areas. You will be actively involved in data migration activities for a specific process thread, engaging with client Subject Matter Experts (SMEs) and Business-Process Experts. Familiarity with the onsite-offshore delivery model is essential for success in this position. The physical demands of this role are limited to office routines, with occasional travel required to various locations across regions.

Qualifications:
- 11-16 years of SAP Techno-Functional or Functional experience, including involvement in 3+ full SAP implementation lifecycles
- Expertise in business-process knowledge related to SAP functional modules such as FI, CO, MM, SD, PM, PP, PS
- Over 10 years of experience in IT projects
- Proficiency in BackOffice CranSoft/DSP/SAP Data Services/other Data Migration tools
- Extensive experience in data quality, data migration, data warehousing, data analysis, and conversion planning
- 5 to 7 years of Business-Process experience
- Bachelor's degree in Business, Engineering, Computer Science, or related disciplines, or equivalent experience
- Proficiency in Microsoft SQL, including SQL query skills and understanding of relational databases

Job Responsibilities:
- Conduct expert-level Business Analysis on SAP modules such as FI, CO, MM, SD, PM, PP, PS
- Lead and guide the team based on project requirements, ensuring client needs are met
- Communicate effectively with onsite teams and client personnel
- Facilitate blueprint sessions with onsite/client teams
- Develop and maintain the SAP Data Migration plan, Integration plan, and Cutover plan
- Perform SAP Data Extraction, Transformation, and Loading
- Implement change management and defect management processes
- Document all relevant activities
- Train new team members on SAP Migration Toolsets

If you are looking to leverage your SAP expertise and contribute to impactful data migration projects, this role at Syniti offers a dynamic opportunity to excel in a collaborative and innovative environment.
Posted 1 week ago
5.0 - 9.0 years
9 - 12 Lacs
Gurugram, Bengaluru
Work from Office
SQL Developer with strong knowledge of SQL programming and commands: querying, aggregating, joining, and manipulating data. Experience with ETL, data transformation, data warehousing, data manipulation, and data visualisation.
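As a small illustration of the querying, aggregation, and joining skills asked for here, the sketch below runs a join-plus-aggregate query against an in-memory SQLite database; the tables and columns are invented for the example.

```python
# Self-contained sketch: join two tables and aggregate per region using SQLite.
# Table and column names are illustrative only.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE customers (id INTEGER PRIMARY KEY, region TEXT);
    CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, amount REAL);
    INSERT INTO customers VALUES (1, 'South'), (2, 'North');
    INSERT INTO orders VALUES (10, 1, 120.0), (11, 1, 80.0), (12, 2, 200.0);
""")

# Join + aggregate: total order value and order count per region.
rows = conn.execute("""
    SELECT c.region, SUM(o.amount) AS total_amount, COUNT(*) AS order_count
    FROM orders o
    JOIN customers c ON c.id = o.customer_id
    GROUP BY c.region
    ORDER BY total_amount DESC
""").fetchall()

for region, total, count in rows:
    print(region, total, count)
```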
Posted 1 week ago
1.0 - 3.0 years
3 - 6 Lacs
Bengaluru
Work from Office
[{"Salary":"4500000" , "Remote_Job":false , "Posting_Title":"Data Story Teller" , "Is_Locked":false , "City":"Bangalore" , "Industry":"Technology" , "Job_Description":" Were looking for a **Data Storyteller/UX Analyst**who can bridge the gap between data and decision-making. Youll work withanalysts, business leaders, and product teams to craft compelling stories fromcomplex data, enabling smarter decisions across the organization by providingactionable insights. Eligibility Criteria: Years of Experience : Minimum 14 years Job Experience: o Experience with Data Analysis/Data Profiling, Visualization tools ( Power BI) o Experience in Database and Data warehousetech (Azure Synapse/ SQL Server/SAPHANA/MS fabric) o Experience in Stakeholdermanagement / requirement gathering/delivery cycle. Educational : o BachelorDegree: Math/Statistics/Operations Research/ComputerScience o MasterDegree : BusinessAnalytics (with a background in Computer Science) Primary Responsibilities: -Translate complex data analyses into clear, engaging narratives tailored todiverse audiences. - Develop impactful data visualizations and dashboards using tools like PowerBI or Tableau. - Educateand Mentor team to develop the insightful dashboards by using multiple DataStory telling methodologies. - Collaborate with Data Analysts, Data Scientists, Business Analysts and Businessstakeholders to uncover insights. - Understand business goals and align analytics storytelling to drive strategicactions. - Create presentations, reports, and visual content to communicate insightseffectively. - Maintain consistency in data communication and ensure data-drivenstorytelling best practices. Mandatory Skills required to perform the job: Data Analysisskills, experience in extracting information from databases, Office 365 Professional and Proven Data Storytellerthrough BI Experience in Agile/SCRUMprocess and development using any tools. Knowledge of SAPsystems (SAP ECC T-Codes & Navigation) Proven abilityto tell stories with data, combining analytical rigor with creativity. Strong skills indata visualization tools (eg, Tableau, Power BI) and presentation tools(eg, PowerPoint, Google Slides). Proficiency inSQL and basic understanding of statistical methods or Python/R is a plus. Excellentcommunication and collaboration skills. Ability todistill complex information into easy-to-understand formats. Desirable Skills: Background injournalism, design, UX, or marketing alongside analytics. Experienceworking in fast-paced, cross-functional teams. Familiarity withdata storytelling frameworks or narrative design. ","Job_Type":"Full time","Job_Opening_Name":"Data Story Teller" , "State":"Karnataka" , "Country":"India" , "Zip_Code":"560048" , "id":"153957000004621904" , "Publish":true , "Date_Opened":"2025-07-22" , "Keep_on_Career_Site":false}]
Posted 1 week ago
2.0 - 5.0 years
12 - 13 Lacs
Noida
Work from Office
Department: Emergency Response / Trauma Care Coordination
Location: Central Command Centre, NHAI HQ or Designated Regional Centre
Job Type: Full-time / Contractual (based on project)

Job Purpose: To provide medical expertise and support for managing trauma care coordination across the National Highways network. The role involves real-time monitoring, triage support, emergency coordination with ambulances and hospitals, and supporting the implementation of NHAI's trauma care response system.

Key Responsibilities:
Command Centre Operations: Monitor and manage real-time data from highway accident alert systems. Coordinate with ambulance networks, local hospitals, and traffic police for emergency response. Ensure appropriate triage and patient routing to the nearest suitable medical facility.
Medical Triage and Advisory: Provide initial medical triage over calls or the software dashboard. Support ambulance staff or first responders with medical guidance, if required.
Data & Incident Management: Maintain records of incidents, response times, patient status, and follow-up outcomes. Identify patterns in accident data and provide input for preventive strategies.
Coordination & Liaison: Coordinate with state health departments, AIIMS trauma centers, district hospitals, and NHAI field staff. Support the implementation of Standard Operating Procedures (SOPs) for trauma response.
Training & Capacity Building: Train and support call center executives and ambulance staff in basic trauma protocols. Assist in simulation drills and mock exercises.

Qualifications:
Essential: MBBS degree from a recognized institution. Valid registration with the Medical Council of India (MCI) or a State Medical Council.
Desirable: Experience in Emergency Medicine / Trauma Care / ICU. Certification in Basic Life Support (BLS) / Advanced Trauma Life Support (ATLS) preferred.

Experience: Minimum 2–5 years of clinical experience, preferably in emergency services or trauma care settings. Experience working in a command/control center or telemedicine setup is an advantage.
Posted 1 week ago
4.0 - 8.0 years
15 - 25 Lacs
Hyderabad
Work from Office
Job Description: We are looking for a Data Engineer with strong hands-on experience in ETL, cloud data platforms, and scripting to work on scalable data integration solutions.

Mandatory Skills:
- SQL – Strong expertise in writing optimized queries and procedures.
- Data Warehousing (DWH) – Good understanding of data modeling and warehouse architecture.
- Shell scripting or Python – For automation and custom transformation logic.
- ETL Tool – Experience with any ETL tool (Talend/Informatica/DataStage etc.).
- DataBricks – Used for data transformation and processing.
- Azure Data Factory (ADF) – Designing and orchestrating data pipelines.

Good to Have:
- Snowflake – For implementing scalable cloud data warehousing solutions.
- Azure Ecosystem – General familiarity with Azure services including Data Lake and Storage.

Responsibilities:
- Build and maintain scalable ETL pipelines using Talend, ADF, and DataBricks.
- Extract, transform, and load data from multiple source systems into Snowflake and/or Azure Data Lake.
- Interpret technical and functional designs and implement them effectively in the data pipeline.
- Collaborate with teams to ensure high data quality and performance.
- Support and guide ETL developers in resolving technical challenges and implementing best practices.
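As an illustrative sketch of the "Shell scripting or Python for automation and custom transformation logic" requirement, the snippet below reads a CSV extract, applies a couple of simple transformations, and writes a staging file. The file paths and column names are assumptions, not details from the posting.

```python
# A minimal sketch of custom transformation logic in plain Python.
# Input/output paths and columns are hypothetical placeholders.
import csv
from pathlib import Path

SRC = Path("extracts/sales_raw.csv")    # assumed source extract
DST = Path("staging/sales_clean.csv")   # assumed staging output


def transform(row: dict) -> dict:
    row["amount"] = f"{round(float(row['amount']), 2):.2f}"   # normalise numerics
    row["country"] = row["country"].strip().upper()           # standardise codes
    return row


DST.parent.mkdir(parents=True, exist_ok=True)
with SRC.open(newline="") as fin, DST.open("w", newline="") as fout:
    reader = csv.DictReader(fin)
    writer = csv.DictWriter(fout, fieldnames=reader.fieldnames)
    writer.writeheader()
    for record in reader:
        writer.writerow(transform(record))
```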
Posted 1 week ago
8.0 - 10.0 years
20 - 30 Lacs
Hyderabad, Pune, Bengaluru
Hybrid
Role - ODI Developer
Exp - 6 to 8 Yrs
Position - Permanent FTE
Loc - Hyderabad, Bangalore, Pune

Job Description:
We are seeking an experienced and detail-oriented ODI Developer to join our dynamic team. The ideal candidate will have a strong background in Oracle Data Integration and ETL processes, possess excellent problem-solving skills, and demonstrate the ability to work collaboratively within a team environment. As an ODI Developer at KPI Partners, you will play a crucial role in designing, implementing, and maintaining data integration solutions that support our clients' analytics and reporting needs.

Key Responsibilities:
- Design, develop, and implement data integration processes using Oracle Data Integrator (ODI) to extract, transform, and load (ETL) data from various sources.
- Collaborate with business analysts and stakeholders to understand data requirements and translate them into technical specifications.
- Optimize ODI processes and workflows for performance improvements and ensure data quality and accuracy.
- Troubleshoot and resolve technical issues related to ODI and data integration processes.
- Maintain documentation related to data integration processes, including design specifications, integration mappings, and workflows.
- Participate in code reviews and ensure adherence to best practices in ETL development.
- Stay updated with the latest developments in ODI and related technologies to continuously improve solutions.
- Support production deployments and provide maintenance and enhancements as needed.

Qualifications:
- Proven experience as an ODI Developer or in a similar ETL development role.
- Strong knowledge of Oracle Data Integrator and its components (repositories, models, mappings, etc.).
- Proficient in SQL and PL/SQL for querying and manipulating data.
- Experience with data warehousing concepts and best practices.
- Familiarity with other ETL tools is a plus.
- Excellent analytical and troubleshooting skills.
- Strong communication skills, both verbal and written.
- Ability to work independently and in a team-oriented environment.
Posted 1 week ago
4.0 - 6.0 years
10 - 14 Lacs
Navi Mumbai
Work from Office
We're hiring a Data Engineer to design and maintain scalable data pipelines, optimize ETL processes, and support cloud-based data solutions (GCP/Azure). Requires Python/Java/SQL skills, cloud experience, and strong collaboration abilities.
Posted 1 week ago
5.0 - 10.0 years
8 - 12 Lacs
Bareilly
Work from Office
Job Overview
We are seeking an experienced and strategic Data and Reporting Lead to shape our data strategy and drive data-driven decision-making across the organization. This role will focus on developing a comprehensive data infrastructure, ensuring data accuracy, and providing critical insights to support our business goals.

Responsibilities
- Data Strategy & Governance: Develop and implement a data strategy that aligns with organizational goals. Establish governance policies to maintain data quality, consistency, and security.
- Team Leadership: Provide training and development to enhance the team's skills in data management and reporting.
- Reporting & Analytics: Oversee the creation of dashboards and reports, delivering key insights to stakeholders. Ensure reports are accessible, reliable, and relevant, with a focus on performance metrics, customer insights, and operational efficiencies.
- Cross-functional Collaboration: Work closely with cross-functional teams (Tech, Finance, Operations, Marketing, Credit and Analytics) to identify data requirements, integrate data across systems, and support data-driven initiatives.
- Data Infrastructure & Tools: Work with Data Engineering to assess, select, and implement data tools and platforms to optimize data storage, processing, and reporting capabilities. Maintain and improve our data infrastructure to support scalability and data accessibility.
- Data Compliance: Ensure adherence to data privacy laws and compliance standards, implementing best practices in data security and privacy.

Qualifications
- Experience: 5-10 years of experience in data management and reporting, with at least some in a leadership role.
- Education: Bachelor's or Master's degree in Data Science, Business Analytics, Statistics, Computer Science, or a related field (STEM).
- Technical Skills: Proficiency in data visualization tools (Metabase, Sisense, Tableau, Power BI), SQL, and data warehousing solutions. Knowledge of ETL processes and familiarity with cloud data platforms is a plus.
- Analytical Skills: Strong analytical abilities and a strategic mindset, with proven experience in translating data into actionable business insights.
- Leadership & Communication: Excellent leadership, communication, and presentation skills. Ability to communicate complex information clearly to both technical and non-technical stakeholders.
- Strong strategic thinking and problem-solving skills.
- Enthusiasm for working across cultures, functions, and time zones.
Posted 1 week ago
2.0 - 4.0 years
3 - 7 Lacs
Bharatpur
Work from Office
Who are you?
- 2+ years of professional data engineering experience
- Someone who spends time thinking about business insights as much as they do on engineering
- A self-starter who drives initiatives
- Excited to pick up AI and integrate it at various touch points
- Strong experience in data analysis, growth marketing, or audience development (media or newsletters? Even better)
- Awareness of Athena, Glue, and Jupyter, or the intent to pick them up
- Comfortable working with tools like Google Analytics, SQL, email marketing platforms (Beehiiv is a plus), and data visualization tools
Posted 1 week ago
5.0 - 10.0 years
8 - 12 Lacs
Madurai
Work from Office
Job Overview
We are seeking an experienced and strategic Data and Reporting Lead to shape our data strategy and drive data-driven decision-making across the organization. This role will focus on developing a comprehensive data infrastructure, ensuring data accuracy, and providing critical insights to support our business goals.

Responsibilities
- Data Strategy & Governance: Develop and implement a data strategy that aligns with organizational goals. Establish governance policies to maintain data quality, consistency, and security.
- Team Leadership: Provide training and development to enhance the team's skills in data management and reporting.
- Reporting & Analytics: Oversee the creation of dashboards and reports, delivering key insights to stakeholders. Ensure reports are accessible, reliable, and relevant, with a focus on performance metrics, customer insights, and operational efficiencies.
- Cross-functional Collaboration: Work closely with cross-functional teams (Tech, Finance, Operations, Marketing, Credit and Analytics) to identify data requirements, integrate data across systems, and support data-driven initiatives.
- Data Infrastructure & Tools: Work with Data Engineering to assess, select, and implement data tools and platforms to optimize data storage, processing, and reporting capabilities. Maintain and improve our data infrastructure to support scalability and data accessibility.
- Data Compliance: Ensure adherence to data privacy laws and compliance standards, implementing best practices in data security and privacy.

Qualifications
- Experience: 5-10 years of experience in data management and reporting, with at least some in a leadership role.
- Education: Bachelor's or Master's degree in Data Science, Business Analytics, Statistics, Computer Science, or a related field (STEM).
- Technical Skills: Proficiency in data visualization tools (Metabase, Sisense, Tableau, Power BI), SQL, and data warehousing solutions. Knowledge of ETL processes and familiarity with cloud data platforms is a plus.
- Analytical Skills: Strong analytical abilities and a strategic mindset, with proven experience in translating data into actionable business insights.
- Leadership & Communication: Excellent leadership, communication, and presentation skills. Ability to communicate complex information clearly to both technical and non-technical stakeholders.
- Strong strategic thinking and problem-solving skills.
- Enthusiasm for working across cultures, functions, and time zones.
Posted 1 week ago
3.0 - 8.0 years
5 - 7 Lacs
Hyderabad
Work from Office
What you'll do on a typical day:
- Data Visualization: Design, develop, and maintain interactive data visualizations and reports using Looker.
- Data Modeling: Create and optimize data models to support business requirements.
- Data Integration: Integrate Looker reports into other applications for enhanced business capabilities.
- Performance Optimization: Monitor and optimize the performance of Looker reports and dashboards.
- Collaboration: Work with business stakeholders to understand their data visualization and business intelligence needs. Continuously improve technical design patterns, workflows, and tools, defining and enforcing standards when necessary to sustain the platform's effectiveness and sustainability, and experimenting with and promoting adoption of new tools and approaches when appropriate.
- Security: Implement security measures on data and ensure compliance with data governance policies.
- Documentation: Document processes and methodologies used in developing reporting solutions.

What you need to succeed at XPO:
At a minimum, you'll need:
- Qualification: Bachelor's / Master's degree in Computer Science, Information Technology, or a related field.
- Experience: 3+ years of experience in data analysis, data visualization, and business intelligence using BI tools (Looker, Power BI, Tableau, etc.).
- Technical Skills: Proficiency in writing SQL queries and a solid understanding of data warehouse and data modeling concepts.
- Analytical Skills: Strong analytical and problem-solving skills.
- Communication: Excellent communication and teamwork skills.
- Experience with cloud platforms such as Google Cloud Platform and Google BigQuery.
- Experience with programming languages like Python, R, etc.
- Understanding of version control tools: GitHub, SVN, TFS, etc.
- Google Cloud Platform or Looker certification is a big plus.
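As a rough sketch of the SQL-on-BigQuery work that typically sits behind Looker reporting like this, the snippet below runs an aggregate query with the google-cloud-bigquery client. The project, dataset, table, and columns are hypothetical, and it assumes default Google Cloud credentials are configured.

```python
# Illustrative sketch only: query an assumed BigQuery table and print results.
# Assumes google-cloud-bigquery is installed and default credentials are set up.
from google.cloud import bigquery

client = bigquery.Client()

sql = """
    SELECT region, COUNT(*) AS shipments
    FROM `example_project.logistics.shipments`   -- hypothetical table
    GROUP BY region
    ORDER BY shipments DESC
"""

for row in client.query(sql).result():
    print(row.region, row.shipments)
```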
Posted 1 week ago
6.0 - 11.0 years
8 - 14 Lacs
Hyderabad
Work from Office
Looking to onboard a skilled SAP IS Retail professional with 6-12 years of experience. The ideal candidate will have a strong background in SAP IS Retail and excellent problem-solving skills.

Roles and Responsibility:
- Collaborate with cross-functional teams to design and implement SAP IS Retail solutions.
- Analyze business requirements and develop effective solutions using SAP IS Retail.
- Provide technical support and training to end users on SAP IS Retail applications.
- Develop and maintain documentation for SAP IS Retail projects.
- Troubleshoot and resolve issues related to SAP IS Retail.
- Participate in project planning and execution to ensure timely delivery.

Job Requirements:
- Strong knowledge of SAP IS Retail modules and functionalities.
- Experience in designing and implementing SAP IS Retail solutions.
- Excellent problem-solving skills and attention to detail.
- Ability to work collaboratively with cross-functional teams.
- Strong communication and interpersonal skills.
- Familiarity with industry-specific regulations and standards.
Posted 1 week ago
6.0 - 10.0 years
25 - 30 Lacs
Pune
Hybrid
Design, implement, and optimize ETL/ELT pipelines to ingest, transform, and load data into AWS Redshift from various sources. Strong background in Python scripting, AWS services (Lambda, S3, Redshift), and data integration and pipeline development.

Required Candidate Profile:
- 6+ years of experience in BI development and data engineering
- Python/R scripting for data processing and automation
- AWS services: Lambda, S3, and Redshift
- Data warehousing
- Proficiency in SQL
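To illustrate the Lambda/S3/Redshift ingestion pattern this role describes, here is a hedged Python sketch that issues a Redshift COPY for an object delivered to S3. The cluster endpoint, credentials, IAM role ARN, and target table are placeholders; in practice the credentials would come from something like Secrets Manager rather than being hard-coded.

```python
# A hedged sketch of an S3-triggered Lambda loading a file into Redshift via COPY.
# Connection details, bucket, table, and IAM role ARN are placeholders.
import psycopg2


def lambda_handler(event, context):
    # S3 object that triggered the load (shape assumes an S3 event notification).
    record = event["Records"][0]["s3"]
    s3_path = f"s3://{record['bucket']['name']}/{record['object']['key']}"

    conn = psycopg2.connect(
        host="example-cluster.redshift.amazonaws.com",  # placeholder endpoint
        port=5439, dbname="analytics", user="loader", password="***",
    )
    try:
        with conn, conn.cursor() as cur:
            cur.execute(
                f"""
                COPY staging.events
                FROM '{s3_path}'
                IAM_ROLE 'arn:aws:iam::123456789012:role/redshift-copy'  -- placeholder
                FORMAT AS CSV IGNOREHEADER 1;
                """
            )
    finally:
        conn.close()
    return {"loaded": s3_path}
```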
Posted 1 week ago
2.0 - 4.0 years
3 - 7 Lacs
Nellore
Work from Office
Full Time Role at EssentiallySports for Data Growth Engineer

EssentiallySports is the home for the underserved fan, delivering storytelling that goes beyond the headlines. As a media platform, we combine deep audience insights with cultural trends to meet fandom where it lives and where it goes next.

Values
- Focus on the user and all else will follow
- Hire for intent and not for experience
- Bootstrapping gives you the freedom to serve the customer and the team instead of investors
- Internet and technology untap the niches
- Action oriented, integrity, freedom, strong communicators, and responsibility
- All things equal, one with high agency wins

EssentiallySports is a top 10 sports media platform in the U.S., generating over a billion pageviews a year and 30m+ monthly active users per month. This massive traffic fuels our data-driven culture, allowing us to build owned audiences at scale through organic growth, a model we take pride in, with zero CAC.

The next phase of ES growth is around the newsletter initiative. In less than 9 months, we've built a robust newsletter brand with 700,000+ highly engaged readers and impressive performance metrics:
- 5 newsletter brands
- 700k+ subscribers
- Open rates of 40%-46%

The role is for a data engineer with growth and business acumen, in the "permissionless growth" team. Someone who can connect the pipelines of millions of users, but at the same time knit a story of the how and why.

Responsibilities
- Owning the data pipeline from web to Athena to email, end-to-end
- You'll make the key decisions and see them through to successful user sign-up
- Use data science to find real insights, which translate to user engagement
- Pushing changes every weekday
- Personalization at scale: leverage fan behavior data to tailor content and improve lifetime value

Who are you?
- 2+ years of professional data engineering experience
- Someone who spends time thinking about business insights as much as they do on engineering
- A self-starter who drives initiatives
- Excited to pick up AI and integrate it at various touch points
- Strong experience in data analysis, growth marketing, or audience development (media or newsletters? Even better)
- Awareness of Athena, Glue, and Jupyter, or the intent to pick them up
- Comfortable working with tools like Google Analytics, SQL, email marketing platforms (Beehiiv is a plus), and data visualization tools
- Collaborative and want to see the team succeed in its goals
- Problem-solving, proactive, and solution-oriented mindset, to spot opportunities and translate them into real growth
- Ability to thrive in startups with a fast-paced environment and take ownership for working through ambiguity
- Excited to join a lean team in a big company that moves quickly
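As a loose sketch of the "web to Athena to email" pipeline work mentioned above, the snippet below runs an Athena query with boto3 and prints the results. The database, table, and S3 output location are invented for the example.

```python
# Illustrative sketch: run an Athena query over assumed subscriber-event data.
# Database, table, region, and the S3 results bucket are hypothetical.
import time

import boto3

athena = boto3.client("athena", region_name="us-east-1")  # assumed region

SQL = """
    SELECT newsletter, AVG(open_rate) AS avg_open_rate
    FROM subscriber_events
    GROUP BY newsletter
"""

qid = athena.start_query_execution(
    QueryString=SQL,
    QueryExecutionContext={"Database": "newsletter_analytics"},              # assumed
    ResultConfiguration={"OutputLocation": "s3://example-athena-results/"},  # assumed
)["QueryExecutionId"]

# Poll until the query leaves the queued/running states (no error handling here).
while athena.get_query_execution(QueryExecutionId=qid)["QueryExecution"]["Status"]["State"] in ("QUEUED", "RUNNING"):
    time.sleep(1)

# Skip the header row returned by Athena and print the data rows.
for row in athena.get_query_results(QueryExecutionId=qid)["ResultSet"]["Rows"][1:]:
    print([col.get("VarCharValue") for col in row["Data"]])
```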
Posted 1 week ago
10.0 - 15.0 years
20 Lacs
Chennai
Work from Office
Candidate Specification: Any graduate, minimum 10+ years of relevant experience.

Job Description:
- Strong hands-on experience with the following: Snowflake, Redshift, BigQuery.
- Proficiency in Data Build Tool (DBT) and SQL-based data modeling and transformation.
- Solid understanding of data warehousing concepts, star/snowflake schemas, and performance optimization.
- Experience with modern ETL/ELT tools and cloud-based data pipeline frameworks.
- Familiarity with version control systems (e.g., Git) and CI/CD practices for data workflows.
- Strong problem-solving skills and attention to detail.
- Should have excellent interpersonal skills.

Contact Person: Deepikad
Email ID: deepikad@gojobs.biz
Posted 1 week ago
5.0 - 10.0 years
16 - 20 Lacs
Bengaluru
Work from Office
Project description
We are seeking a highly skilled Data Modelling Expert with deep experience in the Avaloq Core Banking platform to join our technology team. The ideal candidate will be responsible for designing, maintaining, and optimizing complex data models that support our banking products, client data, and regulatory reporting needs. This role requires a blend of domain expertise in banking and wealth management, strong data architecture capabilities, and hands-on experience working with the Avaloq platform.

Responsibilities
- Design, implement, and maintain conceptual, logical, and physical data models within the Avaloq Core Banking system.
- Collaborate with business analysts, product owners, and Avaloq parameterisation teams to translate business requirements into robust data models.
- Ensure alignment of data models with Avaloq's object model and industry best practices.
- Perform data profiling, quality checks, and lineage tracing to support regulatory and compliance requirements (e.g., Basel III, MiFID II, ESG).
- Support integration of Avaloq data with downstream systems (e.g., CRM, data warehouses, reporting platforms).
- Provide expert input on data governance, metadata management, and model documentation.
- Contribute to change requests, upgrades, and data migration projects involving Avaloq.
- Collaborate with cross-functional teams to drive data consistency, reusability, and scalability.
- Review and validate existing data models, identify gaps or optimisation opportunities.
- Ensure data models meet performance, security, and privacy requirements.

Skills

Must have
- Proven experience (5+ years) in data modelling or data architecture, preferably within financial services.
- 3+ years of hands-on experience with the Avaloq Core Banking Platform, especially its data structures and object model.
- Strong understanding of relational databases and data modelling tools (e.g., ER/Studio, ERwin, or similar).
- Proficient in SQL and data manipulation in Avaloq environments.
- Knowledge of banking products, client lifecycle data, and regulatory data requirements.
- Familiarity with data governance, data quality, and master data management concepts.
- Experience working in Agile or hybrid project delivery environments.

Nice to have
- Exposure to Avaloq Scripting or parameterisation is a strong plus.
- Experience integrating Avaloq with data lakes, BI/reporting tools, or regulatory platforms.
- Understanding of data privacy regulations (GDPR, FINMA, etc.).
- Certification in Avaloq or relevant financial data management domains is advantageous.
Posted 1 week ago
8.0 - 13.0 years
18 - 22 Lacs
Mumbai, Chennai, Bengaluru
Work from Office
At Capgemini Invent, we believe difference drives change. As inventive transformation consultants, we blend our strategic, creative, and scientific capabilities, collaborating closely with clients to deliver cutting-edge solutions. Join us to drive transformation tailored to our clients' challenges of today and tomorrow. Informed and validated by science and data. Superpowered by creativity and design. All underpinned by technology created with purpose.

Your role
In this role you will play a key part in Data Strategy. We are looking for professionals with 8+ years of experience in Data Strategy (Tech Architects, Senior BAs) who will support our product, sales, and leadership teams by creating data-strategy roadmaps. The ideal candidate is adept at understanding as-is enterprise data models to help Data Scientists and Data Analysts provide actionable insights to the leadership. They must have strong experience in understanding data using a variety of data tools, a proven ability to understand the current data pipeline and ensure a minimal-cost solution architecture is created, and must be comfortable working with a wide range of stakeholders and functional teams. The right candidate will have a passion for discovering solutions hidden in large data sets and working with stakeholders to improve business outcomes. You will identify, design, and recommend internal process improvements (automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.) and identify data tools for analytics and data scientist team members that assist them in building and optimizing our product into an innovative industry leader. You will work with data and analytics experts to create frameworks for digital twins/digital threads, bring relevant experience in data exploration and profiling, be involved in data literacy activities for all stakeholders, and coordinate with cross-functional teams as the SPOC for global master data.

Your Profile
- 8+ years of experience in a Data Strategy role, with a graduate degree in Computer Science, Informatics, Information Systems, or another quantitative field, and experience with the following software/tools:
- Experience with understanding big data tools: Hadoop, Spark, Kafka, etc.
- Experience with understanding relational SQL and NoSQL databases, including Postgres and Cassandra/MongoDB
- Experience with understanding data pipeline and workflow management tools: Luigi, Airflow, etc.
- 5+ years of advanced working SQL knowledge and experience working with relational databases and query authoring (SQL), as well as working familiarity with a variety of databases: Postgres/SQL/Mongo
- 2+ years of working knowledge in Data Strategy: Data Governance/MDM, etc.
- 5+ years of experience in creating data strategy frameworks/roadmaps, in analytics and data maturity evaluation based on current as-is vs to-be frameworks, and in creating functional requirements documents and enterprise to-be data architecture
- Relevant experience in identifying and prioritizing use cases for the business and important KPI identification, opex/capex for CXOs, with 2+ years of working knowledge in Data Strategy (Data Governance/MDM etc.) and 4+ years of experience in a Data Analytics operating model with vision on prescriptive, descriptive, predictive, and cognitive analytics

What you will love about working here
We recognize the significance of flexible work arrangements to provide support. Be it remote work or flexible work hours, you will get an environment to maintain a healthy work-life balance. At the heart of our mission is your career growth. Our array of career growth programs and diverse professions are crafted to support you in exploring a world of opportunities. Equip yourself with valuable certifications in the latest technologies such as Generative AI.

Location: Bengaluru, Mumbai, Chennai, Pune, Hyderabad, Noida
Posted 1 week ago
1.0 - 5.0 years
3 - 7 Lacs
Pune
Work from Office
The Apex Group was established in Bermuda in 2003 and is now one of the world's largest fund administration and middle office solutions providers. Our business is unique in its ability to reach globally, service locally, and provide cross-jurisdictional services. With our clients at the heart of everything we do, our hard-working team has successfully delivered on an unprecedented growth and transformation journey, and we are now represented by circa 13,000 employees across 112 offices worldwide.

Your career with us should reflect your energy and passion. That's why, at Apex Group, we will do more than simply empower you. We will work to supercharge your unique skills and experience. Take the lead and we'll give you the support you need to be at the top of your game. And we offer you the freedom to be a positive disrupter and turn big ideas into bold, industry-changing realities. For our business, for clients, and for you.

JD

Disclaimer: Unsolicited CVs sent to Apex (Talent Acquisition Team or Hiring Managers) by recruitment agencies will not be accepted for this position. Apex operates a direct sourcing model and where agency assistance is required, the Talent Acquisition team will engage directly with our exclusive recruitment partners.
Posted 1 week ago
3.0 - 6.0 years
4 - 8 Lacs
Mumbai
Work from Office
The Apex Group was established in Bermuda in 2003 and is now one of the world's largest fund administration and middle office solutions providers. Our business is unique in its ability to reach globally, service locally, and provide cross-jurisdictional services. With our clients at the heart of everything we do, our hard-working team has successfully delivered on an unprecedented growth and transformation journey, and we are now represented by circa 13,000 employees across 112 offices worldwide.

Your career with us should reflect your energy and passion. That's why, at Apex Group, we will do more than simply empower you. We will work to supercharge your unique skills and experience. Take the lead and we'll give you the support you need to be at the top of your game. And we offer you the freedom to be a positive disrupter and turn big ideas into bold, industry-changing realities. For our business, for clients, and for you.

Disclaimer: Unsolicited CVs sent to Apex (Talent Acquisition Team or Hiring Managers) by recruitment agencies will not be accepted for this position. Apex operates a direct sourcing model and where agency assistance is required, the Talent Acquisition team will engage directly with our exclusive recruitment partners.
Posted 1 week ago