
8586 Data Modeling Jobs - Page 46

JobPe aggregates results for easy application access, but you apply directly on the original job portal.

4.0 - 9.0 years

6 - 11 Lacs

Chennai

Work from Office

Summary: We are seeking an experienced Data Scientist to join our team at Test Demo in Chennai. As a Data Scientist with 4 years of experience, you will play a key role in driving business growth through data-driven insights and recommendations. This is an in-office role, and we are looking for someone who is passionate about working with data and has a strong analytical mindset. As a Data Scientist at Test Demo, you will be responsible for collecting, analyzing, and interpreting large data sets to identify trends and patterns. You will work closely with cross-functional teams to develop and implement data-driven solutions that drive business outcomes. Your expertise in machine learning, statistical modeling, and data visualization will be essential in helping us make informed business decisions. The ideal candidate will have a strong foundation in computer science, statistics, and mathematics, with experience working with programming languages such as Python, R, or SQL. You will be expected to have a strong understanding of data structures, algorithms, and data modeling techniques. At Test Demo, we offer a collaborative and supportive work environment that fosters growth and learning. Join our team in Chennai and be a part of our mission to drive innovation through data science.

Responsibilities:
- Collecting, analyzing, and interpreting large data sets to identify trends and patterns
- Working closely with cross-functional teams to develop and implement data-driven solutions that drive business outcomes
- Applying expertise in machine learning, statistical modeling, and data visualization to help make informed business decisions
- Designing and implementing data models, algorithms, and data structures to solve complex business problems
- Communicating insights and recommendations to stakeholders through clear and actionable reports and presentations

Requirements:
- 4 years of experience in Data Science
- Strong foundation in computer science, statistics, and mathematics
- Experience working with programming languages such as Python, R, or SQL
- Strong understanding of data structures, algorithms, and data modeling techniques

Posted 1 week ago

Apply

2.0 - 5.0 years

4 - 7 Lacs

Hyderabad

Work from Office

It's fun to work in a company where people truly BELIEVE in what they're doing! We're committed to bringing passion and customer focus to the business.

Job Description: This role requires working from our local Hyderabad office 2-3x a week.

INTRODUCTION: We are seeking a Data Engineer to join our Data team to deliver, maintain, and evolve our data platform, fulfilling our mission to empower our customers by giving them access to their data through reports powered by ABC Fitness. In this role, you'll develop and maintain our data infrastructure and work closely with cross-functional teams to translate business requirements into technical solutions, ensuring data integrity, scalability, and efficiency. We are known for being an energized, fun, friendly and customer-focused cross-functional team.

WHAT YOU'LL DO:
- Design, develop, and maintain efficient data pipelines to serve data to reports, using various cloud ETL tools.
- Design, implement and manage data workflows, ensuring seamless data orchestration and integration.
- Create and optimize SQL objects, including stored procedures, tables and views, for great performance.
- Implement data quality checks and validation processes to ensure accuracy, completeness, and consistency of data across different stages of the pipeline (a minimal sketch of this kind of check follows this listing).
- Monitor data pipelines and processes, troubleshoot issues, and implement solutions to prevent recurrence.
- Work on your own initiative and take responsibility for delivering high-quality solutions.
- Collaborate with stakeholders to understand reporting requirements and provide support in developing interactive dashboards using Power BI for data visualization.
- Maintain comprehensive documentation of data pipelines, workflows, and data models.
- Adhere to best practices in data engineering and ensure compliance with organizational standards.

WHAT YOU'LL NEED:
- Minimum 2-5 years of experience in a data engineering role
- Bachelor's degree in Computer Science, Information Technology, or a related field
- Experience in data management best practices, including demonstrated experience with data profiling, sourcing, and cleansing routines utilizing typical data quality functions involving standardization, transformation, rationalization, linking and matching
- Proficient in SQL and Python, with the ability to translate complexity into efficient code
- Experience working with different types of databases: Azure SQL DB, Azure Synapse SQL Pool, AWS Redshift, MySQL, etc.
- Experience with Azure DevOps and/or GitHub
- Experience with Azure Data Factory and/or Apache Airflow
- Effective communication skills (verbal and written) in English
- Genuine passion for technology and solving data problems
- Structured thinking with the ability to break down ambiguous problems and propose impactful data modeling designs
- Ability to use data to inform decision-making and drive outcomes
- Ability to understand, document and convert business requirements into data models
- Ability to work effectively with a remote team across multiple time zones
- Driven and self-motivated with excellent organizational skills
- Comfortable learning innovative technologies and systems
- All applicants must be able to work from our Hyderabad office 2-3x a week

AND IT'S GREAT TO HAVE:
- Experience building data models for Power BI
- Working knowledge of Gen 2 Azure Data Lake, Storage Account, Blobs, Azure Function, Logic App
- Working knowledge of AWS S3, EMR, EKR

WHAT'S IN IT FOR YOU:
- Purpose-led company with a Values-focused culture: Best Life, One Team, Growth Mindset
- Time Off: competitive PTO plans with 15 days Earned accrued leave, 12 days Sick leave, and 12 days Casual leave per year
- 11 Holidays plus 4 Days of Disconnect: once a quarter, we take a collective breather and enjoy a day off together around the globe. #oneteam
- Group Mediclaim insurance coverage of INR 500,000 for employee + spouse, 2 kids, and parents or parents-in-law, including EAP counseling
- Life Insurance and Personal Accident Insurance
- Best Life Perk: we are committed to meeting you wherever you are in your fitness journey with a quarterly reimbursement
- Premium Calm App: enjoy tranquility with a Calm App subscription for you and up to 4 dependents over the age of 16
- Support for working women with financial aid towards a crèche facility, ensuring a safe and nurturing environment for their little ones while they focus on their careers

We're committed to diversity and passion, and encourage you to apply, even if you don't demonstrate all the listed skillsets!

ABC'S COMMITMENT TO DIVERSITY, EQUALITY, BELONGING AND INCLUSION: ABC is an equal opportunity employer. We celebrate diversity and are committed to creating an inclusive environment for all employees. We are intentional about creating an environment where employees, our clients and other stakeholders feel valued and inspired to reach their full potential and make authentic connections. We foster a workplace culture that embraces each person's diversity, including the extent to which they are similar or different. ABC leaders believe that an equitable and inclusive culture is not only the right thing to do, it is a business imperative. Read more about our commitment to diversity, equality, belonging and inclusion at abcfitness.com

ABOUT ABC: ABC Fitness (abcfitness.com) is the premier provider of software and related services for the fitness industry and has built a reputation for excellence in support for clubs and their members. ABC is the trusted provider to boost performance and create a total fitness experience for over 41 million members of clubs of all sizes, whether a multi-location chain, franchise or an independent gym. Founded in 1981, ABC helps over 31,000 gyms and health clubs globally perform better and more profitably, offering a comprehensive SaaS club management solution that enables club operators to achieve optimal performance. ABC Fitness is a Thoma Bravo portfolio company, a private equity firm focused on investing in software and technology companies (thomabravo.com). #LI-HYBRID

If you like wild growth and working with happy, enthusiastic over-achievers, you'll enjoy your career with us!
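As referenced above, a minimal sketch of a pipeline data-quality check, assuming pandas; the table and column names are entirely hypothetical, illustrating the general technique rather than ABC's actual implementation:

```python
import pandas as pd

# Hypothetical member records arriving from an upstream source.
members = pd.DataFrame({
    "member_id": [1, 2, 2, 4],
    "email": ["a@x.com", None, "b@x.com", "c@x.com"],
    "joined": ["2024-01-05", "2024-02-01", "2024-02-01", "2030-01-01"],
})
members["joined"] = pd.to_datetime(members["joined"])

# Accuracy, completeness, and consistency checks of the kind the posting describes.
checks = {
    "no_duplicate_ids": members["member_id"].is_unique,
    "emails_complete": members["email"].notna().all(),
    "join_dates_not_in_future": (members["joined"] <= pd.Timestamp.today()).all(),
}

failed = [name for name, ok in checks.items() if not ok]
if failed:
    raise ValueError(f"Data quality checks failed: {failed}")
```

A pipeline stage would run a gate like this before publishing data downstream, so bad loads fail loudly instead of silently corrupting reports.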

Posted 1 week ago

Apply

5.0 - 10.0 years

12 - 17 Lacs

Bengaluru

Work from Office

BI Architect

Experience: Minimum 5 Years
Location: Bengaluru
Employment Type: Full-time

About Cloudside: Cloudside is a Cloud and Data consulting company with deep expertise in Google Cloud Platform, Amazon Web Services, and Microsoft Azure. We help our customers solve complex problems in Infrastructure, DevOps, Application Modernization, and Data Management and Analytics. Our mission is to be the trusted team of Cloud and Data problem solvers, helping our customers achieve the best business outcomes using Cloud as a catalyst. We bring decades of experience and skill in Cloud, Data, and Application Engineering, acquired helping customers of all shapes and sizes, and make it available to every customer that we work with.

Job Summary: We are seeking a skilled and experienced BI Architect to design, develop, and maintain business intelligence solutions. The ideal candidate should have strong expertise in BI tools, data modeling, and database technologies, with a strategic mindset to support data-driven decision-making across the organization.

Key Responsibilities:
- Design and implement scalable BI solutions using Power BI, QlikView, and Tableau.
- Create and optimize data models to support reporting and analytics.
- Work with stakeholders to gather requirements and deliver actionable insights.
- Develop, test, and maintain SQL queries and stored procedures across SQL Server, PostgreSQL, and PL/SQL environments.
- Ensure data integrity, accuracy, and security across BI platforms.
- Provide architectural guidance and best practices for BI and data visualization strategies.

Required Skills:
- Proficiency in Power BI, QlikView, and Tableau.
- Strong command of SQL, SQL Server, PostgreSQL, and PL/SQL.
- Expertise in Data Modeling and BI architecture.
- Excellent analytical and problem-solving skills.
- Ability to work independently and collaborate with cross-functional teams.

Qualifications:
- Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.
- Minimum 5 years of relevant experience in BI and data architecture roles.

Posted 1 week ago

Apply

5.0 - 10.0 years

8 - 12 Lacs

Pune

Work from Office

Bito believes software development will change dramatically over the next 5-10 years, driven by the incredible capabilities of Generative AI. Software developers will become 10-20 times more productive, and the amount of new software built in the world will increase dramatically. But this puts a huge burden on processes such as code review, testing, and build management, and demands totally new approaches. Bito aims to be a leader in the GenAI tools that will enable this transformation.

Bito is building accessible, accurate AI agents trusted by developers across the world. Designed to help software engineers ship faster, better code, Bito offers an AI Code Review Agent with an engine that deeply understands your code in Git or your IDE. Over 100,000 developers already use Bito's tools every month. Bito's AI Code Review Agent enables high-quality AI code reviews that cut down human engineering time in pull requests by 50%.

Our founders have previously started, built, and taken a company public (NASDAQ: PUBM), worth well over US$1B. We are looking to take our learnings, learn a lot along with you, and do something more exciting this time. This journey will be incredibly rewarding and incredibly difficult! We are building this company with a hybrid working approach (3 days a week in the office), with our main teams in the San Francisco Bay Area (United States) and in Pune (India) for time zone coverage. The founders are based in the Bay Area and in Bangalore.

Requirements:
- Degree in Computer Science or equivalent and 3+ years of software development experience developing critical applications in a production Linux environment
- Strong technical skills in design patterns, build tools, Git, SQL and Linux
- Strong coding skills in 2 or more programming languages; Java/Go is a must
- Good understanding of data structures and algorithms
- Knowledge of Python or Machine Learning is good to have
- Strong debugging/diagnostic, problem-solving and analytical skills
- Experience with the entire software development lifecycle, including requirements, specification, documentation, testing, release, and maintenance
- Knowledge of and prior work experience in data modeling, security, performance, and scalability
- Ability to work independently and with a team
- Can learn and implement new technologies quickly and effectively
- Excellent communication and interpersonal skills

Work Location: Pune

What we offer: At Bito, we strive to create a supportive and rewarding work environment that enables our employees to thrive. Join a dynamic team at the forefront of generative AI technology.
- Hybrid work environment
- Competitive compensation, including stock options
- A chance to work in the exciting generative AI space
- Yearly team offsite event

Posted 1 week ago

Apply

3.0 - 8.0 years

5 - 10 Lacs

Hyderabad

Work from Office

We are seeking a skilled Data Engineer with 3+ years of experience on cloud platforms like GCP or AWS. The ideal candidate will have strong proficiency in Python programming, experience with Kafka for real-time data streaming, and expertise in DBT for data transformation. This role involves working on complex data projects. (A minimal Kafka streaming sketch follows this listing.)

Key Responsibilities:
- Design, develop, and deploy scalable data pipelines and ETL processes on cloud platforms (GCP/AWS).
- Implement and maintain real-time data streaming solutions using Kafka.
- Collaborate with cross-functional teams to understand data requirements and deliver robust data solutions.
- Optimize and tune data pipelines for performance and reliability.
- Ensure data quality and integrity throughout the data lifecycle.
- Work on data modeling and schema design using DBT.
- Provide technical guidance and support to junior team members as needed.

Required Skills and Qualifications:
- Bachelor's degree in Computer Science, Engineering, or a related field.
- 3+ years of professional experience as a Data Engineer or in a similar role.
- Proficiency in Python programming and experience with its data libraries.
- Hands-on experience with cloud platforms such as Google Cloud Platform (GCP) or Amazon Web Services (AWS).
- Strong knowledge of Kafka for real-time data streaming.
- Experience with DBT (Data Build Tool) for data modeling and transformation.
- Ability to work on complex data projects independently and as part of a team.
- Excellent problem-solving and analytical skills.
- Effective communication skills with the ability to collaborate across teams.

Preferred Qualifications:
- Master's degree in Computer Science, Engineering, or a related field.
- Certifications in relevant cloud technologies (GCP/AWS).
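As a hedged illustration of the real-time streaming requirement above: a minimal producer/consumer sketch assuming the kafka-python client and a broker on localhost:9092. The topic and event fields are hypothetical, not from this posting:

```python
import json
from kafka import KafkaProducer, KafkaConsumer

# Produce a JSON event to a hypothetical topic.
producer = KafkaProducer(
    bootstrap_servers="localhost:9092",
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)
producer.send("order_events", {"order_id": 42, "status": "shipped"})
producer.flush()

# Consume and print events from the same topic, starting from the beginning.
consumer = KafkaConsumer(
    "order_events",
    bootstrap_servers="localhost:9092",
    auto_offset_reset="earliest",
    value_deserializer=lambda b: json.loads(b.decode("utf-8")),
)
for message in consumer:
    print(message.value)  # e.g. {'order_id': 42, 'status': 'shipped'}
```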

Posted 1 week ago

Apply

8.0 - 13.0 years

50 - 60 Lacs

Hyderabad

Work from Office

As a Sr. Data Engineer on the Sales Automation Engineering team, you should be able to work across the different areas of Data Engineering & Data Architecture, including the following:
- Data Migration: from Hive/other DBs to Salesforce/other DBs and vice versa
- Data Modeling: understand existing sources and data models, identify the gaps, and build the future-state architecture
- Data Pipelines: build data pipelines for several Data Mart/Data Warehouse and reporting requirements
- Data Governance: build the framework for Data Governance and Data Quality Profiling & Reporting

What the Candidate Will Do:
- Demonstrate strong knowledge of, and the ability to operationalize, leading data technologies and best practices.
- Collaborate with internal business units and data teams on business requirements, data access, processing/transformation and reporting needs, and leverage existing and new tools to provide solutions.
- Build dimensional data models to support business requirements and reporting needs.
- Design, build and automate the deployment of data pipelines and applications to support reporting and data requirements.
- Research and recommend technologies and processes to support rapid scale and future-state growth initiatives on the data front.
- Prioritize business needs, leadership questions, and ad-hoc requests for on-time delivery.
- Collaborate on architecture and technical design discussions to identify and evaluate high-impact process initiatives.
- Work with the team to implement data governance and access control, and identify and reduce security risks.
- Perform and participate in code reviews, peer inspections and technical design/specifications.
- Develop performance metrics to establish process success, and work cross-functionally to consistently and accurately measure success over time.
- Deliver measurable business process improvements while re-engineering key processes and capabilities, and map to the future-state vision.
- Prepare documentation and specifications on detailed design.
- Be able to work in a globally distributed team in an Agile/Scrum approach.

Basic Qualifications:
- Bachelor's degree in computer science or a similar technical field of study, or equivalent practical experience.
- 8+ years of professional software development experience, including experience in the Data Engineering & Architecture space.
- Ability to interact with product managers and business stakeholders to understand data needs, and to help build data infrastructure that scales across the company.
- Very strong SQL skills: advanced-level SQL coding (window functions, CTEs, dynamic variables, hierarchical queries, materialized views, etc.); a brief sketch follows this listing.
- Experience with data-driven architecture and systems design; knowledge of Hadoop-related technologies such as HDFS, Apache Spark, Apache Flink, Hive, and Presto.
- Good hands-on experience with object-oriented programming languages like Python.
- Proven experience in large-scale distributed storage and database systems (SQL or NoSQL, e.g. Hive, MySQL, Cassandra) and data warehousing architecture and data modeling.
- Working experience with cloud technologies like GCP, AWS, Azure.
- Knowledge of reporting tools like Tableau and/or other BI tools.

Preferred Qualifications:
- Python libraries (Apache Spark, Scala)
- Working experience with cloud technologies like GCP, AWS, Azure

*Accommodations may be available based on religious and/or medical conditions, or as required by applicable law.
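To make the "advanced SQL" expectation above concrete: a minimal, hedged sketch of a CTE combined with a window function, run here against an in-memory SQLite database purely for illustration (the table and column names are hypothetical):

```python
import sqlite3

con = sqlite3.connect(":memory:")  # SQLite 3.25+ supports window functions
con.executescript("""
CREATE TABLE sales (region TEXT, month TEXT, amount REAL);
INSERT INTO sales VALUES
  ('north', '2024-01', 80), ('north', '2024-02', 120),
  ('south', '2024-01', 100), ('south', '2024-02', 150);
""")

query = """
WITH monthly AS (               -- CTE: one row per region/month
  SELECT region, month, SUM(amount) AS total
  FROM sales
  GROUP BY region, month
)
SELECT region, month, total,
       SUM(total) OVER (        -- window function: running total per region
         PARTITION BY region ORDER BY month
       ) AS running_total
FROM monthly
ORDER BY region, month;
"""
for row in con.execute(query):
    print(row)
```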

Posted 1 week ago

Apply

5.0 - 10.0 years

10 - 11 Lacs

Bengaluru

Work from Office

Uber sends billions of messages to our users across channels such as Email, Push, SMS, WhatsApp, and in-app surfaces, through an internally built CRM system. We're looking for a Product Manager to lead the development of marketing measurement and insight-generation tools. This role will focus on enabling clear performance tracking, consistent measurement, and data-driven decision-making, empowering teams across Uber to optimize marketing efforts with confidence and speed.

What the Candidate Will Do:
- Partner with Marketing, Data Science, Engineering, and other cross-functional teams to deeply understand business needs and define measurement strategies.
- Drive the product vision, roadmap, and execution.
- Build and refine underlying data processes and pipelines to ensure reliable, high-quality datasets that power measurement and insight generation.
- Collaborate with Engineering to design, implement, and maintain scalable data systems (e.g., data lakes, ETL frameworks) supporting marketing workflows.
- Develop intuitive dashboards and analytics tools that surface actionable insights on campaign performance, audience engagement, channel effectiveness, and overall marketing impact.
- Establish frameworks for consistent marketing measurement, including attribution, incrementality, and experimentation, ensuring alignment across diverse teams and markets.
- Collaborate with stakeholders to define KPIs, track impact, and foster continuous improvement in data-driven marketing decisions.
- Champion data governance and best practices so that marketers can trust and confidently act on insights.

Basic Qualifications:
- Bachelor's degree in Computer Science, Engineering, Data Science, or a related technical or analytical field.
- 5+ years of product management experience with a focus on data platforms, analytics, or business intelligence.
- Strong understanding of marketing measurement, data modeling, and reporting best practices.
- Experience working with large-scale data infrastructure and tools (e.g., SQL, Looker, BigQuery, Airflow).
- Ability to translate complex data requirements into simple, user-centric products.
- Strong cross-functional collaboration and communication skills.

Preferred Qualifications:
- Master's degree in a technical field.
- Experience in digital marketing, CRM, or MarTech environments.
- Familiarity with experimentation and incrementality testing.
- Interest in applying AI/ML to enhance marketing analytics and insights.

*Accommodations may be available based on religious and/or medical conditions, or as required by applicable law.

Posted 1 week ago

Apply

2.0 - 7.0 years

45 - 50 Lacs

Hyderabad

Work from Office

We are looking for a highly driven and experienced Data Scientist to join Uber's FinTech - Data Science team. In this role, you will have the amazing opportunity to shape how Uber understands and optimizes its financial performance across diverse business lines. You'll partner closely with Data Science, Product, Engineering, Finance, and other cross-functional stakeholders on fast-moving, high-stakes problems. A deep analytical and scientific passion and the ability to execute key business priorities are a must for this role. Your performance is measured by the insights you deliver, your communication effectiveness, and the initiative to drive ideas and implement them into action!

What the Candidate Will Do:
- Analyze large volumes of financial and operational data to extract actionable insights, with a focus on key financial and business metrics.
- Develop models to forecast financial metrics, detect anomalies, and support strategic decision-making across Uber's financial systems (a minimal anomaly-detection sketch follows this listing).
- Partner with stakeholders across Finance, Product, Engineering, and ML teams to design, prototype, and productionize end-to-end data science solutions.
- Build end-to-end data pipelines and self-serve dashboards. Automate whatever you can!
- Communicate your findings to cross-functional peers and management.
- Build tools and documentation that enable operational teams to independently explore financial metrics and monitor key performance indicators.
- Investigate and resolve discrepancies across multiple financial systems and datasets, ensuring consistency and trust in reported metrics.
- Build effective visualizations to communicate data to key decision-makers.
- Develop a deep understanding of the FinTech systems and data flows involved; document and train internal personnel to institutionalize the learnings of the data science practice.

Basic Qualifications:
- Bachelor's degree with 2+ years, or Master's degree with 1+ years, of relevant industry experience in Data Science or similar roles.
- Advanced SQL proficiency and a strong understanding of data modeling.
- Solid foundation in statistical methods and data exploration techniques.
- Experience building dashboards or visualizations using Tableau, Plotly, Looker, or similar platforms.
- Strong communication skills, with the ability to translate data findings into clear recommendations for cross-functional stakeholders.

Preferred Qualifications:
- Experience with Python/R for data analysis, modeling, and pipeline automation.
- Experience developing forecasting or anomaly detection models, ideally in financial or operational domains.
- Demonstrated ability to decompose complex business problems into structured analytical approaches.
- Proven success in cross-functional collaboration and stakeholder engagement.
- Prior exposure to finance or related industries is a plus.

*Accommodations may be available based on religious and/or medical conditions, or as required by applicable law.
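As referenced above, one common baseline for anomaly detection on a financial metric is a rolling z-score. A minimal sketch, assuming pandas and an entirely made-up revenue series (this is not Uber data or Uber's actual method):

```python
import pandas as pd

# Hypothetical daily revenue; in practice this would come from a warehouse query.
revenue = pd.Series(
    [100.0, 102.0, 98.0, 101.0, 99.0, 250.0, 100.0, 97.0],
    index=pd.date_range("2024-01-01", periods=8, freq="D"),
)

# Score each day against the trailing window, excluding the current point
# (shift(1)) so a spike cannot mask itself by inflating its own baseline.
mean = revenue.shift(1).rolling(window=5, min_periods=3).mean()
std = revenue.shift(1).rolling(window=5, min_periods=3).std()
z = (revenue - mean) / std

anomalies = revenue[z.abs() > 3]
print(anomalies)  # flags the 250.0 spike on 2024-01-06
```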

Posted 1 week ago

Apply

2.0 - 7.0 years

7 - 12 Lacs

Bengaluru

Work from Office

Not Applicable | Specialism: Data, Analytics & AI | Management Level: Senior Associate

Summary: In data analysis at PwC, you will focus on utilising advanced analytical techniques to extract insights from large datasets and drive data-driven decision-making. You will leverage skills in data manipulation, visualisation, and statistical modelling to support clients in solving complex business problems.

Responsibilities:
- B.Tech/M.Tech from a premier institute, with hands-on design/development experience in building and operating highly available services, ensuring stability, security and scalability
- 2+ years of software development experience, preferably in product companies
- Proficiency in the latest technologies like Web Components, React/Vue/Bootstrap, Redux, NodeJS, TypeScript, dynamic web applications, Redis, Memcached, Docker, Kafka, MySQL
- Deep understanding of the MVC framework and concepts like HTML, DOM, CSS, REST, AJAX, responsive design, Test-Driven Development
- Experience with AWS, with knowledge of AWS services like Autoscaling, ELB, ElastiCache, SQS, SNS, RDS, S3, Serverless Architecture, Lambda, Gateway, and Amazon DynamoDB, etc., or a similar technology stack
- Experience with Operations (AWS, Terraform, scalability, high availability & security) is a big plus
- Able to define APIs and integrate them into web applications using XML, JSON, SOAP/REST APIs
- Knowledge of software fundamentals, including design principles and analysis of algorithms, data structure design and implementation, documentation, and unit testing, and the acumen to apply them
- Ability to work proactively and independently with minimal supervision

Mandatory skill sets: Java, React, Node JS, HTML/CSS, XML, JSON, SOAP/REST APIs, AWS
Preferred skill sets: Git, CI/CD, Docker, Kubernetes, Unit Testing
Years of experience required: 4-8 years
Education qualification: BE/B.Tech/MBA/MCA
Degrees/Field of Study required: Master of Business Administration, Bachelor of Technology, Bachelor of Engineering

Required Skills: Full Stack Development
Additional Skills: Accepting Feedback, Active Listening, Algorithm Development, Alteryx (Automation Platform), Analytical Thinking, Analytic Research, Big Data, Business Data Analytics, Communication, Complex Data Analysis, Conducting Research, Creativity, Customer Analysis, Customer Needs Analysis, Dashboard Creation, Data Analysis, Data Analysis Software, Data Collection, Data-Driven Insights, Data Integration, Data Integrity, Data Mining, Data Modeling, Data Pipeline {+ 38 more}

Posted 1 week ago

Apply

3.0 - 7.0 years

12 - 13 Lacs

Bengaluru

Work from Office

Not Applicable | Specialism: Data, Analytics & AI | Management Level: Senior Associate

Summary: At PwC, our people in data and analytics engineering focus on leveraging advanced technologies and techniques to design and develop robust data solutions for clients. They play a crucial role in transforming raw data into actionable insights, enabling informed decision-making and driving business growth. In data engineering at PwC, you will focus on designing and building data infrastructure and systems to enable efficient data processing and analysis. You will be responsible for developing and implementing data pipelines, data integration, and data transformation solutions.

Job Title: GenAI S/W Engineer
Location: Bengaluru, Gurugram
Job Type: Full-Time
Position Title: Full Stack Lead Developer (Full Stack Developer with Gen AI & Cloud Expertise)
Experience: 3-7 Years

Job Overview: We are seeking a highly skilled and versatile polyglot Full Stack Developer with expertise in modern frontend and backend technologies, cloud-based solutions, AI/ML and Gen AI. The ideal candidate will have a strong foundation in full-stack development, cloud platforms (preferably Azure), and hands-on experience in Gen AI, AI and machine learning technologies.

Key Responsibilities:
- Develop and maintain web applications using Angular/React.js, .NET, and Python.
- Design, deploy, and optimize Azure-native PaaS and SaaS services, including but not limited to Function Apps, Service Bus, Storage Accounts, SQL Databases, Key Vaults, ADF, Databricks, and REST APIs with OpenAPI specifications.
- Implement security best practices for data in transit and at rest; authentication best practices: SSO, OAuth 2.0 and Auth0.
- Utilize Python for developing data processing and advanced AI/ML models using libraries like pandas, NumPy, scikit-learn, and LangChain, LlamaIndex, Azure OpenAI SDK.
- Leverage agentic frameworks like Crew AI, AutoGen, etc. Be well versed with RAG and agentic architecture (a minimal sketch of the retrieval step follows this listing).
- Be strong in design patterns: architectural, data, object-oriented.
- Leverage Azure serverless components to build highly scalable and efficient solutions.
- Create, integrate, and manage workflows using Power Platform, including Power Automate, Power Pages, and SharePoint.
- Apply expertise in machine learning, deep learning, and Generative AI to solve complex problems.

Primary Skills:
- Proficiency in React.js, .NET, and Python.
- Strong knowledge of Azure Cloud Services, including serverless architectures and data security.
- Experience with Python data analytics libraries: pandas, NumPy, scikit-learn, Matplotlib, Seaborn.
- Experience with Python Generative AI frameworks: LangChain, LlamaIndex, Crew AI, AutoGen.
- Familiarity with REST API design, Swagger documentation, and authentication best practices.

Secondary Skills:
- Experience with Power Platform tools such as Power Automate, Power Pages, and SharePoint integration.
- Knowledge of Power BI for data visualization (preferred).

Preferred Knowledge Areas (nice to have):
- In-depth understanding of machine learning, deep learning, and supervised and unsupervised algorithms.

Mandatory skill sets: AI, ML
Preferred skill sets: AI, ML
Years of experience required: 3-7 years
Education qualification: BE/BTECH, ME/MTECH, MBA, MCA
Degrees/Field of Study required: Bachelor of Technology, Master of Business Administration, Bachelor of Engineering

Required Skills: AI Programming
Additional Skills: Accepting Feedback, Active Listening, Agile Scalability, Amazon Web Services (AWS), Analytical Thinking, Apache Airflow, Apache Hadoop, Azure Data Factory, Communication, Creativity, Data Anonymization, Data Architecture, Database Administration, Database Management System (DBMS), Database Optimization, Database Security Best Practices, Databricks Unified Data Analytics Platform, Data Engineering, Data Engineering Platforms, Data Infrastructure, Data Integration, Data Lake, Data Modeling, Data Pipeline {+ 28 more}
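As referenced above, the heart of RAG is retrieving the most relevant context for a query by embedding similarity. A minimal, framework-free sketch assuming numpy only; the toy embed function stands in for a real embedding model (e.g., one served through Azure OpenAI), and all document text is hypothetical:

```python
import numpy as np

def embed(text: str) -> np.ndarray:
    # Toy stand-in for a real embedding model: a deterministic random unit vector.
    rng = np.random.default_rng(sum(map(ord, text)))
    v = rng.normal(size=64)
    return v / np.linalg.norm(v)

docs = ["refund policy details", "shipping time table", "account deletion steps"]
doc_vecs = np.stack([embed(d) for d in docs])

query = "how do I delete my account?"
scores = doc_vecs @ embed(query)   # cosine similarity, since vectors are unit-norm
context = docs[int(np.argmax(scores))]

# In a real RAG pipeline, `context` would be prepended to the LLM prompt.
print(context)
```

With a real embedding model, semantically similar texts land near each other, so the retrieved context is the document most relevant to the query; the toy embedding here only demonstrates the mechanics.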

Posted 1 week ago

Apply

4.0 - 9.0 years

12 - 16 Lacs

Hyderabad

Work from Office

Not Applicable | Specialism: Data, Analytics & AI | Management Level: Senior Associate

Summary: At PwC, our people in data and analytics engineering focus on leveraging advanced technologies and techniques to design and develop robust data solutions for clients. They play a crucial role in transforming raw data into actionable insights, enabling informed decision-making and driving business growth. In data engineering at PwC, you will focus on designing and building data infrastructure and systems to enable efficient data processing and analysis. You will be responsible for developing and implementing data pipelines, data integration, and data transformation solutions.

The primary purpose of this role is to translate business requirements and functional specifications into logical program designs, and to deliver dashboards, schemas, data pipelines, and software solutions. You will design, develop, and maintain scalable data pipelines to process and transform large volumes of structured and unstructured data; build and maintain ETL/ELT workflows for data ingestion from various sources (APIs, databases, files, cloud); and ensure data quality, integrity, and governance across the pipeline. This includes developing, configuring, or modifying data components within various complex business and/or enterprise application solutions in various computing environments.

We are currently seeking a Sr. Data Engineer who can perform data integration to build custom data pipelines, and manage data transformation, performance optimization, automation, data governance and data quality. (A minimal PySpark sketch follows this listing.)

Mandatory skill sets (must-have knowledge, skills and experience): GCP Dataproc, PySpark, Spark SQL, Dataflow, Apache Beam, Cloud Composer, BigQuery, API Management

Preferred skill sets (good-to-have knowledge, skills and experience):
- Experience in building data pipelines
- Experience with software lifecycle tools for CI/CD and version control systems such as Git
- Familiarity with Agile methodologies is a plus

Experience and Qualifications:
- Experience: 4 to 12 years
- Notice period: Immediate to 30 days
- Location: Hyderabad; 3 days/week work from the client office
- Education qualification: BE, B.Tech, ME, M.Tech, MBA, MCA (60% and above)
- Degrees/Field of Study required: Bachelor of Technology, Master of Business Administration, Bachelor of Engineering

Required Skills: Data Engineering
Additional Skills: Accepting Feedback, Active Listening, Agile Scalability, Amazon Web Services (AWS), Analytical Thinking, Apache Airflow, Apache Hadoop, Azure Data Factory, Communication, Creativity, Data Anonymization, Data Architecture, Database Administration, Database Management System (DBMS), Database Optimization, Database Security Best Practices, Databricks Unified Data Analytics Platform, Data Engineering, Data Engineering Platforms, Data Infrastructure, Data Integration, Data Lake, Data Modeling, Data Pipeline {+ 27 more}
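As a hedged illustration of the PySpark pipeline work named above: a minimal batch ETL sketch. The file paths, columns, and local execution are all hypothetical; on Dataproc the inputs would typically be GCS URIs:

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("pipeline-sketch").getOrCreate()

# Extract: ingest raw order events (hypothetical schema: created_at, amount, ...).
orders = spark.read.json("orders.json")

# Transform: derive a date column and aggregate to daily revenue.
daily = (
    orders
    .withColumn("order_date", F.to_date("created_at"))
    .groupBy("order_date")
    .agg(F.sum("amount").alias("revenue"), F.count("*").alias("order_count"))
)

# Load: write the serving table as Parquet.
daily.write.mode("overwrite").parquet("daily_revenue/")
spark.stop()
```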

Posted 1 week ago

Apply

15.0 - 20.0 years

16 - 20 Lacs

Gurugram, Bengaluru

Work from Office

Role Overview: We are seeking a highly skilled and experienced Data Manager to lead the development, governance, and utilization of enterprise data systems. This is a strategic leadership role focused on ensuring a seamless and secure flow of data across our platforms and teams, enabling timely and accurate access to actionable insights. The ideal candidate brings a strong foundation in data architecture, governance, and cloud-native systems, combined with hands-on experience managing cross-functional teams and implementing scalable, secure, and cost-efficient data solutions.

Your Objectives:
- Optimize data systems and infrastructure to support business intelligence and analytics.
- Implement best-in-class data governance, quality, and security frameworks.
- Lead a team of data and software engineers to develop, scale, and maintain cloud-native platforms.
- Support data-driven decision-making across the enterprise.

Key Responsibilities:
- Develop and enforce policies for effective and ethical data management.
- Design and implement secure, efficient processes for data collection, storage, analysis, and sharing.
- Monitor and enhance data quality, consistency, and lineage.
- Oversee integration of data from multiple systems and business units.
- Partner with internal stakeholders to support data needs, dashboards, and ad hoc reporting.
- Maintain compliance with regulatory frameworks such as GDPR and HIPAA.
- Troubleshoot data-related issues and implement sustainable resolutions.
- Ensure digital data systems are secure from breaches and data loss.
- Evaluate and recommend new data tools, architectures, and technologies.
- Support documentation using Atlassian tools and develop architectural diagrams.
- Automate cloud operations using infrastructure as code (e.g., Terraform) and DevOps practices.
- Facilitate inter-team communication to improve data infrastructure and eliminate silos.

Leadership & Strategic Duties:
- Manage, mentor, and grow a high-performing data engineering team.
- Lead cross-functional collaboration with backend engineers, architects, and product teams.
- Facilitate partnerships with cloud providers (e.g., AWS) to leverage cutting-edge technologies.
- Conduct architecture reviews and PR reviews, and drive engineering best practices.
- Collaborate with business, product, legal, and compliance teams to align data operations with enterprise goals.

Required Qualifications:
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
- 10-15 years of experience in enterprise data architecture, governance, or data platform development.
- Expertise in SQL, data modelling, and modern data tools (e.g., Snowflake, dbt, Fivetran).
- Deep understanding of AWS cloud services (Lambda, ECS, RDS, DynamoDB, S3, SQS).
- Proficiency in scripting (Python, Bash) and CI/CD pipelines.
- Demonstrated experience with ETL/ELT orchestration (e.g., Airflow, Prefect).
- Strong understanding of DevOps, Terraform, containerization, and serverless computing.
- Solid grasp of data security, compliance, and regulatory requirements.

Preferred Experience (Healthcare-Focused):
- Experience working in healthcare analytics or data environments.
- Familiarity with EHR/EMR systems such as Epic, Cerner, Meditech, or Allscripts.
- Deep understanding of healthcare data privacy, patient information handling, and clinical workflows.

Soft Skills & Team Fit:
- Strong leadership and mentoring mindset.
- Ability to manage ambiguity and work effectively in dynamic environments.
- Excellent verbal and written communication skills with technical and non-technical teams.
- Passionate about people development, knowledge sharing, and continuous learning.
- Resilient, empathetic, and strategically focused.

Posted 1 week ago

Apply

7.0 - 12.0 years

11 - 16 Lacs

Gurugram, Bengaluru

Work from Office

This is a hands-on data platform engineering role that places significant emphasis on consultative data engineering engagements with a wide range of customer stakeholders: business owners, business analytics, data engineering teams, application development, end users and management teams.

You Will:
- Design and build resilient and efficient data pipelines for batch and real-time streaming.
- Architect and design data infrastructure on cloud using Infrastructure-as-Code tools.
- Collaborate with product managers, software engineers, data analysts, and data scientists to build scalable and data-driven platforms and tools.
- Provide technical product expertise, advise on deployment architectures, and handle in-depth technical questions around data infrastructure, PaaS services, design patterns and implementation approaches.
- Collaborate with enterprise architects, data architects, ETL developers and engineers, data scientists, and information designers to lead the identification and definition of required data structures, formats, pipelines, metadata, and workload orchestration capabilities.
- Address aspects such as data privacy and security, data ingestion and processing, data storage and compute, analytical and operational consumption, data modeling, data virtualization, self-service data preparation and analytics, AI enablement, and API integrations.
- Lead a team of engineers to deliver impactful results at scale.
- Execute projects with an Agile mindset.
- Build software frameworks to solve data problems at scale.

Technical Requirements:
- 7+ years of data engineering experience leading implementations of large-scale lakehouses on Databricks, Snowflake, or Synapse. Prior experience using DBT and Power BI will be a plus.
- 3+ years' experience architecting solutions for developing data pipelines from structured and unstructured sources for batch and real-time workloads.
- Extensive experience with Azure data services (Databricks, Synapse, ADF) and related Azure infrastructure services like firewall, storage, key vault, etc. is required.
- Strong programming/scripting experience using SQL, Python and Spark.
- Strong data modeling and data lakehouse concepts.
- Knowledge of software configuration management environments and tools such as JIRA, Git, Jenkins, TFS, Shell, PowerShell, Bitbucket.
- Experience with Agile development methods in data-oriented projects.

Other Requirements:
- Highly motivated self-starter and team player with demonstrated success in prior roles.
- Track record of success working through technical challenges within enterprise organizations.
- Ability to prioritize deals, training, and initiatives through highly effective time management.
- Excellent problem-solving, analytical, presentation, and whiteboarding skills.
- Track record of success dealing with ambiguity (internal and external) and working collaboratively with other departments and organizations to solve challenging problems.
- Strong knowledge of technology and industry trends that affect data analytics decisions for enterprise organizations.
- Certifications in Azure Data Engineering and related technologies.

Posted 1 week ago

Apply

4.0 - 8.0 years

8 - 13 Lacs

Kochi, Ernakulam

Hybrid

Job Summary: We are looking for an experienced and detail-oriented Power BI Developer with 4-5 years of industry experience to join our analytics and reporting team. The ideal candidate will have a strong background in business intelligence, data visualization, and analytics, along with excellent communication skills and the ability to adapt quickly to dynamic business requirements.

Key Responsibilities:
- Design, develop, and maintain interactive Power BI dashboards and reports.
- Translate business needs into technical specifications and deliver actionable insights.
- Connect to various data sources (SQL Server, Excel, cloud services, etc.) and perform data transformations using Power Query and DAX (a rough pandas analogue of this kind of workflow follows this listing).
- Collaborate with business stakeholders, analysts, and data engineers to gather requirements and understand key performance metrics.
- Optimize data models and reports for performance and scalability.
- Ensure data accuracy, consistency, and security across all reporting solutions.
- Support ad hoc data analysis and create visual storytelling using data.
- Stay updated with the latest Power BI features, tools, and best practices.

Required Skills:
- 4-5 years of hands-on experience with Power BI (Power BI Desktop, Power BI Service).
- Strong proficiency in DAX and Power Query (M language).
- Solid experience with data modeling, ETL processes, and building enterprise-level dashboards.
- Strong SQL skills and experience working with relational databases (e.g., SQL Server, Azure SQL, etc.).
- Understanding of data warehousing concepts and star/snowflake schema design.
- Familiarity with Power Platform (Power Apps, Power Automate) is a plus.
- Basic knowledge of Azure Data Services (e.g., Azure Data Factory, Azure Synapse) is advantageous.

Soft Skills:
- Excellent communication skills; able to clearly explain technical concepts to non-technical stakeholders.
- Strong analytical thinking and problem-solving abilities.
- Ability to adapt quickly to changing priorities and business needs.
- Self-motivated with a proactive attitude and strong ownership mindset.
- Effective time management and organizational skills.

Preferred Qualifications:
- Bachelor's degree in Computer Science, Information Technology, Data Science, or a related field.
- Microsoft Certification in Power BI or related technologies (optional but a plus).
- Experience working in Agile environments.
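Power Query and DAX are Power BI's own languages; as a rough, hedged analogue of the shape-then-measure workflow described above, here is a pandas sketch (pandas is a stand-in rather than a Power BI API, and the table and column names are hypothetical):

```python
import pandas as pd

sales = pd.DataFrame({
    "region": ["north", "north", "south", "south"],
    "units": [10, 5, 0, 8],
    "unit_price": [2.0, 2.0, 3.0, 3.0],
})

# Power Query-style shaping: filter bad rows, derive a revenue column.
clean = sales[sales["units"] > 0].assign(
    revenue=lambda df: df["units"] * df["unit_price"]
)

# DAX-measure-style aggregate: total revenue by region, as a report visual would show.
print(clean.groupby("region")["revenue"].sum())
```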

Posted 1 week ago

Apply

5.0 - 8.0 years

10 - 20 Lacs

Bengaluru

Work from Office

We are looking for a senior data engineer with 5-8 years of experience.

Posted 1 week ago

Apply

6.0 - 10.0 years

13 - 17 Lacs

Chennai

Work from Office

Capgemini Invent: Capgemini Invent is the digital innovation, consulting and transformation brand of the Capgemini Group, a global business line that combines market-leading expertise in strategy, technology, data science and creative design to help CxOs envision and build what's next for their businesses.

Your role:
- You act as a contact person for our customers and advise them on data-driven projects.
- You are responsible for architecture topics and solution scenarios in the areas of Cloud Data Analytics Platform, Data Engineering, Analytics and Reporting.
- Experience in Cloud and Big Data architecture.
- Responsibility for designing viable architectures based on Microsoft Azure, AWS, Snowflake, Google (or similar) and implementing analytics.
- Experience in DevOps, Infrastructure as Code, DataOps, MLOps.
- Experience in business development (as well as your support in the proposal process).
- Data warehousing, data modelling and data integration for enterprise data environments.
- Experience in the design of large-scale ETL solutions integrating multiple/heterogeneous systems.
- Experience in data analysis, modelling (logical and physical data models) and design specific to a data warehouse/Business Intelligence environment (normalized and multi-dimensional modelling).
- Must be detail-oriented, highly motivated, and able to work independently with minimal direction.
- Excellent written, oral and interpersonal communication skills, with the ability to communicate design solutions to both technical and non-technical audiences.
- Ideally, experience in agile methods such as SAFe, Scrum, etc.
- Ideally, experience with programming languages like Python, JavaScript, Java/Scala, etc.

Your Profile:
- Provides data services for enterprise information strategy solutions; works with business solutions leaders and teams to collect and translate information requirements into data to develop data-centric solutions.
- Design and develop modern enterprise data-centric solutions (e.g. DWH, Data Lake, Data Lakehouse).
- Responsible for designing data governance solutions.

What you will love about working here: We recognize the significance of flexible work arrangements to provide support. Be it remote work or flexible work hours, you will get an environment to maintain a healthy work-life balance. At the heart of our mission is your career growth. Our array of career growth programs and diverse professions are crafted to support you in exploring a world of opportunities. Equip yourself with valuable certifications in the latest technologies such as Generative AI.

About Capgemini: Capgemini is a global business and technology transformation partner, helping organizations to accelerate their dual transition to a digital and sustainable world, while creating tangible impact for enterprises and society. It is a responsible and diverse group of 340,000 team members in more than 50 countries. With its strong over 55-year heritage, Capgemini is trusted by its clients to unlock the value of technology to address the entire breadth of their business needs. It delivers end-to-end services and solutions leveraging strengths from strategy and design to engineering, all fueled by its market-leading capabilities in AI, cloud and data, combined with its deep industry expertise and partner ecosystem. The Group reported 2023 global revenues of €22.5 billion.

Posted 1 week ago

Apply

2.0 - 5.0 years

4 - 8 Lacs

Chennai

Work from Office

About The Role: Business Advisors shape the vision with the client, understand the needs of the users/stakeholders, carry out an elicitation of processes, data and capabilities, and derive the target processes and the business requirements for the current and future solution.

Grade Specific: Conducts appropriate meetings/workshops to elicit, understand and document the business requirements using their domain expertise. In addition, may also produce process and data models of the current and/or future state.

Skills (competencies): Abstract Thinking, Active Listening, Agile (Software Development Framework), Analytical Thinking, Backlog Grooming, Business Architecture Modeling, Business Process Modeling (e.g. BPMN), Change Management, Coaching, Collaboration, Commercial Acumen, Conceptual Data Modeling, Conflict Management, Confluence, Critical Thinking, CxO Conversations, Data Analysis, Data Requirements Management, Decision-Making, Emotional Intelligence, Enterprise Architecture Modelling, Facilitation, Functional IT Architecture Modelling, Giving Feedback, Google Cloud Platform (GCP), Influencing, Innovation, Jira, Mediation, Mentoring, Microsoft Office, Motivation, Negotiation, Networking, Power BI, Presentation Skills, Prioritization, Problem Solving, Project Governance, Project Management, Project Planning, Qlik, Relationship-Building, Requirements Gathering, Risk Management, Scope Management, SQL, Stakeholder Management, Story Mapping, Storytelling, Strategic Management, Strategic Thinking, SWOT Analysis, Systems Requirement Analysis (or Management), Tableau, Trusted Advisor, UI Design/Wireframing, UML, User Journey, User Research, Verbal Communication, Written Communication

Posted 1 week ago

Apply

4.0 - 9.0 years

3 - 7 Lacs

Chennai

Work from Office

About The Role:
- Interpret business requirements and translate them into technical specifications.
- Design, develop, and maintain Qlik Sense dashboards, reports, and data visualizations.
- Perform data extraction, transformation, and loading (ETL) from various sources.
- Create and manage QVD files and implement data modeling best practices.
- Ensure data accuracy and consistency through validation and testing.
- Optimize Qlik Sense applications for performance and scalability.
- Collaborate with business analysts, data engineers, and stakeholders.
- Provide technical support and troubleshoot issues in Qlik Sense applications.
- Document development processes, data models, and user guides.

Requirements:
- 4+ years of experience in Qlik Sense development and dashboarding.
- Strong knowledge of data modeling, set analysis, and scripting in Qlik.
- Proficiency in SQL and experience with RDBMS like MS SQL Server or Oracle.
- Familiarity with Qlik Sense integration with web technologies and APIs.
- Understanding of BI concepts and data warehousing principles.
- Excellent problem-solving and communication skills.
- Qlik Sense certification is a plus.

Grade Specific: Focus on Industrial Operations Engineering. Develops competency in own area of expertise. Shares expertise and provides guidance and support to others. Interprets clients' needs. Completes own role independently or with minimum supervision. Identifies problems and relevant issues in straightforward situations and generates solutions. Contributes to teamwork and interacts with customers.

Posted 1 week ago

Apply

5.0 - 9.0 years

8 - 13 Lacs

Hyderabad

Work from Office

About The Role / Role Overview: Develop efficient SQL queries and maintain views, models, and data structures across federated and transactional databases to support analytics and reporting. Core tools: SQL (advanced), Python for data exploration and scripting, and shell scripting for lightweight automation.

Key Responsibilities:
- Write complex SQL queries for data extraction and transformations.
- Build and maintain views, materialized views, and data models.
- Enable efficient federated queries and optimize joins across databases (a minimal DuckDB sketch follows this listing).
- Support performance tuning, indexing, and query optimization efforts.

Primary Skills:
- Expertise in MS SQL Server / Oracle DB / PostgreSQL, columnar DBs like DuckDB, and federated data access.
- Good understanding of the Apache Arrow columnar data format, Flight SQL, and Apache Calcite.

Secondary Skills:
- Experience with data modelling, ER diagrams, and schema design.
- Familiarity with reporting-layer backends (e.g., Power BI datasets).
- Familiarity with utility operations and power distribution is preferred.
- Experience with cloud-hosted databases is preferred.
- Exposure to data lakes in cloud ecosystems is a plus.

Optional:
- Familiarity with Grid CIM (Common Information Model; IEC 61970, IEC 61968).
- Familiarity with GE ADMS DNOM (Distribution Network Object Model) and GE GridOS Data Fabric.
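As referenced above, a minimal sketch of joining across heterogeneous sources with DuckDB, assuming the duckdb and pandas Python packages; the meter/feeder schema is hypothetical (loosely inspired by the posting's utility context), and the registered DataFrame stands in for a second federated source:

```python
import duckdb
import pandas as pd

# One "source" is a pandas DataFrame...
meters = pd.DataFrame({"meter_id": [1, 2, 3], "feeder": ["F1", "F1", "F2"]})

con = duckdb.connect()            # in-memory analytical database
con.register("meters", meters)    # expose the DataFrame as a queryable view

# ...and one is a native DuckDB table.
con.execute("CREATE TABLE readings(meter_id INTEGER, kwh DOUBLE)")
con.execute("INSERT INTO readings VALUES (1, 10.5), (2, 7.2), (3, 12.0)")

# A single SQL join spans both sources.
result = con.execute("""
    SELECT m.feeder, SUM(r.kwh) AS total_kwh
    FROM readings AS r
    JOIN meters AS m USING (meter_id)
    GROUP BY m.feeder
    ORDER BY m.feeder
""").df()
print(result)
```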

Posted 1 week ago

Apply

6.0 - 11.0 years

9 - 14 Lacs

Chennai

Work from Office

Senior Application Developer - Salesforce

We're the obstacle overcomers, the problem get-arounders. From figuring it out to getting it done, our innovative culture demands "yes and how!" We are UPS. We are the United Problem Solvers.

About Applications Development at UPS Technology: Our Application Development teams use their expertise in programming languages and software design to develop next-generation technology. They are responsible for developing the applications which track and move up to 38 million packages in a single day (4.7 billion annually). This team works closely with all of our customers to build innovative technologies that are customized to drive our business and provide the ultimate customer experience. As a member of the applications development family, you will help UPS grow and provide valuable services across the globe.

About this role: The Senior Application Developer - Salesforce will play a key role in analyzing business requirements and translating them into Salesforce-specific solutions using a combination of out-of-the-box features, configurations, and custom development. This role demands a strong understanding of Sales Cloud, expertise in Salesforce declarative and programmatic approaches, and experience in integrating Salesforce with external systems (legacy and cloud). You will work closely with product owners, architects, developers, and QA teams across geographies to build scalable, performant, and secure solutions. You are expected to contribute to the entire lifecycle of application development, from design and coding to deployment and ongoing enhancements, while also ensuring adherence to best practices, development standards, and security guidelines.

Additional details:
- Analyze business requirements and design end-to-end scalable solutions using Salesforce.
- Develop robust, reusable components using Apex, Lightning Web Components (LWC), Aura, Visualforce, and Velocity.
- Implement automation using Salesforce tools such as Flows, Process Builder, Workflow Rules, Approval Processes, and Formulas.
- Design and build integrations between Salesforce and third-party systems using REST/SOAP APIs, JSON/XML, OAuth, and Single Sign-On (a minimal REST sketch follows this listing).
- Ensure development efforts align with platform best practices and Salesforce governor limits.
- Migrate metadata and changes between environments using tools like Flosum, Jenkins, and ANT.
- Perform data operations and transformations using Data Loader and other native tools.
- Work collaboratively with the onshore team during design discussions, development tasks, and UAT support.
- Author and maintain technical documentation, including design specs and deployment runbooks.
- Support defect resolution, performance tuning, and production issue triaging.

Minimum Qualifications:
- Minimum of 6+ years of Salesforce development experience, with at least 4+ years of hands-on experience working with Lightning Components (Aura and LWC).
- Strong expertise in Sales Cloud implementation and business processes (e.g., Lead to Opportunity, Quote, Task Management, Sales Path, Record Types, Custom Buttons, Validation Rules, Forecasting).
- Excellent skills in Apex classes, Apex triggers, SOQL/SOSL, Lightning Web Components (LWC), Aura Components, and Visualforce.
- Deep understanding of Salesforce configuration (objects, page layouts, flows, profiles, permission sets, etc.).
- Experience in integrating Salesforce with legacy and external systems using SOAP and REST APIs, including JSON and XML handling.
- Hands-on experience working with 3rd-party components from AppExchange, content management, and multilingual support using Salesforce Translations.
- Familiarity with platform deployment tools like Jenkins, ANT, and Flosum.
- Good understanding of data modeling, object-oriented principles, JavaScript, and HTML.
- Knowledge of Agile development methodologies and collaboration tools like Azure DevOps (ADO).
- Strong debugging and analytical skills for issue resolution.
- Excellent communication and documentation skills.

Nice to Have:
- Experience working with Azure DevOps (ADO).
- Experience working with Flosum.

Certifications (Mandatory):
- Salesforce Platform Developer I
- Salesforce Platform Developer II
- Salesforce Platform App Builder

Soft Skills:
- Strong collaboration and interpersonal skills, with the ability to work effectively in a distributed team environment.
- Self-driven, detail-oriented, and passionate about learning new technologies.
- Ability to multi-task and manage priorities in a dynamic Agile environment.
- Strong problem-solving skills and a proactive approach to issue resolution.
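As a hedged sketch of the REST integration pattern named above, calling Salesforce's standard sObject endpoint from an external Python service; the instance URL and token are placeholders, and a real integration would obtain the token via an OAuth 2.0 flow:

```python
import requests

INSTANCE_URL = "https://example.my.salesforce.com"  # hypothetical org
ACCESS_TOKEN = "<OAuth access token>"               # obtained via OAuth 2.0

headers = {
    "Authorization": f"Bearer {ACCESS_TOKEN}",
    "Content-Type": "application/json",
}

# Create a Lead record via the sObject REST API (API version is illustrative).
resp = requests.post(
    f"{INSTANCE_URL}/services/data/v60.0/sobjects/Lead",
    headers=headers,
    json={"LastName": "Doe", "Company": "Acme"},
)
resp.raise_for_status()
print(resp.json()["id"])  # ID of the newly created Lead
```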

Posted 1 week ago

Apply

11.0 - 14.0 years

35 - 40 Lacs

Chennai

Work from Office

This position influences the development and implementation of Information Technology (I.T.) strategy, initiatives, and governing policies. He/She assembles detailed reviews of the enterprise, documents capabilities, and conceives approaches to aligning technical solutions with business needs. This position assists in defining the direction for projects and solution architecture, and plans and champions the execution of broad initiatives aimed at delivering value to internal and external stakeholders. He/She leverages data, technical, and business knowledge to drive the development of capability frameworks at portfolio and enterprise levels. This position is involved throughout the project life cycle, with emphasis on the initiation, feasibility, and analysis phases.

Responsibilities:
Identify and design the API layer for service registry, management, throttling, routing, etc.
Design security and authentication features in compliance with company policies
Design systems with the right mix of monitoring, alerting, and tracing
Define templates for development teams and perform regular code reviews to ensure best practices are followed
Design the integration layer using a combination of approaches (services, messaging, etc.) to support downstream data flow and work in concert with existing systems
Develop services using technologies such as, but not limited to, Spring Boot and Node
Define a strategy for Test-Driven Development to ensure requirements coverage
Prepare documentation where necessary, including training, process flows, system structure, etc.

Technical Skills:
API Architecture Understanding: Familiarity with RESTful APIs, GraphQL, SOAP, and other API types
OAS (OpenAPI Specification) Expertise: Ability to read, write, and evaluate API specifications
Knowledge of Protocols: Understanding of HTTP, HTTPS, and possibly other communication protocols
Data Modeling: Proficiency in understanding and evaluating data schemas, such as JSON Schema, YAML, or SOAP/XML Schema
Event-Driven Architecture: Understanding of message queues, event streams, and other event-driven patterns
Security Protocols: Understanding of OAuth, JWT, API keys, and other authentication and authorization mechanisms
Rate Limiting and Throttling: Knowledge of how these policies affect API usage and performance
Development Tools: Familiarity with tools like VSCode, Spectral, Stoplight Studio, or other IDEs and API documentation and testing tools
Significant understanding of microservice patterns and concepts, and their application to application design and business solutions

Soft Skills:
Communication: Ability to articulate complex technical scenarios in a straightforward manner to stakeholders at different levels
Critical Thinking: Evaluation of design decisions, trade-offs, and potential future challenges
Attention to Detail: Especially crucial for analyzing API documentation, error messages, and data models
Negotiation Skills: Governance reviews often involve negotiations on standards, practices, or resource allocation
Teamwork: Ability to collaborate with architects, developers, QA, and other roles involved in API development
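As a hedged illustration of the rate limiting and throttling concept this listing calls out (not taken from the posting itself), here is a minimal token-bucket limiter in Python. The class name and parameters are hypothetical choices for the sketch, not the API of any specific gateway product.

```python
import time

class TokenBucket:
    """Minimal token-bucket rate limiter: `rate` tokens are added per
    second up to `capacity`; each allowed request consumes one token."""

    def __init__(self, rate: float, capacity: int):
        self.rate = rate              # refill rate, tokens per second
        self.capacity = capacity      # maximum burst size
        self.tokens = float(capacity)
        self.last = time.monotonic()

    def allow(self) -> bool:
        now = time.monotonic()
        # Refill in proportion to elapsed time, capped at capacity.
        self.tokens = min(self.capacity,
                          self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False

# Hypothetical usage: allow bursts of up to 10 requests, refilling at
# 5 tokens/second; the 11th and 12th immediate calls are rejected.
bucket = TokenBucket(rate=5, capacity=10)
for i in range(12):
    print(i, bucket.allow())
```

The same policy shape (burst capacity plus steady refill rate) is what API gateways express through their throttling configuration, which is why it affects both usage and perceived performance.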

Posted 1 week ago

Apply

3.0 - 5.0 years

6 - 10 Lacs

Chennai

Work from Office

We are seeking a highly motivated and skilled IBM DataStage Developer to join our dynamic team. In this role, you will be responsible for designing, developing, implementing, and maintaining robust ETL (Extract, Transform, Load) solutions using IBM InfoSphere DataStage. You will play a crucial role in ensuring the quality, integrity, and timely delivery of data to support our business intelligence and analytics initiatives.

Responsibilities:
Design, develop, and implement ETL processes using IBM InfoSphere DataStage to extract data from various source systems, transform it according to business rules, and load it into target data warehouses and data marts.
Develop and maintain DataStage jobs, sequences, and workflows, ensuring optimal performance and scalability.
Troubleshoot and resolve data integration issues, working closely with data analysts and other stakeholders.
Write and execute complex SQL queries to analyze data, profile data sources, and validate ETL processes.
Collaborate with business analysts and data architects to understand data requirements and translate them into technical specifications.
Participate in the full software development lifecycle, including requirements gathering, design, development, testing, deployment, and documentation.
Ensure adherence to data governance policies and standards.
Monitor ETL processes and proactively identify and address potential performance bottlenecks.
Contribute to the development and maintenance of technical documentation, including design specifications, mapping documents, and operational procedures.

Required Skills and Experience:
Minimum of 3 to 5 years of hands-on experience in designing, developing, and implementing ETL solutions using IBM InfoSphere DataStage.
Strong understanding of ETL concepts, methodologies, and best practices.
Proficiency in writing complex SQL queries and working with relational databases, preferably Oracle.
Experience with data modeling concepts (e.g., dimensional modeling, star schema, snowflake schema).
Familiarity with Linux operating systems and basic Linux commands is a plus.
Exposure to OBIEE (Oracle Business Intelligence Enterprise Edition) or other business intelligence tools is a plus.
Excellent written and verbal communication skills, with the ability to effectively communicate technical concepts to both technical and non-technical audiences.
Excellent analytical and technical skills, with strong attention to detail and problem-solving abilities.
Bachelor's degree in Computer Science or equivalent education and work experience.

Preferred Skills:
IBM Certified Solution Developer - InfoSphere DataStage certification.
Experience with other ETL tools or data integration technologies.
Knowledge of scripting languages such as shell scripting or Python.
Experience with cloud-based data warehousing solutions (e.g., Google BigQuery, Azure Synapse).
Familiarity with data quality tools and processes.
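For illustration only, and not part of the job description itself: the extract-transform-load flow that a DataStage job implements graphically can be sketched in a few lines of Python. Every table name, column, and business rule below is invented for the example, and SQLite stands in for the real source systems and warehouse.

```python
import sqlite3

# Hypothetical in-memory databases stand in for the source system
# and the target data warehouse.
source = sqlite3.connect(":memory:")
target = sqlite3.connect(":memory:")

# --- Extract: pull raw rows from the source system.
source.execute("CREATE TABLE orders (id INTEGER, amount TEXT, region TEXT)")
source.executemany(
    "INSERT INTO orders VALUES (?, ?, ?)",
    [(1, "100.50", "south"), (2, "20", "NORTH"), (3, None, "south")])
rows = source.execute("SELECT id, amount, region FROM orders").fetchall()

# --- Transform: apply business rules (cast types, normalize case)
# and reject rows that fail a simple data-quality check.
clean = []
for order_id, amount, region in rows:
    if amount is None:        # quality rule: amount must be present
        continue
    clean.append((order_id, float(amount), region.lower()))

# --- Load: write the conformed rows into the target table.
target.execute("CREATE TABLE fact_orders (id INTEGER, amount REAL, region TEXT)")
target.executemany("INSERT INTO fact_orders VALUES (?, ?, ?)", clean)
target.commit()

# Validate the load with a SQL query, as the responsibilities above suggest.
print(target.execute("SELECT COUNT(*), SUM(amount) FROM fact_orders").fetchone())
```

In a real DataStage job the same three stages appear as source connectors, transformer stages, and target connectors, with the validation query run against the warehouse after the load.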

Posted 1 week ago

Apply

5.0 - 10.0 years

13 - 14 Lacs

Chennai

Work from Office

Perform systems analysis and design. Design and develop moderate to complex applications. Develop and ensure creation of application documents. Define and produce integration builds. Monitor emerging technology trends.

Primary Skills:
Power BI (proficient)
Skilled in SQL (T-SQL; PL/SQL is a plus)
Microsoft SSRS (nice to have, to translate existing reports)
Data analysis background (may lessen the interaction required with the Business Unit)
Data modeling - building relationships, calculated columns, and star/snowflake schemas in Power BI
Report design - crafting interactive visuals, drill-throughs, bookmarks, and tooltips
Understanding business needs (business analysis skills) - translating vague requests into meaningful metrics
Performance tuning - optimizing dataset sizes for smoother experiences
Understanding of data structures, programming logic, and design
Understanding of application design patterns
Excellent written and verbal communication skills
Excellent attention to detail

Qualifications:
5+ years of experience
Bachelor's Degree or international equivalent
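To make the star-schema requirement concrete, here is a small hedged sketch using Python's standard-library sqlite3 module. The dim_date, dim_product, and fact_sales tables are invented for illustration; they mirror the fact-and-dimension relationships that a Power BI model encodes visually.

```python
import sqlite3

con = sqlite3.connect(":memory:")
cur = con.cursor()

# Dimension tables hold descriptive attributes...
cur.execute("CREATE TABLE dim_date (date_key INTEGER PRIMARY KEY, month TEXT)")
cur.execute("CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, name TEXT)")
# ...while the fact table holds measures plus foreign keys to each dimension,
# giving the schema its star shape.
cur.execute("""CREATE TABLE fact_sales (
    date_key INTEGER REFERENCES dim_date(date_key),
    product_key INTEGER REFERENCES dim_product(product_key),
    revenue REAL)""")

cur.executemany("INSERT INTO dim_date VALUES (?, ?)", [(1, "Jan"), (2, "Feb")])
cur.executemany("INSERT INTO dim_product VALUES (?, ?)", [(10, "Widget"), (11, "Gadget")])
cur.executemany("INSERT INTO fact_sales VALUES (?, ?, ?)",
                [(1, 10, 120.0), (1, 11, 80.0), (2, 10, 95.0)])

# A measure such as "revenue by month" is a join from fact to dimension,
# the same relationship a Power BI data model defines between tables.
for row in cur.execute("""
    SELECT d.month, SUM(f.revenue)
    FROM fact_sales f JOIN dim_date d ON f.date_key = d.date_key
    GROUP BY d.month"""):
    print(row)
```

Keeping measures in a narrow fact table and attributes in small dimensions is also what enables the dataset-size and performance tuning the listing mentions.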

Posted 1 week ago

Apply

5.0 - 10.0 years

50 - 55 Lacs

Bengaluru

Work from Office

You have the opportunity to unleash your full potential at a world-renowned company and take the lead in shaping the future of technology.

As a Senior Manager of Data Engineering at JPMorgan Chase within Asset and Wealth Management, you serve in a leadership role by providing technical coaching and advisory for multiple technical teams, as well as anticipating the needs and potential dependencies of other data users within the firm. As an expert in your field, your insights influence budget and technical considerations to advance operational efficiencies and functionalities.

Job responsibilities:
Architects the design of complex data solutions that meet diverse business needs and customer requirements, and guides the evolution of logical and physical data models to support emerging business use cases and technological advancements.
Builds and manages end-to-end cloud-native data pipelines and analytical systems in AWS from the ground up, leveraging hands-on expertise with AWS components, providing architectural direction, translating business issues into specific requirements, and identifying appropriate data to support solutions.
Works across the Service Delivery Lifecycle on engineering major/minor enhancements and ongoing maintenance of existing applications, and conducts feasibility studies, capacity planning, and process redesign/re-engineering of complex integration solutions.
Helps others build code to extract raw data, coaches the team on techniques to validate its quality, and applies deep data knowledge to ensure the correct data is ingested across the pipeline.
Guides the development of data tools used to transform, manage, and access data, and advises the team on writing and validating code to test the storage and availability of data platforms for resilience.
Oversees the implementation of performance monitoring protocols across data pipelines and data accessibility within assigned pipelines, coaches the team on building visualizations and aggregations to monitor pipeline health, and implements solutions and self-healing processes that minimize points of failure across multiple product features.
Prepares team members for meetings with appropriate stakeholders across teams, addresses concerns around data requirements by providing guidance on feature estimation, and leverages expertise to mentor and enhance team capabilities.
Collects, refines, and transforms data accurately from diverse sources using advanced SQL queries and Alteryx expertise.
Designs, develops, and manages dynamic data visualization solutions such as Tableau and ThoughtSpot, providing actionable insights for informed decision-making.
Publishes and manages dashboards and reports with optimized scheduling, addressing data discrepancies and performance issues proactively.
Defines critical data scope within products, documenting, classifying, and enriching data with comprehensive metadata for effective use.

Required qualifications, capabilities, and skills:
Formal training or certification on software engineering concepts and 5+ years of applied experience. In addition, 2+ years of experience leading technologists to manage and solve complex technical items within your domain of expertise.
Extensive experience in managing the full lifecycle of data, from collection and storage to analysis and reporting.
Experienced in SQL, with the ability to design and optimize complex queries and database structures.
Deep understanding of NoSQL databases and their strategic applications within the industry.
Proven track record in statistical data analysis and the ability to derive actionable insights from complex data sets.
Experience in leading large-scale data engineering projects and implementing custom solutions to meet business objectives.
Experience in analytics/business intelligence to deliver data-driven insights and strategic solutions, with mandatory hands-on expertise in Alteryx, SQL, and Tableau for advanced analytics, complex data manipulations, and crafting advanced data visualizations.
Demonstrated ability to build and manage cloud-native data pipelines in AWS, with hands-on knowledge of AWS components.

Preferred qualifications, capabilities, and skills:
Proficient in Python to effectively meet future and evolving data needs, while adeptly tackling complex data logic challenges and designing sophisticated workflows for problem-solving.
Drive projects efficiently using extensive experience with tools like JIRA and Confluence, demonstrating agility and adaptability to transition swiftly between projects and meet evolving demands.
Exhibit exceptional written and verbal communication skills to articulate complex ideas clearly and persuasively to diverse audiences, with assertive communication to set and manage stakeholder expectations under tight deadlines.
Extensive experience with major cloud platforms such as AWS, Azure, and Snowflake.
Proficiency in ETL tools, including PySpark, Snowflake, and other data processing frameworks.
Strong understanding of Data Mesh, data modeling, and domain-driven design principles.
Experience with version control systems and tools, including Git, GitHub, GitLab, and Bitbucket.

Posted 1 week ago

Apply

10.0 - 15.0 years

50 - 55 Lacs

Hyderabad

Work from Office

As a Senior Manager, you'll lead a team of talented engineers in designing and building trusted, scalable systems that capture, process, and surface rich product signals for use across analytics, AI/ML, and customer-facing features. You'll guide architectural decisions, drive cross-functional alignment, and shape strategy around semantic layers, knowledge graphs, and metrics frameworks that help teams publish and consume meaningful insights with ease. We're looking for a strategic, systems-minded leader who thrives in ambiguity, excels at cross-org collaboration, and has a strong technical foundation to drive business and product impact.

What You'll Do:
Lead and grow a high-performing engineering team focused on batch and streaming data pipelines using technologies like Spark, Trino, Flink, and DBT
Define and drive the vision for intuitive, scalable metrics frameworks and a robust semantic signal layer
Partner closely with product, analytics, and engineering stakeholders to align schemas, models, and data usage patterns across the org
Set engineering direction and best practices for building reliable, observable, and testable data systems
Mentor and guide engineers in both technical execution and career development
Contribute to long-term strategy around data governance, AI-readiness, and intelligent system design
Serve as a thought leader and connector across domains to ensure data products deliver clear, trusted value

What We're Looking For:
10+ years of experience in data engineering or backend systems, with at least 2+ years in technical leadership or management roles
Strong hands-on technical background, with deep experience in big data frameworks (e.g., Spark, Trino/Presto, DBT)
Familiarity with streaming technologies such as Flink or Kafka
Solid understanding of semantic layers, data modeling, and metrics systems
Proven success leading teams that build data products or platforms at scale
Experience with cloud infrastructure (especially AWS S3, EMR, ECS, IAM)
Exposure to modern metadata platforms, Snowflake, or knowledge graphs is a plus
Excellent communication and stakeholder management skills
A strategic, pragmatic thinker who is comfortable making high-impact decisions amid complexity

Posted 1 week ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies