3.0 - 5.0 years
7 - 12 Lacs
Pune
Work from Office
Role & responsibilities

Proficient:
- Languages/Frameworks: FastAPI, Azure UI Search API (React)
- Databases and ETL: Cosmos DB (API for MongoDB), Data Factory, Databricks
- Proficiency in Python and R
- Cloud: Azure Cloud basics (Azure DevOps)
- GitLab: GitLab pipelines
- Ansible and REX: REX deployment
- Data Science: prompt engineering plus modern testing; data mining and cleaning; ML (supervised/unsupervised learning); NLP techniques; knowledge of deep learning techniques including RNNs and transformers
- End-to-end AI solution delivery, AI integration and deployment
- AI frameworks (PyTorch), MLOps frameworks, model deployment processes, data pipeline monitoring

Expert (in addition to proficient skills):
- Languages/Frameworks: Azure OpenAI
- Data Science: OpenAI GPT family of models (4o/4/3), embeddings + vector search (see the sketch below)
- Databases and ETL: Azure Storage Account
- Expertise in machine learning algorithms (supervised, unsupervised, reinforcement learning)
- Proficiency in deep learning frameworks (TensorFlow, PyTorch)
- Strong mathematical foundation (linear algebra, calculus, probability, statistics)
- Research methodology and experimental design
- Proficiency in data analysis tools (Pandas, NumPy, SQL)
- Strong statistical and probabilistic modelling skills
- Data visualization skills (Matplotlib, Seaborn, Tableau)
- Knowledge of big data technologies (Spark, Hive)
- Experience with AI-driven analytics and decision-making systems

Note: Notice period should not be more than 10-15 days.
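To make the "embeddings + vector search" skill concrete, here is a minimal sketch using the openai Python SDK against an Azure OpenAI deployment. The endpoint, key, API version, and deployment name are placeholder assumptions, and a production system would use a vector database rather than in-memory cosine similarity.

```python
# Minimal embeddings + vector search sketch (placeholder endpoint/deployment).
import numpy as np
from openai import AzureOpenAI

client = AzureOpenAI(
    api_key="<your-key>",
    api_version="2024-02-01",
    azure_endpoint="https://<your-resource>.openai.azure.com",
)

def embed(texts: list[str]) -> np.ndarray:
    """Return one embedding vector per input text."""
    resp = client.embeddings.create(model="text-embedding-ada-002", input=texts)
    return np.array([d.embedding for d in resp.data])

docs = ["invoice processing runbook", "vacation policy", "GPU cluster sizing guide"]
doc_vecs = embed(docs)

query_vec = embed(["how do I size a GPU cluster?"])[0]
# Cosine similarity: higher means more semantically similar.
scores = doc_vecs @ query_vec / (
    np.linalg.norm(doc_vecs, axis=1) * np.linalg.norm(query_vec)
)
print(docs[int(scores.argmax())])  # expected: the cluster sizing guide
```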
Posted 2 months ago
4.0 - 9.0 years
6 - 10 Lacs
Kolkata
Work from Office
Capgemini Invent
Capgemini Invent is the digital innovation, consulting and transformation brand of the Capgemini Group, a global business line that combines market-leading expertise in strategy, technology, data science and creative design to help CxOs envision and build what's next for their businesses.

Your role
- Develop and maintain data pipelines tailored to Azure environments, ensuring security and compliance with client data standards.
- Collaborate with cross-functional teams to gather data requirements, translate them into technical specifications, and develop data models.
- Leverage Python libraries for data handling, enhancing processing efficiency and robustness.
- Ensure SQL workflows meet client performance standards and handle large data volumes effectively.
- Build and maintain reliable ETL pipelines, supporting full and incremental loads and ensuring data integrity and scalability in ETL processes (see the sketch below).
- Implement CI/CD pipelines for automated deployment and testing of data solutions.
- Optimize and tune data workflows and processes to ensure high performance and reliability.
- Monitor, troubleshoot, and optimize data processes for performance and reliability.
- Document data infrastructure and workflows, and maintain industry knowledge in data engineering and cloud tech.

Your profile
- Bachelor's degree in Computer Science, Information Systems, or a related field.
- 4+ years of data engineering experience with a strong focus on Azure data services for client-centric solutions.
- Extensive expertise in Azure Synapse, Data Lake Storage, Data Factory, Databricks, and Blob Storage, ensuring secure, compliant data handling for clients.
- Good interpersonal communication skills.
- Skilled in designing and maintaining scalable data pipelines tailored to client needs in Azure environments.
- Proficient in SQL and PL/SQL for complex data processing and client-specific analytics.

What you will love about working here
We recognize the significance of flexible work arrangements to provide support. Be it remote work or flexible work hours, you will get an environment to maintain a healthy work-life balance. At the heart of our mission is your career growth. Our array of career growth programs and diverse professions are crafted to support you in exploring a world of opportunities. Equip yourself with valuable certifications in the latest technologies such as Generative AI.

About Capgemini
Capgemini is a global business and technology transformation partner, helping organizations to accelerate their dual transition to a digital and sustainable world, while creating tangible impact for enterprises and society. It is a responsible and diverse group of 340,000 team members in more than 50 countries. With its strong over 55-year heritage, Capgemini is trusted by its clients to unlock the value of technology to address the entire breadth of their business needs. It delivers end-to-end services and solutions leveraging strengths from strategy and design to engineering, all fueled by its market-leading capabilities in AI, cloud and data, combined with its deep industry expertise and partner ecosystem. The Group reported 2023 global revenues of €22.5 billion.
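To illustrate the "full and incremental loads" responsibility, here is a minimal PySpark sketch of a watermark-based incremental load; the storage paths, table, and column names are illustrative assumptions, not part of the posting, and it assumes a Delta-capable Spark environment such as Azure Databricks.

```python
# Watermark-based incremental load sketch (illustrative names and paths).
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("incremental-load").getOrCreate()

# Last successfully loaded timestamp, normally persisted in a control table.
last_watermark = "2024-01-01 00:00:00"

incoming = (
    spark.read.format("parquet")
    .load("abfss://raw@<storage-account>.dfs.core.windows.net/orders/")
    .filter(F.col("modified_at") > F.lit(last_watermark))  # only new/changed rows
)

# Append the delta to the curated zone; a full load would use mode("overwrite").
(incoming.write.format("delta")
 .mode("append")
 .save("abfss://curated@<storage-account>.dfs.core.windows.net/orders/"))

# Advance the watermark for the next run.
new_watermark = incoming.agg(F.max("modified_at")).first()[0]
```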
Posted 2 months ago
5.0 - 8.0 years
3 - 7 Lacs
Kolkata
Work from Office
- Execute high-quality visual designs for various documentation projects in PPT and Word.
- Collaborate with senior designers, team members, and other streams to understand design requirements and contribute effectively to design projects.
- Stay up to date with industry trends, software updates, and new technologies to improve your efficiency and productivity.
- Adapt designs based on feedback from peers, senior designers, and stakeholders to refine and enhance the final product.
- Assist in the preparation of design presentations and Word documents to communicate ideas effectively to clients or internal stakeholders.
- Ensure that all visual designs are of high quality and adhere to design standards and guidelines.
- Maintain a high level of deliverables in accordance with directives from the stream lead or design lead.
- Prioritize tasks and manage your workload efficiently to meet project deadlines while maintaining design quality.

Primary Skills
- Ability to manage multiple projects simultaneously under tight deadlines.
- Microsoft Word and PowerPoint proficiency is preferred; working knowledge of MS Excel, Photoshop, Illustrator, and InDesign is also beneficial.

Secondary Skills
- Experience working with advertising agencies or branding companies.
- Good communication skills.
Posted 2 months ago
6.0 - 11.0 years
8 - 16 Lacs
Hyderabad, Pune, Chennai
Hybrid
Data Engineer with good experience in Azure Databricks and Python.

Must have: Databricks, Python, Azure
Good to have: ADF

Candidate must be proficient in Databricks.
Posted 2 months ago
3.0 years
5 - 25 Lacs
Chennai, Tamil Nadu, India
On-site
Location: Bangalore, Pune, Chennai, Kolkata, Gurugram
Experience: 5-15 Years
Work Mode: Hybrid
Mandatory Skills: Snowflake / Azure Data Factory / PySpark / Databricks

Primary Roles and Responsibilities
- Develop Modern Data Warehouse solutions using Snowflake, Databricks and ADF.
- Provide forward-thinking solutions in the data engineering and analytics space.
- Collaborate with DW/BI leads to understand new ETL pipeline development requirements.
- Triage issues to find gaps in existing pipelines and fix them.
- Work with the business to understand reporting-layer needs and develop data models to fulfill them.
- Help junior team members resolve issues and technical challenges.
- Drive technical discussions with the client architect and team members.
- Orchestrate data pipelines in the scheduler via Airflow (see the DAG sketch below).

Skills and Qualifications
- Bachelor's and/or master's degree in computer science or equivalent experience.
- 6+ years of total IT experience and 3+ years of experience in data warehouse/ETL projects.
- Expertise in Snowflake security, Snowflake SQL, and designing/implementing other Snowflake objects.
- Hands-on experience with Snowflake utilities: SnowSQL, Snowpipe, Snowsight and Snowflake connectors.
- Deep understanding of Star and Snowflake dimensional modeling.
- Strong knowledge of data management principles.
- Good understanding of the Databricks Data & AI platform and Databricks Delta Lake architecture.
- Hands-on experience in SQL and Spark (PySpark).
- Experience in building ETL / data warehouse transformation processes.
- Experience with open-source non-relational / NoSQL data repositories (including MongoDB, Cassandra, Neo4j).
- Experience working with structured and unstructured data, including imaging and geospatial data.
- Experience working in a DevOps environment with tools such as Terraform, CircleCI and Git.
- Proficiency in RDBMS, complex SQL, PL/SQL, Unix shell scripting, performance tuning, troubleshooting and query optimization.
- Databricks Certified Data Engineer Associate/Professional certification (desirable).
- Comfortable working in a dynamic, fast-paced, innovative environment with several concurrent projects.
- Experience working in Agile methodology.
- Strong verbal and written communication skills.
- Strong analytical and problem-solving skills with high attention to detail.

Skills: Snowflake, Azure Data Factory, Databricks, PySpark, SQL, PL/SQL, ETL, Airflow, Terraform, CircleCI, Git, Python, Unix shell scripting, RDBMS, NoSQL, data warehouse
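As a concrete illustration of the Airflow orchestration responsibility above, here is a minimal Airflow 2.x DAG sketch; the task names and callables are hypothetical placeholders for the Snowflake/Databricks steps a real pipeline would run.

```python
# Minimal Airflow 2.x DAG sketch; task bodies are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract_to_stage():
    """Land source extracts in the staging area (placeholder)."""
    ...

def transform_in_databricks():
    """Trigger the Databricks transformation job (placeholder)."""
    ...

def load_to_snowflake():
    """Copy curated data into Snowflake (placeholder)."""
    ...

with DAG(
    dag_id="dw_daily_load",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",   # `schedule` argument per Airflow 2.4+
    catchup=False,       # skip historical backfill
) as dag:
    extract = PythonOperator(task_id="extract", python_callable=extract_to_stage)
    transform = PythonOperator(task_id="transform", python_callable=transform_in_databricks)
    load = PythonOperator(task_id="load", python_callable=load_to_snowflake)

    extract >> transform >> load  # linear dependency chain
```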
Posted 2 months ago
3.0 years
5 - 25 Lacs
Gurugram, Haryana, India
On-site
Location: Bangalore, Pune, Chennai, Kolkata, Gurugram
Experience: 5-15 Years
Work Mode: Hybrid
Mandatory Skills: Snowflake / Azure Data Factory / PySpark / Databricks

Primary Roles and Responsibilities
- Develop Modern Data Warehouse solutions using Snowflake, Databricks and ADF.
- Provide forward-thinking solutions in the data engineering and analytics space.
- Collaborate with DW/BI leads to understand new ETL pipeline development requirements.
- Triage issues to find gaps in existing pipelines and fix them.
- Work with the business to understand reporting-layer needs and develop data models to fulfill them.
- Help junior team members resolve issues and technical challenges.
- Drive technical discussions with the client architect and team members.
- Orchestrate data pipelines in the scheduler via Airflow.

Skills and Qualifications
- Bachelor's and/or master's degree in computer science or equivalent experience.
- 6+ years of total IT experience and 3+ years of experience in data warehouse/ETL projects.
- Expertise in Snowflake security, Snowflake SQL, and designing/implementing other Snowflake objects.
- Hands-on experience with Snowflake utilities: SnowSQL, Snowpipe, Snowsight and Snowflake connectors.
- Deep understanding of Star and Snowflake dimensional modeling.
- Strong knowledge of data management principles.
- Good understanding of the Databricks Data & AI platform and Databricks Delta Lake architecture.
- Hands-on experience in SQL and Spark (PySpark).
- Experience in building ETL / data warehouse transformation processes.
- Experience with open-source non-relational / NoSQL data repositories (including MongoDB, Cassandra, Neo4j).
- Experience working with structured and unstructured data, including imaging and geospatial data.
- Experience working in a DevOps environment with tools such as Terraform, CircleCI and Git.
- Proficiency in RDBMS, complex SQL, PL/SQL, Unix shell scripting, performance tuning, troubleshooting and query optimization.
- Databricks Certified Data Engineer Associate/Professional certification (desirable).
- Comfortable working in a dynamic, fast-paced, innovative environment with several concurrent projects.
- Experience working in Agile methodology.
- Strong verbal and written communication skills.
- Strong analytical and problem-solving skills with high attention to detail.

Skills: Snowflake, Azure Data Factory, Databricks, PySpark, SQL, PL/SQL, ETL, Airflow, Terraform, CircleCI, Git, Python, Unix shell scripting, RDBMS, NoSQL, data warehouse
Posted 2 months ago
3.0 years
5 - 25 Lacs
Greater Kolkata Area
On-site
Location: Bangalore, Pune, Chennai, Kolkata, Gurugram
Experience: 5-15 Years
Work Mode: Hybrid
Mandatory Skills: Snowflake / Azure Data Factory / PySpark / Databricks

Primary Roles and Responsibilities
- Develop Modern Data Warehouse solutions using Snowflake, Databricks and ADF.
- Provide forward-thinking solutions in the data engineering and analytics space.
- Collaborate with DW/BI leads to understand new ETL pipeline development requirements.
- Triage issues to find gaps in existing pipelines and fix them.
- Work with the business to understand reporting-layer needs and develop data models to fulfill them.
- Help junior team members resolve issues and technical challenges.
- Drive technical discussions with the client architect and team members.
- Orchestrate data pipelines in the scheduler via Airflow.

Skills and Qualifications
- Bachelor's and/or master's degree in computer science or equivalent experience.
- 6+ years of total IT experience and 3+ years of experience in data warehouse/ETL projects.
- Expertise in Snowflake security, Snowflake SQL, and designing/implementing other Snowflake objects.
- Hands-on experience with Snowflake utilities: SnowSQL, Snowpipe, Snowsight and Snowflake connectors.
- Deep understanding of Star and Snowflake dimensional modeling.
- Strong knowledge of data management principles.
- Good understanding of the Databricks Data & AI platform and Databricks Delta Lake architecture.
- Hands-on experience in SQL and Spark (PySpark).
- Experience in building ETL / data warehouse transformation processes.
- Experience with open-source non-relational / NoSQL data repositories (including MongoDB, Cassandra, Neo4j).
- Experience working with structured and unstructured data, including imaging and geospatial data.
- Experience working in a DevOps environment with tools such as Terraform, CircleCI and Git.
- Proficiency in RDBMS, complex SQL, PL/SQL, Unix shell scripting, performance tuning, troubleshooting and query optimization.
- Databricks Certified Data Engineer Associate/Professional certification (desirable).
- Comfortable working in a dynamic, fast-paced, innovative environment with several concurrent projects.
- Experience working in Agile methodology.
- Strong verbal and written communication skills.
- Strong analytical and problem-solving skills with high attention to detail.

Skills: Snowflake, Azure Data Factory, Databricks, PySpark, SQL, PL/SQL, ETL, Airflow, Terraform, CircleCI, Git, Python, Unix shell scripting, RDBMS, NoSQL, data warehouse
Posted 2 months ago
1.0 - 4.0 years
1 - 3 Lacs
Hyderabad, Chennai, Bengaluru
Work from Office
Cloud Data Engineer
Job Title: Cloud Data Engineer
Location: Chennai, Hyderabad, Bangalore
Experience: 1-4 years

Job Summary
The Cloud Data Engineer designs and builds scalable data pipelines and architectures in cloud environments. This role supports analytics, machine learning, and business intelligence initiatives by ensuring reliable data flow and transformation.

Key Responsibilities
- Develop and maintain ETL/ELT pipelines using cloud-native tools.
- Design data models and storage solutions optimized for performance and scalability.
- Integrate data from various sources (APIs, databases, streaming platforms).
- Ensure data quality, consistency, and security across pipelines (see the validation sketch below).
- Collaborate with data scientists, analysts, and business teams.
- Monitor and troubleshoot data workflows and infrastructure.
- Automate data engineering tasks using scripting and orchestration tools.

Required Skills
- Experience with cloud data platforms (AWS Glue, Azure Data Factory, Google Cloud Dataflow).
- Proficiency in SQL and programming languages (Python, Scala, Java).
- Knowledge of big data technologies (Spark, Hadoop, Kafka).
- Familiarity with data warehousing solutions (Redshift, BigQuery, Snowflake).
- Understanding of data governance, privacy, and compliance standards.

Qualifications
- Bachelor's or Master's degree in Computer Science, Data Engineering, or a related field.
- 3+ years of experience in data engineering, preferably in cloud environments.
- Certifications in cloud data engineering (e.g., Google Professional Data Engineer, AWS Data Analytics Specialty) are a plus.
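A minimal sketch of the kind of pipeline data-quality gate mentioned above, assuming a PySpark environment; the input path, column names, and thresholds are illustrative assumptions.

```python
# Simple data-quality gate sketch (illustrative path, columns, thresholds).
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("dq-check").getOrCreate()
df = spark.read.parquet("/data/landing/customers/")

total = df.count()
null_ids = df.filter(F.col("customer_id").isNull()).count()
dupes = total - df.dropDuplicates(["customer_id"]).count()

# Fail the pipeline run rather than propagate bad data downstream.
if total == 0 or null_ids / total > 0.01 or dupes > 0:
    raise ValueError(
        f"DQ check failed: rows={total}, null_ids={null_ids}, duplicates={dupes}"
    )
```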
Posted 2 months ago
12.0 years
10 - 45 Lacs
Pune, Maharashtra, India
On-site
Location: Pune
Experience: 8-12 Years
Work Mode: Hybrid
Mandatory Skills: Python, PySpark, SQL, ETL, data pipelines, Azure Databricks, Azure Data Factory, Azure Synapse, Airflow, and architecture design
Note: Interview mode is face-to-face.

Overview
We are seeking a skilled and motivated Data Engineer with experience in Python, SQL, Azure, and cloud-based technologies to join our dynamic team. The ideal candidate will have a solid background in building and optimizing data pipelines, working with cloud platforms, and leveraging modern data engineering tools like Airflow, PySpark, and Azure data engineering. If you are passionate about data and looking for an opportunity to work on cutting-edge technologies, this role is for you!

Primary Roles and Responsibilities
- Develop Modern Data Warehouse solutions using Databricks and the AWS/Azure stack.
- Provide forward-thinking solutions in the data engineering and analytics space.
- Collaborate with DW/BI leads to understand new ETL pipeline development requirements.
- Triage issues to find gaps in existing pipelines and fix them.
- Work with the business to understand reporting-layer needs and develop data models to fulfill them.
- Help junior team members resolve issues and technical challenges.
- Drive technical discussions with the client architect and team members.
- Orchestrate data pipelines in the scheduler via Airflow.

Skills and Qualifications
- Bachelor's and/or master's degree in computer science or equivalent experience.
- 6+ years of total IT experience and 3+ years of experience in data warehouse/ETL projects.
- Deep understanding of Star and Snowflake dimensional modelling.
- Strong knowledge of data management principles.
- Good understanding of the Databricks Data & AI platform and Databricks Delta Lake architecture.
- Hands-on experience in SQL, Python and Spark (PySpark).
- Must have experience with the AWS or Azure stack.
- Desirable: ETL with batch and streaming (Kinesis).
- Experience in building ETL / data warehouse transformation processes.
- Experience with Apache Kafka for streaming / event-based data (see the streaming sketch below).
- Experience with other open-source big data products, including Hadoop (Hive, Pig, Impala).
- Experience with open-source non-relational / NoSQL data repositories (including MongoDB, Cassandra, Neo4j).
- Experience working with structured and unstructured data, including imaging and geospatial data.
- Experience working in a DevOps environment with tools such as Terraform, CircleCI and Git.
- Proficiency in RDBMS, complex SQL, PL/SQL, Unix shell scripting, performance tuning and troubleshooting.
- Databricks Certified Data Engineer Associate/Professional certification (desirable).
- Comfortable working in a dynamic, fast-paced, innovative environment with several concurrent projects.
- Experience working in Agile methodology.
- Strong verbal and written communication skills.
- Strong analytical and problem-solving skills with high attention to detail.

Skills: data pipelines, Azure, Azure Data Factory, Azure Databricks, Azure Synapse, architecture design, data lakes, data warehouse, PySpark, Airflow, ETL, data engineering, Python, SQL
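To illustrate the Kafka streaming skill above, here is a minimal PySpark Structured Streaming sketch; the broker, topic, and paths are placeholder assumptions, and the cluster is assumed to have the spark-sql-kafka connector and Delta Lake available.

```python
# PySpark Structured Streaming from Kafka sketch (placeholder broker/topic;
# requires the spark-sql-kafka connector package on the cluster).
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("kafka-stream").getOrCreate()

events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")
    .option("subscribe", "orders")
    .load()
    # Kafka delivers bytes; cast the payload to string for downstream parsing.
    .select(F.col("value").cast("string").alias("payload"),
            F.col("timestamp"))
)

query = (
    events.writeStream.format("delta")
    .option("checkpointLocation", "/chk/orders")  # exactly-once bookkeeping
    .outputMode("append")
    .start("/data/bronze/orders")
)
query.awaitTermination()
```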
Posted 2 months ago
7.0 - 12.0 years
8 - 18 Lacs
Navi Mumbai, Pune
Work from Office
Interested candidates kindly submit the form below: https://forms.gle/ex1M5oa3qagMUcrn6

Job Description:
We are looking for an experienced Team Lead (Data Warehouse Migration, Data Engineering & BI) to lead enterprise-level data transformation initiatives. The ideal candidate will have deep expertise in migration, Snowflake, Power BI, and end-to-end data engineering using tools like Azure Data Factory, Databricks, and PySpark.

Key Responsibilities:
- Lead and manage data warehouse migration projects, including extraction, transformation, and loading (ETL/ELT) across legacy and modern platforms (see the reconciliation sketch below).
- Architect and implement scalable Snowflake data warehousing solutions for analytics and reporting.
- Develop and schedule robust data pipelines using Azure Data Factory and Databricks.
- Write efficient and maintainable PySpark code for batch and real-time data processing.
- Design and develop dashboards and reports using Power BI to support business insights.
- Ensure data accuracy, security, and consistency throughout the project lifecycle.
- Collaborate with stakeholders to understand data and reporting requirements.
- Mentor and lead a team of data engineers and BI developers.
- Manage project timelines, deliverables, and team performance effectively.

Must-Have Skills:
- Data Migration: hands-on experience with large-scale data migration, reconciliation, and transformation.
- Snowflake: data modeling, performance tuning, ELT/ETL development, role-based access control.
- Azure Data Factory: pipeline development, integration services, linked services.
- Databricks: Spark SQL, notebooks, cluster management, orchestration.
- PySpark: advanced transformations, error handling, and optimization techniques.
- Power BI: data visualization, DAX, Power Query, dashboard/report publishing and maintenance.

Preferred Skills:
- Familiarity with Agile methodologies and sprint-based development.
- Experience working with CI/CD for data workflows.
- Ability to lead client discussions and manage stakeholder expectations.
- Strong analytical and problem-solving abilities.
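A minimal sketch of the migration reconciliation step named above, using the snowflake-connector-python package; the account, credentials, table names, and expected counts are placeholders, and a real reconciliation would also compare checksums or column aggregates.

```python
# Post-migration row-count reconciliation sketch (placeholder credentials/tables).
import snowflake.connector

conn = snowflake.connector.connect(
    account="<account-identifier>",
    user="<user>",
    password="<password>",
    warehouse="MIGRATION_WH",
    database="ANALYTICS",
    schema="PUBLIC",
)

# Row counts captured from the legacy warehouse extract (placeholders).
source_counts = {"ORDERS": 1_204_331, "CUSTOMERS": 58_102}

cur = conn.cursor()
mismatches = []
for table, expected in source_counts.items():
    cur.execute(f"SELECT COUNT(*) FROM {table}")
    actual = cur.fetchone()[0]
    if actual != expected:
        mismatches.append((table, expected, actual))

conn.close()
if mismatches:
    # Surface discrepancies rather than silently accepting the migration.
    raise RuntimeError(f"Reconciliation failed: {mismatches}")
```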
Posted 2 months ago
5.0 - 10.0 years
15 - 30 Lacs
Hyderabad
Work from Office
Lead Data Engineer - Data Management

Job Description

Company Overview
Accordion works at the intersection of sponsors and management teams throughout every stage of the investment lifecycle, providing hands-on, execution-focused support to elevate data and analytics capabilities. So, what does it mean to work at Accordion? It means joining 1,000+ analytics, data science, finance & technology experts in a high-growth, agile, and entrepreneurial environment while transforming how portfolio companies drive value. It also means making your mark on Accordion's future by embracing a culture rooted in collaboration and a firm-wide commitment to building something great, together. Headquartered in New York City with 10 offices worldwide, Accordion invites you to join our journey.

Data & Analytics (Accordion | Data & Analytics)
Accordion's Data & Analytics (D&A) team delivers cutting-edge, intelligent solutions to a global clientele, leveraging a blend of domain knowledge, sophisticated technology tools, and deep analytics capabilities to tackle complex business challenges. We partner with Private Equity clients and their Portfolio Companies across diverse sectors, including Retail, CPG, Healthcare, Media & Entertainment, Technology, and Logistics. The D&A team delivers data and analytical solutions designed to streamline reporting capabilities and enhance business insights across vast and complex data sets ranging from Sales, Operations, Marketing, Pricing, Customer Strategies, and more.

Location: Hyderabad

Role Overview:
Accordion is looking for a Lead Data Engineer who will be responsible for the design, development, configuration/deployment, and maintenance of the above technology stack. He/she must have an in-depth understanding of various tools and technologies in this domain to design and implement robust and scalable solutions that address client requirements, current and future, at optimal cost. The Lead Data Engineer should be able to evaluate existing architectures and recommend ways to upgrade and improve their performance, for both on-premises and cloud-based solutions. A successful Lead Data Engineer should possess strong working business knowledge and familiarity with multiple tools and techniques, along with industry standards and best practices in Business Intelligence and Data Warehousing environments. He/she should have strong organizational, critical thinking, and communication skills.

What you will do:
- Partner with clients to understand their business and create comprehensive business requirements.
- Develop an end-to-end Business Intelligence framework based on requirements, including recommending the appropriate architecture (on-premises or cloud), analytics and reporting.
- Work closely with the business and technology teams to guide solution development and implementation.
- Work closely with the business teams to arrive at methodologies to develop KPIs and metrics.
- Work with the Project Manager in developing and executing project plans within the assigned schedule and timeline.
- Develop standard reports and functional dashboards based on business requirements.
- Conduct training programs and knowledge transfer sessions for junior developers when needed.
- Recommend improvements to provide optimal reporting solutions.
- Bring curiosity to learn new tools and technologies to provide futuristic solutions for clients.

Ideally, you have:
- An undergraduate degree (B.E/B.Tech.); tier-1/tier-2 colleges preferred.
- More than 5 years of experience in a related field.
- Proven expertise in SSIS, SSAS and SSRS (MSBI suite).
- In-depth knowledge of databases (SQL Server, MySQL, Oracle, etc.) and data warehouses (any one of Azure Synapse, AWS Redshift, Google BigQuery, Snowflake, etc.).
- In-depth knowledge of business intelligence tools (any one of Power BI, Tableau, Qlik, DOMO, Looker, etc.).
- Good understanding of Azure or AWS: Azure (Data Factory & Pipelines, SQL Database & Managed Instances, DevOps, Logic Apps, Analysis Services) or AWS (Glue, Aurora Database, DynamoDB, Redshift, QuickSight).
- Proven ability to take initiative and be innovative.
- An analytical mind with a problem-solving attitude.

Why explore a career at Accordion:
- High growth environment: semi-annual performance management and promotion cycles, coupled with a strong meritocratic culture, enable a fast track to leadership responsibility.
- Cross-domain exposure: interesting and challenging work streams across industries and domains that always keep you excited, motivated, and on your toes.
- Entrepreneurial environment: intellectual freedom to make decisions and own them. We expect you to spread your wings and assume larger responsibilities.
- Fun culture and peer group: a non-bureaucratic and fun working environment, with a strong peer group that will challenge you and accelerate your learning curve.

Other benefits for full-time employees:
- Health and wellness programs that include employee health insurance covering immediate family members and parents, term life insurance for employees, free health camps for employees, discounted health services (including vision and dental) for employees and family members, free doctor consultations, counsellors, etc.
- Corporate meal card options for ease of use and tax benefits.
- Team lunches, company-sponsored team outings and celebrations.
- Cab reimbursement for women employees beyond a certain time of day.
- Robust leave policy to support work-life balance, with a specially designed leave structure to support women employees for maternity and related requests.
- Reward and recognition platform to celebrate professional and personal milestones.
- A positive and transparent work environment, including various employee engagement and employee benefit initiatives to support personal and professional learning and development.
Posted 2 months ago
8.0 - 13.0 years
2 - 30 Lacs
Gurugram
Work from Office
About the Role
Seeking a highly skilled Senior Data Engineer with 8 years of experience to join our dynamic team.

Requirements
- Experienced in architecting, building and maintaining end-to-end data pipelines using Python and Spark in Databricks.
- Proficient in designing and implementing scalable data lake and data warehouse solutions on Azure, including Azure Data Lake, Data Factory, Synapse and Azure SQL.
- Hands-on experience leading the integration of complex data sources and the development of efficient ETL processes.
- Champions best practices in data governance, data quality and data security across the organization.
- Adept at collaborating closely with data scientists, analysts and business stakeholders to deliver high-impact data solutions.
Posted 2 months ago
5.0 - 10.0 years
25 - 30 Lacs
Hyderabad, Chennai, Bengaluru
Hybrid
Develop and maintain data pipelines, ETL/ELT processes, and workflows to ensure the seamless integration and transformation of data. Architect, implement, and optimize scalable data solutions.

Required Candidate Profile
Work closely with data scientists, analysts, and business stakeholders to understand requirements and deliver actionable insights. Partner with cloud architects and DevOps teams.
Posted 2 months ago
8.0 - 13.0 years
20 - 25 Lacs
Pune
Remote
Design databases and data warehouses, build Power BI solutions, support enterprise business intelligence, be a strong team player and contributor, and drive continuous improvement.

Experience in SQL, Oracle, SSIS/SSRS, Azure, ADF, CI/CD, Power BI, DAX, and Microsoft Fabric.

Required Candidate Profile
Source-system data structures, data extraction, data transformation, warehousing, DB administration, query development, and Power BI development. Work from home.
Posted 2 months ago
12.0 - 15.0 years
11 - 16 Lacs
Bengaluru
Work from Office
Key Responsibilities:
- Own, manage and prioritize requirements in the product life cycle from definition to phase-out.
- Define platform requirements for native, on-premise, and cloud deployments.
- Provide clear direction, context, and priorities to development teams.
- Collaborate closely with key internal stakeholders and engage with external stakeholders.

Focus Areas:
- Must: Healthcare market, product know-how and customer understanding.
- Must: Sound knowledge of clinical workflows and healthcare IT, especially in the area of radiology.
- Must: Healthcare industry standards like DICOM and IHE.
- Must: Good understanding of software systems categorized as medical devices.
- Must: Basic understanding of legal regulations and standards applicable to medical devices affecting safety aspects (i.e. FDA 21 CFR 820 QSR, ISO 13485).
- Must: Platform scalability & modernization: enable flexible architecture supporting hybrid cloud, containerization, and orchestration (e.g., Kubernetes).
- Must: Azure expertise: deep knowledge of Azure services (Data Lake Storage, SQL, Data Factory, Synapse) and cloud cost management.
- Must: Data lake architecture: proficient in data ingestion, storage formats (Parquet, Delta Lake), and multi-zone design (raw, curated, analytics).
- Nice to have: SQL & databases: strong SQL skills with experience in database design, optimization, and complex queries.
- Nice to have: Qlik BI tools: skilled in Qlik Sense/QlikView for data modeling, transformation, and dashboard/report development.
- Nice to have: Exposure to agile methodology.

What are my tasks?
- Gather, prioritize, create and communicate stakeholder and market requirements and S/W specifications.
- Guide and support development teams, resolving conflicts and answering questions.
- Manage all the Agile methodology practices related to requirements engineering and product definition.
- Provide input to project management and support rollout activities such as training, presentations, and workshops.

What do I need to know to qualify for this job?
Qualification: A Bachelor's/Master's degree in engineering and/or MCA or equivalent.
Work Experience: 12 to 15 years.
Posted 2 months ago
8.0 - 13.0 years
5 - 9 Lacs
Hyderabad
Work from Office
1. Data Engineer - Azure Data Services
2. Data modelling - NoSQL and SQL
3. Good understanding of Spark and Spark Streaming
4. Hands-on with Python / Pandas / Data Factory / Cosmos DB / Databricks / Event Hubs / Stream Analytics
5. Knowledge of medallion architecture, data vaults, data marts, etc. (see the sketch below)
6. Preferably Azure Data Associate exam certified.
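A minimal PySpark sketch of the medallion (bronze/silver) pattern named in point 5, assuming a Databricks/Delta Lake environment; the paths and column names are illustrative assumptions.

```python
# Medallion-architecture sketch: raw (bronze) -> cleaned (silver).
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("medallion").getOrCreate()

# Bronze: land source data as-is, keeping an ingestion timestamp.
bronze = (spark.read.json("/landing/events/")
          .withColumn("_ingested_at", F.current_timestamp()))
bronze.write.format("delta").mode("append").save("/lake/bronze/events")

# Silver: deduplicate, enforce types, drop malformed rows.
silver = (spark.read.format("delta").load("/lake/bronze/events")
          .dropDuplicates(["event_id"])
          .withColumn("event_ts", F.to_timestamp("event_ts"))
          .filter(F.col("event_id").isNotNull()))
silver.write.format("delta").mode("overwrite").save("/lake/silver/events")
```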
Posted 2 months ago
12.0 - 17.0 years
13 - 18 Lacs
Hyderabad
Work from Office
1. Data Engineer - Azure Data Services
2. Data modelling - NoSQL and SQL
3. Good understanding of Spark and Spark Streaming
4. Hands-on with Python / Pandas / Data Factory / Cosmos DB / Databricks / Event Hubs / Stream Analytics
5. Knowledge of medallion architecture, data vaults, data marts, etc.
6. Preferably Azure Data Associate exam certified.
Posted 2 months ago
7.0 - 12.0 years
4 - 8 Lacs
Bengaluru
Work from Office
Data Modeller JD:
We are seeking a skilled Data Modeller to join our Corporate Banking team. The ideal candidate will have a strong background in creating data models for various banking services, including Current Account Savings Account (CASA), Loans, and Credit Services. This role involves collaborating with the Data Architect to define data model structures within a data mesh environment and coordinating with multiple departments to ensure cohesive data management practices.

Data Modelling:
- Design and develop data models for CASA, Loan, and Credit Services, ensuring they meet business requirements and compliance standards.
- Create conceptual, logical, and physical data models that support the bank's strategic objectives.
- Ensure data models are optimized for performance, security, and scalability to support business operations and analytics.

Collaboration with Data Architect:
- Work closely with the Data Architect to establish the overall data architecture strategy and framework.
- Contribute to the definition of data model structures within a data mesh environment.

Data Quality and Governance:
- Ensure data quality and integrity in the data models by implementing best practices in data governance.
- Assist in the establishment of data management policies and standards.
- Conduct regular data audits and reviews to ensure data accuracy and consistency across systems.

Tooling:
- Data modelling tools: ERwin, IBM InfoSphere Data Architect, Oracle Data Modeler, Microsoft Visio, or similar.
- Databases: SQL, Oracle, MySQL, MS SQL Server, PostgreSQL, Neo4j (graph).
- Data warehousing technologies: Snowflake, Teradata, or similar.
- ETL tools: Informatica, Talend, Apache NiFi, Microsoft SSIS, or similar.
- Big data technologies: Hadoop, Spark (optional but preferred).
- Cloud technologies: experience with data modelling on cloud platforms, e.g. Microsoft Azure (Synapse, Data Factory).
Posted 2 months ago
7.0 - 12.0 years
10 - 14 Lacs
Hyderabad
Work from Office
Project Role: Application Lead
Project Role Description: Lead the effort to design, build and configure applications, acting as the primary point of contact.
Must have skills: SAP Information Lifecycle Management (ILM)
Good to have skills: NA
Minimum 7.5 year(s) of experience is required.
Educational Qualification: BE

Summary:
As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. You will be responsible for overseeing the application development process and ensuring successful project delivery.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Lead the application development process.
- Ensure timely project delivery.
- Provide guidance and support to team members.

Professional & Technical Skills:
- Must-have skills: proficiency in SAP Information Lifecycle Management (ILM).
- Strong understanding of data lifecycle management.
- Experience in data archiving and retention policies.
- Knowledge of SAP data management solutions.
- Hands-on experience in SAP data migration.
- Experience in SAP data governance.

Additional Information:
- The candidate should have a minimum of 7.5 years of experience in SAP Information Lifecycle Management (ILM).
- This position is based at our Hyderabad office.
- A BE degree is required.

Qualification: BE
Posted 2 months ago
7.0 - 12.0 years
4 - 8 Lacs
Hyderabad
Work from Office
Must have a Computer Science, Software Engineering, or related engineering degree or equivalent, with 7-10 years' experience, including a minimum of 5 years in a similar role, ideally in a multisite, hybrid/cloud-first environment.

- Hands-on development experience coding in Python (mandatory).
- Hands-on development experience with NoSQL (preferably Cosmos DB).
- Extensive experience and knowledge of Azure (Azure Cosmos DB, Azure Functions, Pipelines, DevOps, Kubernetes, Storage): mandatory.
- Must have excellent knowledge of Microsoft Azure products and how to implement them: Enterprise Apps, Azure Functions, Cosmos DB, Containers, Event Grid, Logic Apps, Service Bus, Data Factory (see the sketch below).
- Must have hands-on experience with object-oriented development and have applied its principles in multiple solution designs: domain-driven design, tiered applications, microservices.
- Must have good knowledge of .NET, C# and the standard libraries, as well as JavaScript (in a Vue.js context). Knowledge of other development and scripting languages is appreciated (Python, Java, TypeScript, PowerShell, bash).
- Must have knowledge of development and deployment tools (IaC, git, docker, etc.).
- Comfortable working with APIs, webhooks, data transfer, and workflow technologies.
- Deep understanding of software architecture and the long-term implications of architectural choices.
- Familiar with Agile project management and continuous delivery.
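A minimal sketch of the Azure Functions + Cosmos DB combination listed above, using the Python v2 programming model and the azure-cosmos SDK; the route, environment variable names, and database/container names are placeholder assumptions.

```python
# HTTP-triggered Azure Function writing to Cosmos DB (placeholder names).
import json
import os
import uuid

import azure.functions as func
from azure.cosmos import CosmosClient

app = func.FunctionApp()

# The client is created once per worker and reused across invocations.
client = CosmosClient(os.environ["COSMOS_ENDPOINT"], os.environ["COSMOS_KEY"])
container = client.get_database_client("orders-db").get_container_client("orders")

@app.route(route="orders", methods=["POST"])
def create_order(req: func.HttpRequest) -> func.HttpResponse:
    body = req.get_json()
    item = {"id": str(uuid.uuid4()), **body}  # Cosmos DB requires an 'id' field
    container.upsert_item(item)
    return func.HttpResponse(json.dumps(item), status_code=201,
                             mimetype="application/json")
```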
Posted 2 months ago
3.0 - 7.0 years
7 - 11 Lacs
Bengaluru
Work from Office
A proven and credible practitioner, your deep solution experience will help you lead a team of go-to subject matter experts. Fostering a culture of candour, collaboration, and growth-mindedness, you'll ensure co-creation across IBM Sales and client teams that drives investment in, and adoption of, IBM's strategic platforms. Overseeing your team's fusion of innovative solutions with modern IT architectures, integrated solutions, and offerings, you'll ensure they're helping to solve some of their clients' most complex business challenges. We're committed to success. In this role, your achievements will drive your career, team, and clients to thrive.

A typical day may involve:
- Strategic team leadership: leading a team of technical sales experts to co-create innovative solutions with clients.
- Partnership and prototype excellence: collaborating with IBM and partners to deliver compelling prototypes.
- Optimizing resource utilization: promoting maximum use of IBM's Technology Sales resources.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
- Proficient in .NET Core with React or Angular.
- Experience in Agile teams applying the best architectural, design, and unit testing patterns and practices, with an eye for code quality and standards.
- Azure Functions, Azure Service Bus, Azure Storage Account: mandatory.
- Azure Durable Functions.
- Azure Data Factory; Azure SQL or Cosmos DB (database): required.
- Ability to write calculation rules and configurable consolidation rules.

Preferred technical and professional experience:
- Excellent written and verbal interpersonal skills for coordinating across teams.
- At least 2 end-to-end implementation experiences.
- Ability to write and update the rules of historical overrides.
Posted 2 months ago
4.0 - 8.0 years
10 - 15 Lacs
Bengaluru
Work from Office
As a Software Developer you'll participate in many aspects of the software development lifecycle, such as design, code implementation, testing, and support. You will create software that enables your clients' hybrid-cloud and AI journeys.

Your primary responsibilities include:
- Comprehensive feature development and issue resolution: working on end-to-end feature development and solving challenges faced in the implementation.
- Stakeholder collaboration and issue resolution: collaborating with key stakeholders, internal and external, to understand problems and issues with the product and features, and resolving them per the defined SLAs.
- Continuous learning and technology integration: being eager to learn new technologies and implementing them in feature development.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
- Proficient in .NET Core with React or Angular.
- Experience in Agile teams applying the best architectural, design, and unit testing patterns and practices, with an eye for code quality and standards.
- Azure Functions, Azure Service Bus, Azure Storage Account: mandatory.
- Azure Durable Functions.
- Azure Data Factory; Azure SQL or Cosmos DB (database): required.
- Ability to write calculation rules and configurable consolidation rules.

Preferred technical and professional experience:
- Excellent written and verbal interpersonal skills for coordinating across teams.
- At least 2 end-to-end implementation experiences.
- Ability to write and update the rules of historical overrides.
Posted 2 months ago
5.0 - 10.0 years
10 - 15 Lacs
Bengaluru
Work from Office
- Proficient with Azure platform development (Azure Functions, Azure services, etc.).
- 5 to 15 years of relevant software development experience with a fairly full-stack profile.
- Proficient in cloud-native deployment with CI/CD pipelines.
- Proficient in one or more data development technologies (SQL databases, NoSQL, cloud datastores, etc.).
- Good and effective communication skills, to understand requirements and articulate solutions.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
- Proficient in .NET Core with React or Angular.
- Experience in Agile teams applying the best architectural, design, and unit testing patterns and practices, with an eye for code quality and standards.
- Azure Functions, Azure Service Bus, Azure Storage Account: mandatory.
- Azure Durable Functions.
- Azure Data Factory; Azure SQL or Cosmos DB (database): required.
- Ability to write calculation rules and configurable consolidation rules.

Preferred technical and professional experience:
- Excellent written and verbal interpersonal skills for coordinating across teams.
- At least 2 end-to-end implementation experiences.
- Ability to write and update the rules of historical overrides.
Posted 2 months ago
10.0 - 17.0 years
12 - 22 Lacs
Gurugram
Work from Office
We know the importance that food plays in people's lives: the power it has to bring people, families and communities together. Our purpose is to bring enjoyment to people's lives through great tasting food, in a way which reflects our values. McCain has recently committed to implementing regenerative agriculture practices across 100 percent of our potato acreage by 2030. Ask us more about our commitment to sustainability.

OVERVIEW
McCain is embarking on a digital transformation. As part of this transformation, we are making significant investments into our data platforms, common data models, data structures and data policies to increase the quality of our data and the confidence of our business teams to use this data to make better decisions and drive value. We have a new and ambitious global Digital & Data group, which serves as a resource to the business teams in our regions and global functions. We are currently recruiting an experienced Data Architect to build the enterprise data model for McCain.

JOB PURPOSE
Reporting to the Data Architect Lead, the Global Data Architect will take a lead role in creating the enterprise data model for McCain Foods, bringing together data assets across agriculture, manufacturing, supply chain and commercial. This data model will be the foundation for our analytics program, which seeks to bring together McCain's industry-leading operational data sets with 3rd party data sets to drive world-class analytics. Working with a diverse team of data governance experts, data integration architects, data engineers and our analytics team, including data scientists, you will play a key role in creating a conceptual, logical and physical data model that underpins the Global Digital & Data team's activities.

JOB RESPONSIBILITIES
- Develop an understanding of McCain's key data assets and work with the data governance team to document key data sets in our enterprise data catalog.
- Work with business stakeholders to build a conceptual business model by understanding the business end-to-end process, challenges, and future business plans.
- Collaborate with application architects to bring the analytics point of view into end-user application design.
- Develop a logical data model based on the business model and align it with business teams.
- Work with technical teams to build the physical data model and data lineage, and keep all relevant documentation current.
- Develop a process to manage all models and appropriate controls.
- With a use-case-driven approach, enhance and expand the enterprise data model based on legacy on-premises analytics products and new cloud data products, including advanced analytics models.
- Design key enterprise conformed dimensions and ensure understanding across data engineering teams (including third parties); keep data catalog and wiki tools current.
- Act as the primary point of contact for new Digital and IT programs, to ensure alignment to the enterprise data model.
- Be a clear player in shaping McCain's cloud migration strategy, enabling advanced analytics and world-leading Business Intelligence analytics.
- Work in close collaboration with data engineers, ensuring data modeling best practices are followed.

MEASURES OF SUCCESS
- Demonstrated history of driving change in a large, global organization.
- A true passion for well-structured and well-governed data; you know, and can explain to others, the real business risk of too many mapping tables.
- You live for a well-designed and well-structured conformed dimension table.
- Focus on use-case-driven prioritization; you are comfortable pushing business teams for requirements that connect to business value, and also able to challenge requirements that will not achieve the business's goals.
- Developing data models that are not just elegant, but truly optimized for analytics, for both advanced analytics use cases and dashboarding/BI tools.
- A coaching mindset wherever you go, including with the business, data engineers and other architects.
- An infectious enthusiasm for learning: about our business, deepening your technical knowledge and meeting our teams.
- A "get things done" attitude: roll up the sleeves when necessary; work with and through others as needed.

KEY QUALIFICATIONS & EXPERIENCES

Data design and governance
- At least 5 years of experience with data modeling to support business processes.
- Ability to design complex data models to connect internal and external data.
- Nice to have: ability to profile data for data quality requirements.
- At least 8 years of experience with requirements analysis; experience working with business stakeholders on data design.
- Experience working with real-time data.
- Nice to have: experience with data catalog tools.
- Ability to draft accurate documentation that supports the project management effort and coding.

Technical skills
- At least 5 years of experience designing and working in data warehouse solutions building data models; preference for S/4HANA knowledge.
- At least 2 years of experience with visualization tools, preferably Power BI or similar.
- At least 2 years designing and working in cloud data warehouse solutions; preference for Azure Databricks, Azure Synapse or earlier Microsoft solutions.
- Experience with Visio, PowerDesigner, or similar data modeling tools.
- Nice to have: experience with data profiling / data quality tools such as Informatica or Collibra.
- Nice to have: working experience with MDX.
- Experience working in an Azure cloud environment or a similar cloud environment.
- Must have: ability to develop SQL queries for assessing, manipulating, and accessing data stored in relational databases; hands-on experience in PySpark and Python.
- Nice to have: ability to understand and work with unstructured data.
- Nice to have: at least one successful enterprise-wide cloud migration as the data architect or data modeler, mainly focused on building data models.
- Nice to have: experience with manufacturing / digital manufacturing.
- Nice to have: experience designing enterprise data models for analytics, specifically in a Power BI environment.
- Nice to have: experience with machine learning model design (Python preferred).

Behaviors and attitudes
- Comfortable working with ambiguity and defining a way forward.
- Experience challenging current ways of working.
- A documented history of successfully driving projects to completion.
- Excellent interpersonal and communication skills.
- Attention to detail.
- Comfortable leading others through change.
Posted 2 months ago
5.0 - 10.0 years
15 - 30 Lacs
Bengaluru
Remote
Greetings from tsworks Technologies India Pvt. Ltd.

We are hiring for a Sr. Data Engineer / Lead Data Engineer. If you are interested, please share your CV with mohan.kumar@tsworks.io.

About This Role
tsworks Technologies India Private Limited is seeking driven and motivated Senior Data Engineers to join its Digital Services team. You will get hands-on experience with projects employing industry-leading technologies. This would initially be focused on the operational readiness and maintenance of existing applications and would transition into a build-and-maintain role in the long run.

Position: Senior Data Engineer / Lead Data Engineer
Experience: 5 to 11 years
Location: Bangalore, India / Remote

Mandatory Required Qualifications
- Strong proficiency in Azure services such as Azure Data Factory, Azure Databricks, Azure Synapse Analytics, Azure Storage, etc.
- Expertise in DevOps and CI/CD implementation.
- Excellent communication skills.

Skills & Knowledge
- Bachelor's or master's degree in computer science, engineering, or a related field.
- 5 to 10 years of experience in Information Technology, designing, developing and executing solutions.
- 3+ years of hands-on experience designing and executing data solutions on Azure cloud platforms as a Data Engineer.
- Strong proficiency in Azure services such as Azure Data Factory, Azure Databricks, Azure Synapse Analytics, Azure Storage, etc.
- Familiarity with the Snowflake data platform is good to have.
- Hands-on experience in data modelling and in batch and real-time pipelines, using Python, Java or JavaScript, and experience working with RESTful APIs are required (see the ingestion sketch below).
- Expertise in DevOps and CI/CD implementation.
- Hands-on experience with SQL and NoSQL databases.
- Hands-on experience in data modelling, implementation, and management of OLTP and OLAP systems.
- Experience with data modelling concepts and practices.
- Familiarity with data quality, governance, and security best practices.
- Knowledge of big data technologies such as Hadoop, Spark, or Kafka.
- Familiarity with machine learning concepts and integration of ML pipelines into data workflows.
- Hands-on experience working in an Agile setting.
- Is self-driven, naturally curious, and able to adapt to a fast-paced work environment.
- Can articulate, create, and maintain technical and non-technical documentation.
- Public cloud certifications are desired.
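A minimal sketch of the RESTful-API-to-pipeline pattern implied above, using the requests library to page through a hypothetical JSON API before landing records for a downstream Spark or ADF step; the URL and field names are assumptions.

```python
# Paged REST API ingestion sketch (hypothetical endpoint and fields).
import json

import requests

def fetch_all(base_url: str, page_size: int = 100) -> list[dict]:
    """Page through a JSON API until it returns an empty batch."""
    records, page = [], 1
    while True:
        resp = requests.get(base_url, params={"page": page, "size": page_size},
                            timeout=30)
        resp.raise_for_status()  # fail loudly on HTTP errors
        batch = resp.json().get("items", [])
        if not batch:
            break
        records.extend(batch)
        page += 1
    return records

if __name__ == "__main__":
    rows = fetch_all("https://api.example.com/v1/orders")
    # Land as newline-delimited JSON for a downstream Spark/ADF step.
    with open("orders.ndjson", "w") as f:
        for row in rows:
            f.write(json.dumps(row) + "\n")
```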
Posted 2 months ago