5.0 - 10.0 years
25 - 30 Lacs
Noida
Work from Office
Position Overview
We are looking for an experienced Lead Data Engineer to join our dynamic team. If you are passionate about building scalable software solutions and working collaboratively with cross-functional teams to define requirements and deliver solutions, we would love to hear from you.

Job Responsibilities:
- Develop and maintain data pipelines and ETL/ELT processes using Python
- Design and implement scalable, high-performance applications
- Work collaboratively with cross-functional teams to define requirements and deliver solutions
- Develop and manage near real-time data streaming solutions using Pub/Sub or Beam
- Contribute to code reviews, architecture discussions, and continuous improvement initiatives
- Monitor and troubleshoot production systems to ensure reliability and performance

Basic Qualifications:
- 5+ years of professional software development experience with Python
- Strong understanding of software engineering best practices (testing, version control, CI/CD)
- Experience building and optimizing ETL/ELT processes and data pipelines
- Proficiency with SQL and database concepts
- Experience with data processing frameworks (e.g., Pandas)
- Understanding of software design patterns and architectural principles
- Ability to write clean, well-documented, and maintainable code
- Experience with unit testing and test automation
- Experience working with any cloud provider (GCP is preferred)
- Experience with CI/CD pipelines and infrastructure as code
- Experience with containerization technologies like Docker or Kubernetes
- Bachelor's degree in Computer Science, Engineering, or a related field (or equivalent experience)
- Proven track record of delivering complex software projects
- Excellent problem-solving and analytical thinking skills
- Strong communication skills and ability to work in a collaborative environment

Preferred Qualifications:
- Experience with GCP services, particularly Cloud Run and Dataflow
- Experience with stream processing technologies (Pub/Sub)
- Familiarity with big data technologies (Airflow)
- Experience with data visualization tools and libraries
- Knowledge of CI/CD pipelines with GitLab and infrastructure as code with Terraform
- Familiarity with platforms like Snowflake, BigQuery, or Databricks
- GCP Data Engineer certification
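The core of the role above is Python ETL/ELT work with Pandas. As a minimal, hypothetical sketch of the kind of transform step such a pipeline contains (all column names and values here are invented for illustration, not taken from any real system):

```python
import pandas as pd

def transform(raw: pd.DataFrame) -> pd.DataFrame:
    """Minimal ELT-style transform: drop incomplete rows, coerce types,
    then aggregate to one row per user (a common pipeline step)."""
    df = raw.dropna(subset=["user_id"]).copy()   # drop rows missing the key
    # coerce amounts to numbers; unparseable values become 0.0
    df["amount"] = pd.to_numeric(df["amount"], errors="coerce").fillna(0.0)
    return df.groupby("user_id", as_index=False)["amount"].sum()

raw = pd.DataFrame({"user_id": ["a", "a", None, "b"],
                    "amount": ["10", "5", "1", "x"]})
print(transform(raw))
```

In a real pipeline this function would sit between an extract step (e.g., reading from Cloud Storage) and a load step (e.g., writing to BigQuery), with the same clean/type/aggregate shape.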
Posted 1 week ago
3.0 - 8.0 years
13 - 17 Lacs
Pune
Work from Office
Alexa+ is our next-generation assistant powered by generative AI. Alexa+ is more conversational, smarter, personalized, and gets things done. Our goal is to make Alexa+ an instantly familiar personal assistant that is always ready to help or entertain on any device. At the core of this vision is Alexa AI Developer Tech, a close-knit team that's dedicated to providing software developers with the tools, primitives, and services they need to easily create engaging customer experiences that expand the wealth of information, products, and services available on Alexa+. You will join a growing organization working on top technology using generative AI and have an enormous opportunity to make an impact on the design, architecture, and implementation of products used every day, by people you know. We're working hard, having fun, and making history; come join us!

Responsibilities:
- Work with a team of product and program managers, engineering leaders, and business leaders to build data architectures and platforms to support the business
- Design, develop, and operate highly scalable, high-performance, low-cost, and accurate data pipelines on distributed data processing platforms
- Recognize and adopt best practices in data processing, reporting, and analysis: data integrity, test design, analysis, validation, and documentation
- Keep up to date with big data technologies; evaluate and make decisions around the use of new or existing software products to design the data architecture
- Design, build, and own all the components of a high-volume data warehouse end to end
- Provide end-to-end data engineering support for project lifecycle execution (design, execution, and risk assessment)
- Continually improve ongoing reporting and analysis processes, automating or simplifying self-service support for customers
- Interface with other technology teams to extract, transform, and load (ETL) data from a wide variety of data sources
- Own the functional and nonfunctional scaling of software systems in your ownership area
- Implement big data solutions for distributed computing

About the team
Alexa AI Developer Tech is an organization within Alexa on a mission to empower developers to create delightful and engaging experiences by making Alexa more natural, accurate, conversational, and personalized.

Qualifications:
- 3+ years of data engineering experience
- 4+ years of SQL experience
- Experience with data modeling, warehousing, and building ETL pipelines
- Experience with AWS technologies like Redshift, S3, AWS Glue, EMR, Kinesis, Firehose, Lambda, and IAM roles and permissions
- Experience with non-relational databases / data stores (object storage, document or key-value stores, graph databases, column-family databases)
Posted 1 week ago
5.0 - 7.0 years
7 - 9 Lacs
Pune
Work from Office
AWS Developer
Experience: 5-7 years
Skills: Python, SQL, PySpark (Python + Spark), AWS (S3, Lambda, EC2)

About the Role:
We are looking for a skilled AWS Developer who specializes in building and maintaining scalable applications on the AWS cloud platform. The ideal candidate should have strong development experience with Python and PySpark, a deep understanding of AWS services, and the ability to design data-driven solutions that are efficient, secure, and highly available.

Key Responsibilities:
- Develop, deploy, and maintain applications and services on AWS using Lambda, EC2, S3, RDS, and other AWS components
- Write clean, efficient Python and PySpark code for backend services, data processing pipelines, and microservices
- Design and optimize SQL queries for data manipulation and reporting purposes
- Implement serverless solutions using AWS Lambda and API Gateway for event-driven architectures
- Build and manage large-scale data pipelines leveraging AWS services such as S3, Glue, EMR, and Athena
- Collaborate with DevOps teams to implement CI/CD pipelines for AWS deployments using tools like CodePipeline, CloudFormation, and Terraform
- Monitor, troubleshoot, and optimize AWS applications for cost, performance, and scalability
- Ensure the security and compliance of AWS-based applications according to best practices
- Work with cross-functional teams (data engineers, architects, product owners) to translate requirements into technical solutions
- Maintain thorough documentation of application architecture, design, and operational procedures

Required Skills & Experience:
- 5-7 years of experience in backend development with a focus on cloud-native applications
- Strong programming skills in Python, including data processing libraries (Pandas, PySpark)
- Solid experience with AWS services: Lambda, S3, EC2, RDS, API Gateway, CloudWatch, IAM
- Experience developing and optimizing SQL queries for large datasets
- Hands-on experience building event-driven and serverless architectures
- Understanding of networking concepts like VPCs, subnets, security groups, and load balancers in AWS
- Familiarity with source control systems (Git) and Agile development methodologies
- Strong problem-solving skills with a focus on performance and scalability

Preferred Qualifications:
- AWS Certified Developer - Associate or AWS Certified Solutions Architect - Associate is highly desirable
- Experience with infrastructure as code (IaC) tools like CloudFormation or Terraform
- Familiarity with container technologies like Docker and orchestration services like ECS or EKS
- Knowledge of best practices for securing cloud applications
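The event-driven serverless pattern this role describes (a Lambda triggered by S3 object events) typically reduces to a handler like the hypothetical sketch below. The event structure follows the standard S3 notification shape, but the processing logic and names are invented for illustration; a real handler would fetch each object (e.g., via boto3) and run the actual pipeline step:

```python
import json

def lambda_handler(event, context):
    """Hypothetical AWS Lambda entry point for S3 'ObjectCreated' events.

    Walks the event's Records, collects the bucket/key of each new
    object, and returns a summary response.
    """
    processed = []
    for record in event.get("Records", []):
        s3 = record.get("s3", {})
        bucket = s3.get("bucket", {}).get("name")
        key = s3.get("object", {}).get("key")
        if bucket and key:
            processed.append(f"s3://{bucket}/{key}")
    return {"statusCode": 200, "body": json.dumps({"processed": processed})}
```

Wired up behind API Gateway or an S3 bucket notification, the same handler shape covers most of the "serverless, event-driven" work the posting lists.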
Posted 1 week ago
9.0 - 14.0 years
12 - 14 Lacs
Bengaluru
Work from Office
Job Title: Google Cloud Platform (GCP) Architect - Vertex AI & Generative AI

Responsibilities:
- Design, implement, and maintain highly scalable cloud-based solutions using GCP services, with a focus on Vertex AI for building and deploying AI models
- Create and maintain architectures leveraging GCP products such as BigQuery, Kubernetes Engine (GKE), Cloud Functions, Pub/Sub, and others
- Work with teams to define strategies for implementing machine learning (ML) workflows, optimizing model performance, and managing data pipelines
- Utilize Vertex AI to build end-to-end ML workflows, from data collection and preprocessing to model training, deployment, and monitoring
- Leverage generative AI models and techniques to deliver cutting-edge AI solutions, particularly in content generation, data augmentation, or natural language processing
- Apply experience with agentic AI frameworks
- Implement best practices in deploying and scaling machine learning models in the cloud
- Partner with cross-functional teams (data scientists, engineers, product managers) to understand business requirements and deliver technical solutions
- Drive continuous improvement and innovation in cloud-native AI solutions

Skills & Experience: 10+ years of IT experience

Required:
Deep knowledge of the GCP ecosystem, with strong proficiency in GCP services including but not limited to:
- Vertex AI for end-to-end AI/ML lifecycle management
- BigQuery for large-scale data processing
- Google Kubernetes Engine (GKE) for container orchestration
- Cloud Functions, Cloud Pub/Sub, Cloud Storage, and others
- Experience designing, deploying, and optimizing GCP-based architectures in production environments

Expertise in Vertex AI:
- Experience using Vertex AI for building, training, and deploying machine learning models at scale
- Proficiency with Vertex AI Workbench, Pipelines, and Model Monitoring
- Familiarity with AutoML and custom model training on Vertex AI

Experience with Generative AI:
- Hands-on experience with generative AI techniques and models, such as GPT (Generative Pre-trained Transformer), GANs (Generative Adversarial Networks), or other advanced natural language models
- Familiarity with applying generative AI in real-world scenarios, such as content generation, AI-driven chatbots, synthetic data generation, or AI-enhanced user experiences

Experience with Agentic AI:
- Experience with an agentic AI framework (e.g., Google Agentspace)

Strong Programming and Scripting Skills:
- Proficiency in Python (required for machine learning tasks), along with experience in ML libraries like TensorFlow, PyTorch, or Scikit-learn
- Familiarity with SQL for querying data in BigQuery

Cloud Security and Governance:
- Strong understanding of cloud security practices (IAM roles, service accounts, encryption)
- Experience with GCP security tools (e.g., Cloud Identity, Security Command Center)

Certifications:
- Google Cloud Professional Cloud Architect or Google Cloud Professional Data Engineer certification
- Vertex AI or AI/ML-related certifications

Education: Master's degree/PhD in Computer Science, Data Science/AI & ML, or a related technical field, or equivalent practical experience
Posted 1 week ago
3.0 - 8.0 years
5 - 9 Lacs
Chennai
Work from Office
Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must-have skills: PySpark
Good-to-have skills: NA
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As an Application Developer, you will be responsible for designing, building, and configuring applications to meet business process and application requirements. You will play a crucial role in developing innovative solutions to enhance business operations and efficiency.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Required active participation/contribution in team discussions.
- Contribute to providing solutions to work-related problems.
- Develop and implement efficient PySpark applications.
- Collaborate with team members to analyze and address application requirements.
- Troubleshoot and resolve technical issues in application development.
- Stay updated with the latest trends and technologies in PySpark.
- Provide technical guidance and support to junior team members.

Professional & Technical Skills:
- Must-have skills: Proficiency in PySpark.
- Strong understanding of data processing and manipulation using PySpark.
- Experience in building scalable and efficient PySpark applications.
- Knowledge of PySpark libraries and frameworks.
- Good-to-have skills: Experience with data visualization tools.

Additional Information:
- The candidate should have a minimum of 3 years of experience in PySpark.
- This position is based at our Chennai office.
- A 15 years full time education is required.
Posted 1 week ago
3.0 - 8.0 years
5 - 9 Lacs
Bengaluru
Work from Office
Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must-have skills: Microsoft Azure Databricks
Good-to-have skills: NA
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As an Application Developer, you will be responsible for designing, building, and configuring applications to meet business process and application requirements. You will play a crucial role in developing solutions that align with organizational goals and enhance operational efficiency.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Required active participation/contribution in team discussions.
- Contribute to providing solutions to work-related problems.
- Collaborate with cross-functional teams to understand project requirements and deliver high-quality solutions.
- Develop and maintain applications using Microsoft Azure Databricks.
- Troubleshoot and debug applications to ensure optimal performance.
- Implement best practices for application development and deployment.
- Stay updated with the latest technologies and trends in application development.

Professional & Technical Skills:
- Must-have skills: Proficiency in Microsoft Azure Databricks.
- Strong understanding of cloud computing principles and services.
- Experience with data processing and analytics using Azure services.
- Knowledge of programming languages such as Python, Scala, or SQL.
- Hands-on experience in building and deploying applications on the Azure cloud platform.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Microsoft Azure Databricks.
- This position is based at our Bengaluru office.
- A 15 years full time education is required.
Posted 1 week ago
15.0 - 20.0 years
5 - 9 Lacs
Bengaluru
Work from Office
Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must-have skills: Databricks Unified Data Analytics Platform
Good-to-have skills: Python (Programming Language), Apache Airflow
Minimum 12 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. Your typical day will involve collaborating with teams to develop solutions that align with business needs and requirements, ensuring efficient application performance and functionality.

Roles & Responsibilities:
- Expected to be an SME
- Collaborate with and manage the team to perform
- Responsible for team decisions
- Engage with multiple teams and contribute to key decisions
- Expected to provide solutions to problems that apply across multiple teams
- Lead the application development process
- Implement best practices for application design and development
- Conduct code reviews and ensure code quality standards are met

Professional & Technical Skills:
- Must-have skills: Proficiency in Databricks Unified Data Analytics Platform
- Good-to-have skills: Experience with Python (Programming Language), Apache Airflow
- Strong understanding of data analytics and data processing
- Experience in building and optimizing data pipelines
- Knowledge of cloud platforms and services for data processing

Additional Information:
- The candidate should have a minimum of 12 years of experience in Databricks Unified Data Analytics Platform
- This position is based at our Bengaluru office
- A 15 years full-time education is required
Posted 1 week ago
3.0 - 8.0 years
5 - 9 Lacs
Bengaluru
Work from Office
Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must-have skills: Data Engineering
Good-to-have skills: NA
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As an Application Developer, you will be responsible for designing, building, and configuring applications to meet business process and application requirements. Your typical day will involve collaborating with team members to develop innovative solutions and ensure seamless application functionality.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Required active participation/contribution in team discussions.
- Contribute to providing solutions to work-related problems.
- Develop and implement efficient data pipelines for data processing.
- Optimize data storage and retrieval processes to enhance performance.

Professional & Technical Skills:
- Must-have skills: Proficiency in Data Engineering.
- Strong understanding of ETL processes and data modeling.
- Experience with cloud platforms such as AWS or Azure.
- Knowledge of programming languages like Python or Java.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Data Engineering, and must have Flink knowledge.
- This position is based at our Bengaluru office.
- A 15 years full time education is required.
Posted 1 week ago
9.0 - 10.0 years
14 - 15 Lacs
Pune
Work from Office
HSBC Electronic Data Processing India Pvt Ltd is looking for a DataStage / Senior Consultant Specialist to join our dynamic team and embark on a rewarding career journey.
- Advises clients on complex business and technology challenges
- Leads solution design, implementation, and delivery
- Provides subject-matter expertise across multiple domains
- Drives strategic initiatives and stakeholder alignment
Posted 1 week ago
6.0 - 11.0 years
8 - 12 Lacs
Hyderabad
Work from Office
As a Senior Data Engineer, your role is to spearhead the data engineering teams and elevate the team to the next level! You will be responsible for laying out the architecture of the new project as well as selecting the tech stack associated with it. You will plan out the development cycles, deploying Agile where possible, and create the foundations for good data stewardship with our new data products! You will also set up a solid code framework that is built to purpose yet has enough flexibility to adapt to new business use cases, a tough but rewarding challenge!

Responsibilities:
- Collaborate with several stakeholders to deeply understand the needs of data practitioners in order to deliver at scale
- Lead Data Engineers to define, build, and maintain the Data Platform
- Work on building a Data Lake in Azure Fabric, processing data from multiple sources
- Migrate the existing data store from Azure Synapse to Azure Fabric
- Implement data governance and access control
- Drive development effort end-to-end for on-time delivery of high-quality solutions that conform to requirements, conform to the architectural vision, and comply with all applicable standards
- Present technical solutions, capabilities, considerations, and features in business terms
- Effectively communicate status, issues, and risks in a precise and timely manner
- Further develop critical initiatives, such as Data Discovery, Data Lineage, and Data Quality
- Lead the team and mentor junior resources
- Help your team members grow in their roles and achieve their career aspirations
- Build data systems, pipelines, analytical tools, and programs
- Conduct complex data analysis and report on results
Posted 1 week ago
2.0 - 5.0 years
6 - 10 Lacs
Gurugram
Work from Office
Join us as a Software Engineer

This is an opportunity for a driven Software Engineer to take on an exciting new career challenge. Day-to-day, you'll be engineering and maintaining innovative, customer-centric, high-performance, secure and robust solutions. It's a chance to hone your existing technical skills and advance your career while building a wide network of stakeholders. We're offering this role at associate level.

What you'll do
In your new role, you'll be working within a feature team to engineer software, scripts and tools, as well as liaising with other engineers, architects and business analysts across the platform. You'll also be:
- Producing complex and critical software rapidly and of high quality which adds value to the business
- Working in permanent teams who are responsible for the full life cycle, from initial development, through enhancement and maintenance to replacement or decommissioning
- Collaborating to optimise our software engineering capability
- Designing, producing, testing and implementing our working software solutions
- Working across the life cycle, from requirements analysis and design, through coding to testing, deployment and operations

The skills you'll need
To take on this role, you'll need a background in software engineering, software design, and architecture, and an understanding of how your area of expertise supports our customers. You'll also need:
- Experience of working with development and testing tools, bug tracking tools and wikis
- At least five years' experience in designing and developing SQL queries, views, and stored procedures
- Experience building and maintaining ETL pipelines for data processing and ensuring data quality, consistency, and performance
- Experience with SQL Server, SSIS and TeamCity, and with GitHub for version control and deployment
- Experience of DevOps and Agile methodology and associated toolsets
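The SQL skills this role centers on (queries, views, aggregation for reporting) can be illustrated with a small self-contained sketch. SQLite stands in here for SQL Server, and the table, view, and column names are all invented for the example:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (id INTEGER PRIMARY KEY, customer TEXT, total REAL);
    INSERT INTO orders (customer, total) VALUES
        ('acme', 120.0), ('acme', 80.0), ('globex', 50.0);
    -- a reporting view of the kind the role maintains
    CREATE VIEW customer_totals AS
        SELECT customer, SUM(total) AS revenue, COUNT(*) AS n_orders
        FROM orders GROUP BY customer;
""")
rows = conn.execute(
    "SELECT customer, revenue FROM customer_totals ORDER BY revenue DESC"
).fetchall()
print(rows)  # [('acme', 200.0), ('globex', 50.0)]
```

The view pattern matters for ETL work: downstream reports query `customer_totals` without knowing how the underlying table is loaded, so the pipeline can change its loading strategy without breaking consumers.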
Posted 1 week ago
6.0 - 8.0 years
8 - 10 Lacs
Ahmedabad
Work from Office
Job Summary: We are seeking a Senior Data Engineer with hands-on experience building scalable data pipelines using Microsoft Fabric. The role focuses on delivering ingestion, transformation, and enrichment workflows across a medallion architecture.

Key Responsibilities:
- Develop and maintain data pipelines using Microsoft Fabric Data Factory and OneLake.
- Design and build ingestion and transformation pipelines for structured and unstructured data.
- Implement frameworks for metadata tagging, version control, and batch tracking.
- Ensure security, quality, and compliance of data pipelines.
- Contribute to CI/CD integration, observability, and documentation.
- Collaborate with data architects and analysts to meet business requirements.

Qualifications:
- 6+ years of experience in data engineering; 2+ years working on Microsoft Fabric or Azure Data services.
- Hands-on with tools like Azure Data Factory, Fabric, Databricks, or Synapse.
- Strong SQL and data processing skills (e.g., PySpark, Python).
- Experience with data cataloging, lineage, and governance frameworks.
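The medallion idea the posting refers to promotes data through raw (bronze), cleaned (silver), and curated (gold) layers, stamping each row with batch metadata along the way. A hedged sketch of one bronze-to-silver step, with all column names and tags assumed for illustration:

```python
import pandas as pd

def bronze_to_silver(bronze: pd.DataFrame, batch_id: str) -> pd.DataFrame:
    """Promote raw (bronze) rows to a cleaned (silver) table, stamping
    each row with the batch that produced it for lineage/batch tracking."""
    silver = (bronze.dropna(subset=["record_id"])   # drop rows missing the key
                    .drop_duplicates("record_id")   # dedupe on the key
                    .copy())
    silver["batch_id"] = batch_id                   # batch-tracking metadata
    silver["ingested_by"] = "fabric_pipeline"       # hypothetical source tag
    return silver

bronze = pd.DataFrame({"record_id": [1, 1, None, 2], "value": [10, 10, 3, 7]})
silver = bronze_to_silver(bronze, batch_id="batch-001")
print(silver[["record_id", "batch_id"]])
```

Carrying `batch_id` on every silver row is what makes the "batch tracking" requirement workable later: a bad load can be identified and reprocessed by batch rather than by rebuilding the whole layer.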
Posted 1 week ago
12.0 - 18.0 years
40 - 45 Lacs
Bengaluru
Work from Office
Not Applicable
Specialism: Data, Analytics & AI
Management Level: Director

Summary: We are seeking an experienced Senior Data Architect to lead the design and development of our data architecture, leveraging cloud-based technologies, big data processing frameworks, and DevOps practices. The ideal candidate will have a strong background in data warehousing, data pipelines, performance optimization, and collaboration with DevOps teams.

Responsibilities:
1. Design and implement end-to-end data pipelines using cloud-based services (AWS/GCP/Azure) and conventional data processing frameworks.
2. Lead the development of data architecture, ensuring scalability, security, and performance.
3. Collaborate with cross-functional teams, including DevOps, to design and implement data lakes, data warehouses, and data ingestion/extraction processes.
4. Develop and optimize data processing workflows using PySpark, Kafka, and other big data processing frameworks.
5. Ensure data quality, integrity, and security across all data pipelines and architectures.
6. Provide technical leadership and guidance to junior team members.
7. Design and implement data load strategies, data partitioning, and data storage solutions.
8. Collaborate with stakeholders to understand business requirements and develop data solutions to meet those needs.
9. Work closely with the DevOps team to ensure seamless integration of data pipelines with the overall system architecture.
10. Participate in the design and implementation of CI/CD pipelines for data workflows.

DevOps Requirements:
1. Knowledge of DevOps practices and tools, such as Jenkins, GitLab CI/CD, or Apache Airflow.
2. Experience with containerization using Docker.
3. Understanding of infrastructure as code (IaC) concepts using tools like Terraform or AWS CloudFormation.
4. Familiarity with monitoring and logging tools, such as Prometheus, Grafana, or the ELK Stack.

Requirements:
1. 12-14 years of experience as a Senior Data Architect in data architecture, data warehousing, and big data processing.
2. Strong expertise in cloud-based technologies (AWS/GCP/Azure) and data processing frameworks (PySpark, Kafka, Flink, Beam, etc.).
3. Experience with data ingestion, data extraction, data warehousing, and data lakes.
4. Strong understanding of performance optimization, data partitioning, and data storage solutions.
5. Excellent leadership and communication skills.
6. Experience with NoSQL databases is a plus.

Mandatory skill sets:
1. Experience with agile development methodologies.
2. Certification in cloud-based technologies (AWS/GCP/Azure) or data processing frameworks.
3. Experience with data governance, data quality, and data security.

Preferred skill sets: Knowledge of Agentic AI and GenAI is an added advantage.

Years of experience required: 12 to 18 years
Education qualification: Graduate Engineer or Management Graduate
Degrees/Field of Study required: Bachelor of Engineering
Required Skills: AWS DevOps, Accepting Feedback, Active Listening, Analytical Reasoning, Analytical Thinking, Application Software, Business Data Analytics, Business Management, Business Technology, Business Transformation, Coaching and Feedback, Communication, Creativity, Documentation Development, Embracing Change, Emotional Regulation, Empathy, Implementation Research, Implementation Support, Implementing Technology, Inclusion, Influence, Innovation, Intellectual Curiosity, Learning Agility {+ 28 more}
Posted 1 week ago
3.0 - 5.0 years
6 - 10 Lacs
Chennai
Work from Office
Job Title: Java Developer - IoT Applications
Location: Chennai, India
Experience: 3-5 Years
Employment Type: Full-Time

Job Summary: We are seeking a highly motivated Java Developer with hands-on experience in Internet of Things (IoT) solutions. The ideal candidate will have a strong background in Java-based backend development and a working knowledge of IoT protocols, device communication, and cloud integration. You will collaborate with cross-functional teams to build scalable and secure IoT platforms.

Required Skills:
- Strong programming skills in Java / Spring Boot / REST APIs.
- Experience with IoT protocols (e.g., MQTT, CoAP, WebSockets).
- Knowledge of device communication, edge computing, or sensor data processing.
- Experience with cloud platforms (e.g., AWS IoT, Azure IoT Hub, Google Cloud IoT).
- Familiarity with NoSQL or time-series databases (e.g., MongoDB, InfluxDB).
- Understanding of security in IoT systems (authentication, encryption).
- Working knowledge of Linux environments and Docker is a plus.
Posted 1 week ago
4.0 - 8.0 years
4 - 8 Lacs
Bengaluru
Work from Office
Provides high quality, on-time input to client projects in the life sciences field. Assignments range in complexity from basic analysis and problem solving to assisting in the development of more complex solutions. May serve as project leader for small teams or work streams. Essential Functions Develop online survey using effective survey programming tools, viz. Decipher, Confirmit, Sawtooth etc. Assist in complex custom scripts using jQuery/ JavaScript Assists with the review and analysis of client requirements or problems and assists in the development of proposals and client solutions. Assists in the development of detailed documentation and specifications. Performs quantitative or qualitative analyses to assist in the identification of client issues and the development of client specific solutions. Assists in the design/structure and completion of presentations that are appropriate to the characteristics or needs of the audience. Develops, and may present, complete client deliverables within known/identified frameworks and methodologies. Proactively develops a basic knowledge of consulting methodologies and the life sciences market through the delivery of consulting engagements and participation in formal and informal learning opportunities. Engagement based responsibilities are assigned and managed by Senior Consultants, Engagement Managers or Principals. Strong analytical and problem-solving skills with experience in data interpretation Ability to work in a fast-paced environment and manage multiple projects simultaneously Qualifications Bachelors Degree required 4-8 years of related experience required Works willingly and effectively with others in and across the organization to accomplish team goals. Knowledge of data processing/ analysis tools, viz. SPSS, Wincross is a good to have skill. 
Knowledge and understanding of the fundamental processes of business, their interaction, and the impact of external/internal influences on decision making, growth and decline. Knowledge of consulting methods, tools and techniques related to one's functional area. Knowledge of current events and developments within an industry and major competitors. Effective time & team management skills.
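The quantitative analyses mentioned above typically produce cross-tabulations of survey responses, the bread-and-butter output of tools like SPSS and Wincross. A minimal Python sketch of the idea, using invented field names:

```python
from collections import Counter

def crosstab(rows, row_key, col_key):
    """Build a simple frequency cross-tabulation from survey records."""
    counts = Counter((r[row_key], r[col_key]) for r in rows)
    row_vals = sorted({r[row_key] for r in rows})
    col_vals = sorted({r[col_key] for r in rows})
    # Nested dict: outer key = row category, inner key = column category
    return {rv: {cv: counts.get((rv, cv), 0) for cv in col_vals} for rv in row_vals}

# Hypothetical awareness question split by region
responses = [
    {"region": "North", "aware": "Yes"},
    {"region": "North", "aware": "No"},
    {"region": "South", "aware": "Yes"},
    {"region": "North", "aware": "Yes"},
]
table = crosstab(responses, "region", "aware")
```

Real survey platforms add weighting, significance testing, and banner tables on top of this basic counting step.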
Posted 1 week ago
3.0 - 8.0 years
6 - 11 Lacs
Pune
Work from Office
Senior Executive - Data Management Product data management - Verifies the accuracy and consistency of the data collected before encoding it in the information systems (Group Referential for Articles, ...). - Analyzes the impact on other processes (sales forecasts, product deployment, marketing, ...) and coordinates actions if necessary. - Ensures that data providers (Marketing, Technical Department, etc.) respect data processing procedures and report cases of incorrect data. - Enters/updates data (item codes, considering homologations, managing assimilation, ...), ensuring that they comply and that the appropriate level of quality is maintained. - Cleans up obsolete, unused, incorrect, or duplicate data. - Optimizes the data of its perimeter to limit the number of references present in the Supply Chain processes (impact on stocks). - Sends the information necessary for the proper management of stocks in factory warehouses (storage instructions according to item codes). - Identifies rejects (notably BESHEB and AL rejects) and manages the corrections in the information systems. Quality of product data - Controls data quality according to the defined framework (standards, instructions, quality rules, ...). - Identifies the needs for the evolution of quality controls to improve the detection of non-quality. Problem solving - Analyzes operational problems and incidents. - Manages resolutions and corrective actions. Methods of work - Applies the methods and practices necessary to manage the complete life cycle of product data from its creation to its complete cessation. - Contributes to continuous improvement (processes, tools, organization, operating procedures, ...) to improve data management and quality (Progress Plan).
Technical Expertise: SQL - Basic; MS Excel - Advanced; MS Access - Beginner/Advanced (good to have); Power BI (good to have). Recommended educational qualifications: Graduate or post-graduate. Recommended years of experience: 3+ years.
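The duplicate-cleanup duty described above can be sketched in a few lines of Python; the field names (`item_code`, `status`, `updated`) are illustrative assumptions, not the actual Group Referential schema:

```python
def clean_references(records):
    """Drop obsolete references and collapse duplicates, keeping the
    most recently updated record per normalised item code."""
    latest = {}
    for rec in records:
        code = rec["item_code"].strip().upper()  # normalise before comparing
        if rec.get("status") == "obsolete":
            continue  # obsolete data is cleaned up, not migrated
        prev = latest.get(code)
        # ISO date strings compare correctly as plain strings
        if prev is None or rec["updated"] > prev["updated"]:
            latest[code] = {**rec, "item_code": code}
    return list(latest.values())

records = [
    {"item_code": "ab1", "status": "active", "updated": "2024-01-01"},
    {"item_code": "AB1 ", "status": "active", "updated": "2024-03-01"},
    {"item_code": "CD2", "status": "obsolete", "updated": "2024-02-01"},
]
cleaned = clean_references(records)
```

A real referential cleanup would also log what was removed and route corrections back to the data providers, as the posting describes.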
Posted 1 week ago
3.0 - 8.0 years
5 - 9 Lacs
Bengaluru
Work from Office
Project Role: Application Developer Project Role Description: Design, build and configure applications to meet business process and application requirements. Must have skills: Databricks Unified Data Analytics Platform. Good to have skills: NA. Minimum 3 year(s) of experience is required. Educational Qualification: 15 years full time education. Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. You will collaborate with teams to ensure seamless integration and functionality of applications. Roles & Responsibilities: - Expected to perform independently and become an SME. - Required active participation/contribution in team discussions. - Contribute to providing solutions to work-related problems. - Develop and implement software solutions to meet business requirements. - Collaborate with cross-functional teams to ensure application functionality. - Conduct testing and debugging of applications to ensure optimal performance. - Provide technical support and troubleshooting for application issues. - Stay updated on industry trends and technologies to enhance application development. Professional & Technical Skills: - Must-Have Skills: Proficiency in Databricks Unified Data Analytics Platform. - Strong understanding of data analytics and data processing techniques. - Experience with cloud-based data platforms and services. - Knowledge of programming languages such as Python, Scala, or SQL. - Hands-on experience in designing and developing scalable applications. Additional Information: - The candidate should have a minimum of 3 years of experience in Databricks Unified Data Analytics Platform. - This position is based at our Bengaluru office. - A 15 years full-time education is required. Qualification: 15 years full time education.
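Databricks itself cannot be demonstrated in a few lines, but the per-key SQL aggregation this kind of role involves can be sketched with Python's stdlib sqlite3; the same query shape would run as Spark SQL on the platform (table and column names here are invented):

```python
import sqlite3

# In-memory database standing in for a Delta table
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (user_id TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO events VALUES (?, ?)",
    [("u1", 10.0), ("u1", 5.0), ("u2", 7.5)],
)
# Aggregate per user -- the same GROUP BY would run unchanged as Spark SQL
rows = conn.execute(
    "SELECT user_id, SUM(amount) AS total FROM events "
    "GROUP BY user_id ORDER BY user_id"
).fetchall()
```

On Databricks the equivalent would be `spark.sql(...)` over a registered table; the SQL itself is portable.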
Posted 1 week ago
3.0 - 8.0 years
7 - 11 Lacs
Pune
Work from Office
Senior Executive - Data Management Product data management - Verifies the accuracy and consistency of the data collected before encoding it in the information systems (Group Referential for Articles, ...). - Analyzes the impact on other processes (sales forecasts, product deployment, marketing, ...) and coordinates actions if necessary. - Ensures that data providers (Marketing, Technical Department, etc.) respect data processing procedures and report cases of incorrect data. - Enters/updates data (item codes, considering homologations, managing assimilation, ...), ensuring that they comply and that the appropriate level of quality is maintained. - Cleans up obsolete, unused, incorrect, or duplicate data. - Optimizes the data of its perimeter to limit the number of references present in the Supply Chain processes (impact on stocks). - Sends the information necessary for the proper management of stocks in factory warehouses (storage instructions according to item codes). - Identifies rejects (notably BESHEB and AL rejects) and manages the corrections in the information systems. Quality of product data - Controls data quality according to the defined framework (standards, instructions, quality rules, ...). - Identifies the needs for the evolution of quality controls to improve the detection of non-quality. Problem solving - Analyzes operational problems and incidents. - Manages resolutions and corrective actions. Methods of work - Applies the methods and practices necessary to manage the complete life cycle of product data from its creation to its complete cessation. - Contributes to continuous improvement (processes, tools, organization, operating procedures, ...) to improve data management and quality (Progress Plan). Technical Expertise: SQL - Basic; MS Excel - Advanced; MS Access - Beginner/Advanced (good to have); Power BI (good to have). Recommended educational qualifications: Graduate or post-graduate. Recommended years of experience: 3+ years.
Posted 1 week ago
4.0 - 8.0 years
4 - 8 Lacs
Bengaluru
Work from Office
Job Overview Provides high quality, on-time input to client projects in the life sciences field. Assignments range in complexity from basic analysis and problem solving to assisting in the development of more complex solutions. May serve as project leader for small teams or work streams. Essential Functions Develops online surveys using effective survey programming tools, viz. Decipher, Confirmit, Sawtooth, etc. Assists with complex custom scripts using jQuery/JavaScript. Assists with the review and analysis of client requirements or problems and assists in the development of proposals and client solutions. Assists in the development of detailed documentation and specifications. Performs quantitative or qualitative analyses to assist in the identification of client issues and the development of client-specific solutions. Assists in the design/structure and completion of presentations that are appropriate to the characteristics or needs of the audience. Develops, and may present, complete client deliverables within known/identified frameworks and methodologies. Proactively develops a basic knowledge of consulting methodologies and the life sciences market through the delivery of consulting engagements and participation in formal and informal learning opportunities. Engagement-based responsibilities are assigned and managed by Senior Consultants, Engagement Managers or Principals. Strong analytical and problem-solving skills with experience in data interpretation. Ability to work in a fast-paced environment and manage multiple projects simultaneously. Qualifications Bachelor's Degree required. 4-8 years of related experience required. Works willingly and effectively with others in and across the organization to accomplish team goals. Knowledge of data processing/analysis tools, viz. SPSS, Wincross, is good to have.
Knowledge and understanding of the fundamental processes of business, their interaction, and the impact of external/internal influences on decision making, growth and decline. Knowledge of consulting methods, tools and techniques related to one's functional area. Knowledge of current events and developments within an industry and major competitors. Effective time & team management skills. We create intelligent connections to accelerate the development and commercialization of innovative medical treatments to help improve patient outcomes and population health worldwide. Learn more at https://jobs.iqvia.com
Posted 1 week ago
3.0 - 5.0 years
13 - 14 Lacs
Chennai
Work from Office
A Conversion Professional is responsible for the timely and accurate conversion of new and existing Bank/Client data to Fiserv systems, from both internal and external sources. This role provides data analysis for client projects and accommodates other ad hoc data updates to meet client requests. As part of the overall Service Delivery organization, a Conversion Professional plays a critical role in mapping in data to support project initiatives for new and existing banks. Working with financial services data means a high priority on accuracy and adherence to procedures and guidelines. What will you do The person stepping in as the backup would need to review the specification history and then review and understand the code being developed to resolve the issue and/or change. This would also have to occur on the switch back to the original developer. Today, the associate handling the project would log back in to support the effort and address the issue and/or change. What you will need to have Bachelor's degree in programming or related field. Minimum 3 years of relevant experience in data processing (ETL) conversions or the financial services industry. 3-5 years of experience and strong knowledge of MS SQL/PSQL, MS SSIS and data warehousing concepts. Strong communication skills and ability to provide technical information to non-technical colleagues.
Team player with the ability to work independently. Experience in the full software development life cycle using agile methodologies. Should have a good understanding of Agile methodologies and be able to handle agile ceremonies. Efficient in reviewing, coding, testing, and debugging of application/bank programs. Should be able to work under pressure while resolving critical issues in the Prod environment. Good communication skills and experience working with clients. Good understanding of the Banking Domain. What would be great to have Experience with Informatica, Power BI, MS Visual Basic, Microsoft Access and Microsoft Excel. Experience with Card Management systems; debit card processing is a plus. Strong communication skills and ability to provide technical information to non-technical colleagues. Ability to manage and prioritize a work queue across multiple workstreams. Highest attention to detail and accuracy. Thank you for considering employment with Fiserv. Please: Apply using your legal name. Complete the step-by-step profile and attach your resume (either is acceptable, both are preferable). Our commitment to Diversity and Inclusion: Fiserv is proud to be an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, national origin, gender, gender identity, sexual orientation, age, disability, protected veteran status, or any other category protected by law. Note to agencies: Fiserv does not accept resume submissions from agencies outside of existing agreements. Please do not send resumes to Fiserv associates. Fiserv is not responsible for any fees associated with unsolicited resume submissions. Warning about fake job posts: Please be aware of fraudulent job postings that are not affiliated with Fiserv.
Fraudulent job postings may be used by cyber criminals to target your personally identifiable information and/or to steal money or financial information. Any communications from a Fiserv representative will come from a legitimate Fiserv email address.
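The conversion work the Fiserv posting describes is essentially field mapping plus validation. A minimal Python sketch of that step; the field names and mapping are hypothetical, not an actual Fiserv schema:

```python
def convert_record(src, mapping, required):
    """Map a source bank record onto a target schema and flag
    missing required fields (hypothetical field names throughout)."""
    out = {dst: src.get(s) for dst, s in mapping.items()}
    # A required target field is 'missing' if absent or blank in the source
    missing = [dst for dst in required if out.get(dst) in (None, "")]
    return out, missing

# Hypothetical source-to-target mapping for one account record
mapping = {"acct_no": "AccountNumber", "balance": "CurrentBalance"}
rec, missing = convert_record(
    {"AccountNumber": "001", "CurrentBalance": "250.00"},
    mapping,
    required=["acct_no", "balance"],
)
```

In practice this mapping would live in SSIS packages or SQL scripts, with the missing-field report feeding the conversion team's exception queue.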
Posted 1 week ago
7.0 - 12.0 years
5 - 9 Lacs
Pune
Work from Office
Project Role: Application Developer Project Role Description: Design, build and configure applications to meet business process and application requirements. Must have skills: Elastic Stack (ELK). Good to have skills: NA. Minimum 7.5 year(s) of experience is required. Educational Qualification: 15 years full time education. Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. Your typical day will involve collaborating with team members, analyzing requirements, and developing solutions to meet business needs. Roles & Responsibilities: - Expected to be an SME. - Collaborate with and manage the team to perform. - Responsible for team decisions. - Engage with multiple teams and contribute to key decisions. - Provide solutions to problems for their immediate team and across multiple teams. - Lead the team in implementing new technologies. - Conduct code reviews and ensure coding standards are met. Professional & Technical Skills: - Must-Have Skills: Proficiency in Elastic Stack (ELK). - Strong understanding of data analytics and visualization. - Experience with data processing and transformation. - Knowledge of cloud technologies and deployment. - Hands-on experience in application development using Elastic Stack (ELK). Additional Information: - The candidate should have a minimum of 7.5 years of experience in Elastic Stack (ELK). - This position is based at our Pune office. - A 15 years full-time education is required. Qualification: 15 years full time education.
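At the core of the Elastic Stack named above is an inverted index mapping terms to the documents that contain them. A toy Python version of that data structure (real Elasticsearch adds text analysis, relevance scoring, and sharding on top):

```python
from collections import defaultdict

def build_index(docs):
    """Build a minimal inverted index: term -> set of doc ids."""
    index = defaultdict(set)
    for doc_id, text in docs.items():
        for term in text.lower().split():  # naive whitespace tokeniser
            index[term].add(doc_id)
    return index

def search(index, term):
    """Return the sorted ids of documents containing the term."""
    return sorted(index.get(term.lower(), set()))

# Hypothetical log lines, the typical ELK workload
docs = {1: "error in payment service", 2: "payment completed", 3: "service restarted"}
idx = build_index(docs)
```

Lookup is a dictionary access per term, which is why full-text search stays fast even as the corpus grows; Elasticsearch distributes the same structure across shards.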
Posted 1 week ago
3.0 - 8.0 years
5 - 9 Lacs
Bengaluru
Work from Office
Project Role: Application Developer Project Role Description: Design, build and configure applications to meet business process and application requirements. Must have skills: IBM WebSphere DataPower. Good to have skills: Product and Market Strategy. Minimum 3 year(s) of experience is required. Educational Qualification: 15 years full time education. Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. You will collaborate with team members to ensure successful project delivery and application functionality. Roles & Responsibilities: - Expertise in XSLT or GatewayScript: Proficient in using XSLT for transforming and processing data. - REST and SOAP Web Services: Extensive experience in developing and managing REST-based and SOAP-based web services using IBM DataPower. - Code Migration and Implementation: Skilled in migrating and implementing code on DataPower appliances. - Solution Development: Proven ability to develop solutions using Web-Service Proxies, Multi-Protocol Gateways (MPG), and XML Firewalls. - XML and Related Technologies: Strong knowledge of XML, WSDL, XSLT, JSON, XML Schema, and XPATH.
Professional & Technical Skills: - Must-Have Skills: Expertise in XSLT or GatewayScript: Proficient in using GatewayScript for transforming and processing data. - Good-to-Have Skills: Strong understanding of REST and SOAP Web Services: Extensive experience in developing and managing REST-based and SOAP-based web services using IBM DataPower. - Familiarity with Code Migration and Implementation: Skilled in migrating and implementing code on DataPower appliances. - Strong knowledge of JSON & schema. - Solution Development: Proven ability to develop solutions using Web-Service Proxies and Multi-Protocol Gateways (MPG). Additional Information: - The candidate should have a minimum of 3 years of experience in IBM WebSphere DataPower. - This position is based at our Bengaluru office. - A 15 years full-time education is required. Qualification: 15 years full time education.
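The XML-to-JSON message transformation that a DataPower XSLT or GatewayScript policy performs can be sketched with Python's stdlib for flat, attribute-free payloads (an illustration of the transformation shape, not DataPower code):

```python
import json
import xml.etree.ElementTree as ET

def xml_to_dict(xml_text):
    """Flatten a simple (non-repeating, attribute-free) XML payload
    into a dict of tag -> text, ready for JSON serialisation."""
    root = ET.fromstring(xml_text)
    return {child.tag: child.text for child in root}

# Hypothetical order message arriving as XML, leaving as JSON
payload = "<order><id>42</id><status>paid</status></order>"
as_json = json.dumps(xml_to_dict(payload), sort_keys=True)
```

On a DataPower appliance the same mapping would be expressed declaratively in XSLT (or imperatively in GatewayScript) inside a Multi-Protocol Gateway policy; nested and repeating elements need recursive handling this sketch omits.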
Posted 1 week ago
3.0 - 8.0 years
5 - 9 Lacs
Chennai
Work from Office
Project Role: Application Developer Project Role Description: Design, build and configure applications to meet business process and application requirements. Must have skills: Apache Hadoop. Good to have skills: NA. Minimum 3 year(s) of experience is required. Educational Qualification: 15 years full time education. Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with team members to understand project needs, developing application features, and ensuring that the applications function smoothly and efficiently. You will also engage in testing and troubleshooting to enhance application performance and user experience, while continuously seeking ways to improve processes and deliver high-quality solutions. Roles & Responsibilities: - Expected to perform independently and become an SME. - Required active participation/contribution in team discussions. - Contribute to providing solutions to work-related problems. - Assist in the documentation of application processes and workflows. - Engage in code reviews to ensure best practices and quality standards are met. Professional & Technical Skills: - Must-Have Skills: Proficiency in Apache Hadoop. - Strong understanding of distributed computing principles and frameworks. - Experience with data processing and analysis using Hadoop ecosystem tools. - Familiarity with programming languages such as Java or Python. - Knowledge of data storage solutions and data management best practices. Additional Information: - The candidate should have a minimum of 3 years of experience in Apache Hadoop. - This position is based at our Chennai office. - A 15 years full-time education is required. Qualification: 15 years full time education.
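The distributed computing principle behind Hadoop is MapReduce, and its classic example is word count. The map, shuffle, and reduce phases can be simulated in a few lines of Python (illustrative only; a real job runs the mappers and reducers distributed across HDFS blocks):

```python
from collections import defaultdict
from itertools import chain

def map_phase(line):
    """Mapper: emit (word, 1) pairs for each word in an input line."""
    return [(word, 1) for word in line.split()]

def reduce_phase(pairs):
    """Shuffle + reducer: group pairs by key and sum the counts."""
    grouped = defaultdict(int)
    for word, count in pairs:
        grouped[word] += count
    return dict(grouped)

lines = ["big data", "big deal"]
counts = reduce_phase(chain.from_iterable(map_phase(l) for l in lines))
```

In Hadoop proper, the framework sorts mapper output by key and streams each key's values to a reducer on some node; the `defaultdict` here stands in for that shuffle.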
Posted 1 week ago
15.0 - 20.0 years
5 - 9 Lacs
Navi Mumbai
Work from Office
Project Role: Application Developer Project Role Description: Design, build and configure applications to meet business process and application requirements. Must have skills: Apache Spark. Good to have skills: Java Enterprise Edition. Minimum 5 year(s) of experience is required. Educational Qualification: 15 years full time education. Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with team members to understand project specifications, developing application features, and ensuring that the applications are aligned with business needs. You will also engage in problem-solving discussions and contribute to the overall success of the project by implementing effective solutions and enhancements. Roles & Responsibilities: - Expected to be an SME; collaborate with and manage the team to perform. - Responsible for team decisions. - Engage with multiple teams and contribute to key decisions. - Provide solutions to problems for their immediate team and across multiple teams. - Facilitate knowledge sharing sessions to enhance team capabilities. - Monitor project progress and ensure timely delivery of application features. Professional & Technical Skills: - Must-Have Skills: Proficiency in Apache Spark. - Good-to-Have Skills: Experience with Java Enterprise Edition. - Strong understanding of distributed computing principles. - Experience with data processing frameworks and tools. - Familiarity with cloud platforms and services. Additional Information: - The candidate should have a minimum of 5 years of experience in Apache Spark. - This position is based at our Mumbai office. - A 15 years full-time education is required. Qualification: 15 years full time education.
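Spark's signature pattern for the data processing work named above is pairing records by key and reducing per key. A plain-Python simulation of `reduceByKey` (illustrative; in PySpark this would be `rdd.reduceByKey(max)` executed in parallel across partitions):

```python
def reduce_by_key(pairs, fn):
    """Combine the values of each key with fn, one pass over the pairs --
    a single-machine stand-in for Spark's reduceByKey."""
    out = {}
    for k, v in pairs:
        out[k] = fn(out[k], v) if k in out else v
    return out

# Hypothetical (sensor, reading) pairs; keep the maximum reading per sensor
readings = [("s1", 20.0), ("s2", 31.0), ("s1", 22.0)]
max_per_sensor = reduce_by_key(readings, max)
```

Because the combine function is associative and commutative, Spark can apply it within each partition first and merge partial results across the cluster, which is what makes the operation scale.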
Posted 1 week ago