
3330 Big Data Jobs - Page 43

Set up a job alert
JobPe aggregates job listings for easy access, but you apply directly on the original job portal.

2.0 - 3.0 years

4 - 8 Lacs

Bengaluru

Work from Office

Educational Qualification: Bachelor of Engineering. Service Line: Data & Analytics Unit. Responsibilities: A day in the life of an Infoscion - As part of the Infosys Delivery team, you will work on implementing designs, developing high-quality programs and systems, and partnering with our clients to ensure high-quality deliverables. You will create technical artifacts and be the first point of contact in responding to production issues, conducting any technical analysis needed to arrive at solutions. You will share your learnings from projects through knowledge management initiatives and leverage knowledge from other projects to drive high efficiency and effectiveness. You will be a key contributor to building efficient programs/systems, and if you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you! Additional Responsibilities: Advanced conceptual understanding of at least one programming language; advanced conceptual understanding of one database and one operating system; understanding of software engineering with practice in at least one project; ability to contribute to medium to complex tasks independently; exposure to design principles and ability to understand design specifications independently; ability to run test cases and scenarios as per the plan; ability to accept and respond to production issues and coordinate with stakeholders; good understanding of SDLC; analytical abilities; logical thinking; awareness of the latest technologies and trends. Technical and Professional: Primary skills - Bigdata-Scala, Bigdata-Spark, Technology-Java-Play Framework, Technology-Reactive Programming-Akka. Preferred Skills: Bigdata-Spark, Bigdata-Scala, Technology-Reactive Programming-Akka, Technology-Java-Play Framework

Posted 1 month ago

Apply

3.0 - 5.0 years

11 - 16 Lacs

Bengaluru

Work from Office

Educational Bachelor of Engineering Service Line Data & Analytics Unit Responsibilities Knowledge of design principles and fundamentals of architecture Understanding of performance engineering Knowledge of quality processes and estimation techniques Basic understanding of project domain Ability to translate functional / nonfunctional requirements to systems requirements Ability to design and code complex programs Ability to write test cases and scenarios based on the specifications Good understanding of SDLC and agile methodologies Awareness of latest technologies and trends Logical thinking and problem-solving skills along with an ability to collaborate Technical and Professional : Technology-Cloud Platform-GCP Database-Google BigQuery Preferred Skills: Technology-Cloud Platform-Google Big Data-GCP Technology-Cloud Platform-GCP Core Services-GCP Technology-Cloud Platform-GCP Data Analytics Technology-Cloud Platform-GCP Database
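For orientation, here is a minimal sketch (in Python) of the kind of BigQuery work this posting implies, using the google-cloud-bigquery client; the project, dataset, and table names are hypothetical placeholders rather than anything specified in the listing.

```python
# Minimal sketch: running an analytic query with the google-cloud-bigquery
# client. Project, dataset, and table names are placeholders.
from google.cloud import bigquery

client = bigquery.Client(project="my-gcp-project")  # hypothetical project id

sql = """
    SELECT region, COUNT(*) AS order_count
    FROM `my-gcp-project.sales.orders`   -- hypothetical table
    GROUP BY region
    ORDER BY order_count DESC
"""

for row in client.query(sql).result():   # result() blocks until the job finishes
    print(row["region"], row["order_count"])
```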

Posted 1 month ago

Apply

2.0 - 3.0 years

4 - 8 Lacs

Bengaluru

Work from Office

Educational Qualification: Bachelor of Engineering. Service Line: Data & Analytics Unit. Responsibilities: Bachelor's degree or foreign equivalent required from an accredited institution; three years of progressive experience in the specialty will also be considered in lieu of every year of education. At least 5 years of experience in PySpark and Spark with Hadoop distributed frameworks, handling large amounts of big data using the Spark and Hadoop ecosystems for data pipeline creation, deployment, maintenance and debugging. Experience in scheduling and monitoring jobs and creating tools for automation. At least 4 years of experience with Scala and Python required. Proficient knowledge of SQL with any RDBMS. Strong communication skills (verbal and written) with the ability to communicate across teams, internal and external, at all levels. Ability to work within deadlines and effectively prioritize and execute tasks. Preferred Qualifications: At least 1 year of AWS development experience is preferred; experience in driving automations; DevOps knowledge is an added advantage. Additional Responsibilities: Advanced conceptual understanding of at least one programming language; advanced conceptual understanding of one database and one operating system; understanding of software engineering with practice in at least one project; ability to contribute to medium to complex tasks independently; exposure to design principles and ability to understand design specifications independently; ability to run test cases and scenarios as per the plan; ability to accept and respond to production issues and coordinate with stakeholders; good understanding of SDLC; analytical abilities; logical thinking; awareness of the latest technologies and trends. Technical and Professional: Primary skills - PySpark, Spark and proficiency in SQL. Secondary skills - Scala and Python. Experience: 3+ years. Preferred Skills: Bigdata-Spark, Bigdata-Pyspark, Bigdata-Python
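As a rough illustration of the pipeline work described above, here is a minimal PySpark batch sketch: read raw files, clean them, aggregate, and write partitioned output. The paths and column names are hypothetical placeholders, not taken from the listing.

```python
# Minimal sketch of a PySpark batch step: read raw files, transform, and
# write partitioned output. Paths and columns are hypothetical placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_daily_load").getOrCreate()

raw = spark.read.option("header", True).csv("hdfs:///data/raw/orders/")  # placeholder path

cleaned = (
    raw.dropDuplicates(["order_id"])
       .withColumn("amount", F.col("amount").cast("double"))
       .filter(F.col("amount") > 0)
)

daily = cleaned.groupBy("order_date").agg(F.sum("amount").alias("total_amount"))

# Partition the curated output by date so downstream jobs can prune reads.
daily.write.mode("overwrite").partitionBy("order_date").parquet("hdfs:///data/curated/orders_daily/")
spark.stop()
```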

Posted 1 month ago

Apply

2.0 - 7.0 years

5 - 9 Lacs

Bengaluru

Work from Office

Educational Bachelor of Engineering Service Line Data & Analytics Unit Responsibilities A day in the life of an Infoscion As part of the Infosys consulting team, your primary role would be to get to the heart of customer issues, diagnose problem areas, design innovative solutions and facilitate deployment resulting in client delight. You will develop a proposal by owning parts of the proposal document and by giving inputs in solution design based on areas of expertise. You will plan the activities of configuration, configure the product as per the design, conduct conference room pilots and will assist in resolving any queries related to requirements and solution design You will conduct solution/product demonstrations, POC/Proof of Technology workshops and prepare effort estimates which suit the customer budgetary requirements and are in line with organization’s financial guidelines Actively lead small projects and contribute to unit-level and organizational initiatives with an objective of providing high quality value adding solutions to customers. If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you! Additional Responsibilities: Ability to develop value-creating strategies and models that enable clients to innovate, drive growth and increase their business profitability Good knowledge on software configuration management systems Awareness of latest technologies and Industry trends Logical thinking and problem solving skills along with an ability to collaborate Understanding of the financial processes for various types of projects and the various pricing models available Ability to assess the current processes, identify improvement areas and suggest the technology solutions One or two industry domain knowledge Client Interfacing skills Project and Team management Technical and Professional : Primary skills:Technology-Cloud Platform-Amazon Webservices DevOps,Technology-terraform, kubernetes, python Preferred Skills: Devops Python AWS DevOps Bigdata Technology-Infrastructure-Transformation-Cloud enabled Infrastructure-Terraform Technology-Container Platform-Kubernetes

Posted 1 month ago

Apply

5.0 - 9.0 years

5 - 9 Lacs

Bengaluru

Work from Office

Educational Bachelor of Engineering Service Line Data & Analytics Unit Responsibilities A day in the life of an Infoscion As part of the Infosys consulting team, your primary role would be to get to the heart of customer issues, diagnose problem areas, design innovative solutions and facilitate deployment resulting in client delight. You will develop a proposal by owning parts of the proposal document and by giving inputs in solution design based on areas of expertise. You will plan the activities of configuration, configure the product as per the design, conduct conference room pilots and will assist in resolving any queries related to requirements and solution design You will conduct solution/product demonstrations, POC/Proof of Technology workshops and prepare effort estimates which suit the customer budgetary requirements and are in line with organization’s financial guidelines Actively lead small projects and contribute to unit-level and organizational initiatives with an objective of providing high quality value adding solutions to customers. If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you! Additional Responsibilities: Ability to develop value-creating strategies and models that enable clients to innovate, drive growth and increase their business profitability Good knowledge on software configuration management systems Awareness of latest technologies and Industry trends Logical thinking and problem solving skills along with an ability to collaborate Understanding of the financial processes for various types of projects and the various pricing models available Ability to assess the current processes, identify improvement areas and suggest the technology solutions One or two industry domain knowledge Client Interfacing skills Project and Team management Technical and Professional : Primary skills:Domain-Retail-Retail Supply Chain & Distribution-Food & Beverages,Technology-Cloud Platform-Google Big Data,Technology-Cloud Platform-Google Cloud - Architecture,Technology-Functional Testing-Mainframe testing-Proterm Preferred Skills: Technology-Cloud Platform-Google Cloud - Architecture-GCP Technology-Cloud Platform-Google Big Data-GCP Technology-Cloud Platform-GCP Data Analytics Technology-Cloud Platform-GCP Database

Posted 1 month ago

Apply

3.0 - 8.0 years

5 - 9 Lacs

Bengaluru

Work from Office

Educational Qualification: Bachelor of Engineering. Service Line: Data & Analytics Unit. Responsibilities: Managing large machine learning applications and designing and implementing new frameworks to build scalable and efficient data processing workflows and machine learning pipelines. Build the tightly integrated pipeline that optimizes and compiles models and then orchestrates their execution. Collaborate with CPU, GPU, and Neural Engine hardware backends to push inference performance and efficiency. Work closely with feature teams to facilitate and debug the integration of increasingly sophisticated models, including large language models. Automate data processing and extraction. Engage with the sales team to find opportunities, understand requirements, and translate those requirements into technical solutions. Develop reusable ML models and assets into production. Technical and Professional: Excellent Python programming and debugging skills (refer to the Python JD given below). Proficiency with SQL, relational databases, and non-relational databases. Passion for API design and software architecture. Strong communication skills and the ability to naturally explain difficult technical topics to everyone from data scientists to engineers to business partners. Experience with modern neural-network architectures and deep learning libraries (Keras, TensorFlow, PyTorch). Experience with unsupervised ML algorithms. Experience with time-series models and anomaly detection problems. Experience with modern large language models (ChatGPT/BERT) and their applications. Expertise in performance optimization. Experience or knowledge of public cloud AWS services - S3, Lambda. Familiarity with distributed databases, such as Snowflake and Oracle. Experience with containerization and orchestration technologies, such as Docker and Kubernetes. Preferred Skills: Technology-Big Data - Data Processing-Spark, Technology-Machine Learning-R, Technology-Machine Learning-Python
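Since the posting pairs time-series and anomaly-detection work with Keras/TensorFlow, here is a minimal, hedged sketch of one common approach: train a small Keras autoencoder on normal data and flag windows whose reconstruction error is unusually high. The data is synthetic, and the layer sizes and threshold rule are illustrative choices, not anything from the listing.

```python
# Minimal sketch: reconstruction-error anomaly detection with a small Keras
# autoencoder. Synthetic data; window size, layers, and threshold are illustrative.
import numpy as np
from tensorflow import keras

rng = np.random.default_rng(0)
normal = rng.normal(0.0, 1.0, size=(1000, 20))        # "normal" training windows

model = keras.Sequential([
    keras.Input(shape=(20,)),
    keras.layers.Dense(8, activation="relu"),          # bottleneck encoder
    keras.layers.Dense(20, activation="linear"),       # decoder
])
model.compile(optimizer="adam", loss="mse")
model.fit(normal, normal, epochs=10, batch_size=32, verbose=0)

def anomaly_scores(batch):
    recon = model.predict(batch, verbose=0)
    return np.mean((batch - recon) ** 2, axis=1)       # per-window reconstruction error

threshold = np.percentile(anomaly_scores(normal), 99)  # e.g. 99th percentile of training error
test = np.vstack([rng.normal(0, 1, (5, 20)), rng.normal(6, 1, (5, 20))])
print(anomaly_scores(test) > threshold)                # shifted windows should be flagged
```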

Posted 1 month ago

Apply

8.0 - 13.0 years

10 - 14 Lacs

Bengaluru

Work from Office

Educational Qualification: Bachelor of Engineering. Service Line: Strategic Technology Group. Responsibilities: Power Programmer is an important initiative within Global Delivery to develop a team of Full Stack Developers who will be working on complex engineering projects, platforms and marketplaces for our clients using emerging technologies. They will be ahead of the technology curve and will be constantly enabled and trained to be polyglots. They are go-getters with a drive to solve end-customer challenges and will spend most of their time designing and coding, with end-to-end contribution to technology-oriented development projects, providing solutions with minimum system requirements and in Agile mode, and collaborating with Power Programmers, the open source community and tech user groups, including custom development of new platforms and solutions. Opportunities: work on large-scale digital platforms and marketplaces; work on complex engineering projects using cloud-native architecture; work with innovative Fortune 500 companies in cutting-edge technologies; co-create and develop new products and platforms for our clients; contribute to open source and continuously upskill in the latest technology areas; incubate tech user groups. Technical and Professional: Big Data - Spark, Scala, Hive, Kafka. Preferred Skills: Technology-Big Data-Hbase, Technology-Big Data-Sqoop, Technology-Java-Apache-Scala, Technology-Functional Programming-Scala, Technology-Big Data - Data Processing-Map Reduce, Technology-Big Data - Data Processing-Spark
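The skill list above centres on Spark, Hive, and Kafka. As one hedged illustration, below is a minimal Structured Streaming sketch that consumes a Kafka topic and lands the parsed events as Parquet. It is written in PySpark rather than Scala for consistency with the other examples on this page; the broker, topic, schema, and paths are placeholders, and the spark-sql-kafka connector is assumed to be available on the cluster.

```python
# Minimal sketch: read a Kafka topic with Spark Structured Streaming and append
# parsed events to Parquet. Broker, topic, schema, and paths are hypothetical.
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.types import StructType, StructField, StringType, DoubleType

spark = SparkSession.builder.appName("clickstream_ingest").getOrCreate()

schema = StructType([
    StructField("user_id", StringType()),
    StructField("event", StringType()),
    StructField("value", DoubleType()),
])

events = (
    spark.readStream.format("kafka")
         .option("kafka.bootstrap.servers", "broker:9092")   # placeholder broker
         .option("subscribe", "clickstream")                  # placeholder topic
         .load()
         .select(F.from_json(F.col("value").cast("string"), schema).alias("e"))
         .select("e.*")
)

query = (
    events.writeStream.format("parquet")
          .option("path", "hdfs:///data/clickstream/")        # placeholder sink
          .option("checkpointLocation", "hdfs:///chk/clickstream/")
          .start()
)
query.awaitTermination()
```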

Posted 1 month ago

Apply

4.0 - 8.0 years

8 - 12 Lacs

Bengaluru

Work from Office

Educational Qualification: Bachelor of Engineering. Service Line: Strategic Technology Group. Responsibilities: Power Programmer is an important initiative within Global Delivery to develop a team of Full Stack Developers who will be working on complex engineering projects, platforms and marketplaces for our clients using emerging technologies. They will be ahead of the technology curve and will be constantly enabled and trained to be polyglots. They are go-getters with a drive to solve end-customer challenges and will spend most of their time designing and coding, with end-to-end contribution to technology-oriented development projects, providing solutions with minimum system requirements and in Agile mode, and collaborating with Power Programmers, the open source community and tech user groups, including custom development of new platforms and solutions. Opportunities: work on large-scale digital platforms and marketplaces; work on complex engineering projects using cloud-native architecture; work with innovative Fortune 500 companies in cutting-edge technologies; co-create and develop new products and platforms for our clients; contribute to open source and continuously upskill in the latest technology areas; incubate tech user groups. Technical and Professional: Python + Lambda + AWS + PySpark. Preferred Skills: Technology-Big Data - Data Processing-Spark, Technology-Machine Learning-Python, Technology-Infrastructure Security-Reverse Malware Engineering-PANDA
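Given the Python + Lambda + AWS + PySpark stack named above, here is a minimal, hedged sketch of an S3-triggered AWS Lambda handler in Python; the output bucket name is a hypothetical placeholder and error handling is omitted for brevity.

```python
# Minimal sketch of an S3-triggered AWS Lambda handler: read the uploaded
# object, do a trivial transform, write a result object. Output bucket is a
# placeholder; no retries or error handling shown.
import json
import boto3

s3 = boto3.client("s3")
OUTPUT_BUCKET = "my-curated-bucket"  # hypothetical

def handler(event, context):
    records = event.get("Records", [])
    for record in records:
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]

        body = s3.get_object(Bucket=bucket, Key=key)["Body"].read().decode("utf-8")
        lines = [line.upper() for line in body.splitlines() if line.strip()]  # trivial transform

        s3.put_object(
            Bucket=OUTPUT_BUCKET,
            Key=f"processed/{key}",
            Body="\n".join(lines).encode("utf-8"),
        )
    return {"statusCode": 200, "body": json.dumps({"processed": len(records)})}
```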

Posted 1 month ago

Apply

5.0 - 8.0 years

8 - 13 Lacs

Bengaluru

Work from Office

Educational Qualification: Bachelor of Engineering. Service Line: Strategic Technology Group. Responsibilities: Power Programmer is an important initiative within Global Delivery to develop a team of Full Stack Developers who will be working on complex engineering projects, platforms and marketplaces for our clients using emerging technologies. They will be ahead of the technology curve and will be constantly enabled and trained to be polyglots. They are go-getters with a drive to solve end-customer challenges and will spend most of their time designing and coding, with end-to-end contribution to technology-oriented development projects, providing solutions with minimum system requirements and in Agile mode, and collaborating with Power Programmers, the open source community and tech user groups, including custom development of new platforms and solutions. Opportunities: work on large-scale digital platforms and marketplaces; work on complex engineering projects using cloud-native architecture; work with innovative Fortune 500 companies in cutting-edge technologies; co-create and develop new products and platforms for our clients; contribute to open source and continuously upskill in the latest technology areas; incubate tech user groups. Technical and Professional: Big Data - Spark, Scala, Hive, Kafka. Preferred Skills: Technology-Big Data-Big Data - ALL, Technology-Big Data - Hadoop-Hadoop, Technology-Big Data-Hbase, Technology-Big Data-Sqoop, Technology-Java-Apache-Scala, Technology-Functional Programming-Scala, Technology-IOT Platform-Custom IOT Platform - Big Data Processing Analytics

Posted 1 month ago

Apply

11.0 - 15.0 years

16 - 20 Lacs

Bengaluru

Work from Office

Educational Qualification: Bachelor of Engineering. Service Line: Strategic Technology Group. Responsibilities: Power Programmer is an important initiative within Global Delivery to develop a team of Full Stack Developers who will be working on complex engineering projects, platforms and marketplaces for our clients using emerging technologies. They will be ahead of the technology curve and will be constantly enabled and trained to be polyglots. They are go-getters with a drive to solve end-customer challenges and will spend most of their time designing and coding, with end-to-end contribution to technology-oriented development projects, providing solutions with minimum system requirements and in Agile mode, and collaborating with Power Programmers, the open source community and tech user groups, including custom development of new platforms and solutions. Opportunities: work on large-scale digital platforms and marketplaces; work on complex engineering projects using cloud-native architecture; work with innovative Fortune 500 companies in cutting-edge technologies; co-create and develop new products and platforms for our clients; contribute to open source and continuously upskill in the latest technology areas; incubate tech user groups. Technical and Professional: Cloud Architecture & Design; Cloud Optimization & Automation; Innovation & Thought Leadership; extensive experience with AWS, Azure, or GCP cloud platforms; deep understanding of cloud computing concepts, including IaaS, PaaS, and SaaS; strong experience with infrastructure as code (IaC) and DevOps practices; experience with containerization and orchestration (Docker, Kubernetes); strong knowledge of cloud security best practices and compliance standards; industry certifications. Preferred Skills: Technology-Cloud Platform-Cloud Platform - ALL, Technology-Container Platform-Docker, Technology-Container Platform-Kubernetes, Technology-Cloud Platform-Google Cloud - Architecture

Posted 1 month ago

Apply

2.0 - 3.0 years

4 - 8 Lacs

Bengaluru

Work from Office

Educational Bachelor of Engineering Service Line Data & Analytics Unit Responsibilities A day in the life of an Infoscion As part of the Infosys consulting team, your primary role would be to actively aid the consulting team in different phases of the project including problem definition, effort estimation, diagnosis, solution generation and design and deployment You will explore the alternatives to the recommended solutions based on research that includes literature surveys, information available in public domains, vendor evaluation information, etc. and build POCs You will create requirement specifications from the business needs, define the to-be-processes and detailed functional designs based on requirements. You will support configuring solution requirements on the products; understand if any issues, diagnose the root-cause of such issues, seek clarifications, and then identify and shortlist solution alternatives You will also contribute to unit-level and organizational initiatives with an objective of providing high quality value adding solutions to customers. If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you! Additional Responsibilities: Ability to work with clients to identify business challenges and contribute to client deliverables by refining, analyzing, and structuring relevant data Awareness of latest technologies and trends Logical thinking and problem solving skills along with an ability to collaborate Ability to assess the current processes, identify improvement areas and suggest the technology solutions One or two industry domain knowledge Technical and Professional : Primary skills:Bigdata-Scala,Bigdata-Spark,Technology-Java-Play Framework,Technology-Reactive Programming-Akka Preferred Skills: Bigdata-Spark Bigdata-Scala Technology-Reactive Programming-Akka Technology-Java-Play Framework

Posted 1 month ago

Apply

3.0 - 5.0 years

4 - 8 Lacs

Bengaluru

Work from Office

Educational Qualification: Bachelor of Engineering. Service Line: Data & Analytics Unit. Responsibilities: A day in the life of an Infoscion - As part of the Infosys delivery team, your primary role would be to interface with the client for quality assurance, issue resolution and ensuring high customer satisfaction. You will understand requirements, create and review designs, validate the architecture and ensure high levels of service offerings to clients in the technology domain. You will participate in project estimation, provide inputs for solution delivery, conduct technical risk planning, perform code reviews and unit test plan reviews. You will lead and guide your teams towards developing optimized, high-quality code deliverables, continual knowledge management and adherence to the organizational guidelines and processes. You would be a key contributor to building efficient programs/systems, and if you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you! Technical and Professional: Technology-Functional Programming-Scala, Technology-Java-Apache-Scala. Preferred Skills: Technology-Java-Apache-Scala, Technology-Functional Programming-Scala

Posted 1 month ago

Apply

2.0 - 3.0 years

0 - 0 Lacs

Mumbai, Mumbai Suburban, Mumbai (All Areas)

Work from Office

JOB DESCRIPTION. JOB TITLE: Software Engineer - QA. JOB TYPE: MAIN DUTIES/RESPONSIBILITIES: Go through the software requirements and get clarifications on one's doubts. Become familiar with the software under test. Understand the master test plan and/or the project plan. Create or assist in creating the test plan. Generate test cases based on the requirements and other documents. Create or assist in creating assigned test automation. Test software releases by executing assigned tests (manual and/or automated). Report defects in the defect tracking tool and assign them to the stakeholders. Report test results to the stakeholders. Re-test resolved defects. Update test cases based on the discovered defects. Update test automation based on the updated test cases. Work closely with the Engineering teams and come up with test scenarios for new features. SKILLS & EXPERIENCE. Qualifications: Candidates should have a degree in computer science or IT, or be graduates from similar degree disciplines, including relevant experience. Hands-on experience with automation tools like Python and Pytest is a must. Hands-on experience with automation tools like Cypress, Selenium, JMeter, etc. would be an added advantage. Experience: Previous experience of 2-3 years in a similar role with Python. Must-have skills: Understanding of back-end processes and SQL queries. Understanding of APIs and how they function (should be able to use any API testing tool). Good debugging skills, especially involving distributed systems, preferably on Linux. Good knowledge of test methodologies, including creation of test cases and test plans. Ability to build and maintain automated testing frameworks and automated test suites in Python (pytest). Basic knowledge of public clouds (AWS/Azure). Familiarity with DevOps technologies such as Docker, Kubernetes, Jenkins. Excellent communication and collaboration skills. Comfortable working in fast-paced environments. Basic knowledge of the Big Data ecosystem is an added advantage. Knowledge of cURL commands.
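To make the pytest requirement concrete, here is a minimal, hedged sketch of an API test module using pytest with the requests library; the base URL, endpoints, and expected response fields are hypothetical placeholders, not part of the listing.

```python
# Minimal sketch: pytest API checks using requests. Base URL, endpoints, and
# the expected response contract are hypothetical placeholders.
import pytest
import requests

BASE_URL = "https://api.example.com"   # placeholder service under test


@pytest.fixture(scope="session")
def session():
    s = requests.Session()
    s.headers.update({"Accept": "application/json"})
    yield s
    s.close()


def test_health_endpoint_returns_ok(session):
    resp = session.get(f"{BASE_URL}/health", timeout=5)
    assert resp.status_code == 200
    assert resp.json().get("status") == "ok"   # assumed response contract


@pytest.mark.parametrize("job_id", ["1001", "1002"])
def test_job_lookup_returns_expected_fields(session, job_id):
    resp = session.get(f"{BASE_URL}/jobs/{job_id}", timeout=5)
    assert resp.status_code == 200
    assert {"id", "title", "location"} <= resp.json().keys()   # assumed schema
```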

Posted 1 month ago

Apply

6.0 - 11.0 years

20 - 35 Lacs

Hyderabad, Gurugram, Bengaluru

Hybrid

Greetings from BCforward INDIA TECHNOLOGIES PRIVATE LIMITED. Contract To Hire (C2H) role. Location: Gurgaon/Bengaluru/Hyderabad. Payroll: BCforward. Work Mode: Hybrid. JD Preferred Skills: 6+ years of relevant experience in Big Data; ETL - Big Data / Data Warehousing; GCP; Java. Skills: Big Data; ETL - Big Data / Data Warehousing; GCP; Java. We are looking for a highly skilled engineer with solid experience building Big Data, GCP cloud-based, real-time data pipelines and REST APIs with Java frameworks. The engineer will play a crucial role in designing, implementing, and optimizing data solutions to support our organization's data-driven initiatives. This role requires expertise in data engineering, strong problem-solving abilities, and a collaborative mindset to work effectively with various stakeholders. This role will be focused on the delivery of innovative solutions to satisfy the needs of our business. As an agile team we work closely with our business partners to understand what they require, and we strive to continuously improve as a team. Technical Skills: 1. Core Data Engineering Skills: Proficiency in using GCP's big data tools, such as BigQuery for data warehousing and SQL analytics; Dataproc for running Spark and Hadoop clusters; GCP Dataflow for stream and batch data processing (high-level idea); and GCP Pub/Sub for real-time messaging and event ingestion (high-level idea). Expertise in building automated, scalable, and reliable pipelines using custom Python/Scala solutions or Cloud Data Functions. 2. Programming and Scripting: Strong coding skills in SQL and Java. Familiarity with APIs and SDKs for GCP services to build custom data solutions. 3. Cloud Infrastructure: Understanding of GCP services such as Cloud Storage, Compute Engine, and Cloud Functions. Familiarity with Kubernetes (GKE) and containerization for deploying data pipelines (optional but good to have). 4. DevOps and CI/CD: Experience setting up CI/CD pipelines using Cloud Build, GitHub Actions, or other tools. Monitoring and logging tools like Cloud Monitoring and Cloud Logging for production workflows. 5. Backend Development (Spring Boot & Java): Design and develop RESTful APIs and microservices using Spring Boot. Implement business logic, security, authentication (JWT/OAuth), and database operations. Work with relational databases (MySQL, PostgreSQL, MongoDB, Cloud SQL). Optimize backend performance, scalability, and maintainability. Implement unit testing and integration testing. Soft Skills: 1. Innovation and Problem-Solving: Ability to think creatively and design innovative solutions for complex data challenges. Experience in prototyping and experimenting with cutting-edge GCP tools or third-party integrations. Strong analytical mindset to transform raw data into actionable insights. 2. Collaboration: Teamwork - ability to collaborate effectively with data analysts and business stakeholders. Communication - strong verbal and written communication skills to explain technical concepts to non-technical audiences. 3. Adaptability and Continuous Learning: Open to exploring new GCP features and rapidly adapting to changes in cloud technology. Please share your updated resume, PAN card soft copy, passport-size photo and UAN history. Interested applicants can share an updated resume to g.sreekanth@bcforward.com. Note: Looking for immediate to 15-day joiners at most. All the best.
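As a small, hedged illustration of the Pub/Sub piece of this stack, the sketch below publishes a JSON event with the google-cloud-pubsub Python client; the project and topic names are placeholders, not values from the listing.

```python
# Minimal sketch: publish a JSON event to a GCP Pub/Sub topic with the
# google-cloud-pubsub client. Project and topic names are placeholders.
import json
from google.cloud import pubsub_v1

publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path("my-gcp-project", "order-events")  # hypothetical

event = {"order_id": "A-1001", "status": "CREATED"}
future = publisher.publish(
    topic_path,
    json.dumps(event).encode("utf-8"),
    source="order-service",                 # extra kwargs become message attributes
)
print("published message id:", future.result(timeout=30))  # blocks until the broker acks
```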

Posted 1 month ago

Apply

5.0 - 10.0 years

0 Lacs

Pune

Remote

Data Engineer. Location: Remote (any location within India). Skills: 5 years of data engineering; Big Data (Hadoop, HDFS, Hive, Spark SQL); Unix/Linux shell scripting; ETL experience, preferably Ab Initio.

Posted 1 month ago

Apply

0.0 - 3.0 years

6 - 8 Lacs

Noida

Work from Office

Engineer with 3+ years of experience working in a GCP environment and its relevant tools/services (BigQuery, Dataproc, Dataflow, Cloud Storage, Terraform, Tekton, Cloud Run, Cloud Scheduler, Astronomer/Airflow, Pub/Sub, Kafka, Cloud Spanner streaming, etc.). 1 to 2+ years of strong experience in Python development (object-oriented/functional programming, Pandas, PySpark, etc.). 1 to 2+ years of strong experience in SQL (CTEs, window functions, aggregate functions, etc.).
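To illustrate the SQL expectations (CTEs, window functions, aggregates) alongside PySpark, here is a minimal sketch that runs such a query through spark.sql; the orders table and its columns are invented for the example.

```python
# Minimal sketch: a CTE plus window functions executed through Spark SQL.
# The orders table and its columns are invented for illustration.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("sql_window_demo").getOrCreate()

spark.createDataFrame(
    [("c1", "2024-01-01", 120.0), ("c1", "2024-01-05", 80.0), ("c2", "2024-01-02", 200.0)],
    ["customer_id", "order_date", "amount"],
).createOrReplaceTempView("orders")

result = spark.sql("""
    WITH customer_totals AS (
        SELECT customer_id, order_date, amount,
               SUM(amount) OVER (PARTITION BY customer_id) AS customer_total
        FROM orders
    )
    SELECT customer_id, order_date, amount,
           ROW_NUMBER() OVER (PARTITION BY customer_id ORDER BY amount DESC) AS rank_in_customer,
           customer_total
    FROM customer_totals
""")
result.show()
```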

Posted 1 month ago

Apply

5.0 - 10.0 years

3 - 5 Lacs

Bengaluru, Delhi / NCR, Mumbai (All Areas)

Work from Office

Azure Databricks Developer. Job Title: Azure Databricks Developer. Experience: 5+ years. Location: PAN India (remote/hybrid as per project requirement). Employment Type: Full-time. Job Summary: We are hiring an experienced Azure Databricks Developer to join our dynamic data engineering team. The ideal candidate will have strong expertise in building and optimizing big data solutions using Azure Databricks, Spark, and other Azure data services. Key Responsibilities: Design, develop, and maintain scalable data pipelines using Azure Databricks and Apache Spark. Integrate and manage large datasets using Azure Data Lake, Azure Data Factory, and other Azure services. Implement Delta Lake for efficient data versioning and performance optimization. Collaborate with cross-functional teams including data scientists and BI developers. Ensure best practices for data security, governance, and compliance. Monitor performance and troubleshoot Spark clusters and data pipelines. Skills & Requirements: Minimum 5 years of experience in data engineering, with at least 2+ years in Azure Databricks. Proficiency in Apache Spark (PySpark/Scala). Strong hands-on experience with Azure services: ADF, ADLS, Synapse Analytics. Expertise in building and managing ETL/ELT pipelines. Strong SQL skills and experience with performance tuning. Experience with CI/CD pipelines and Azure DevOps is a plus. Good understanding of data modeling, partitioning, and data lake architecture.
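As a hedged illustration of the Delta Lake point above, the sketch below writes a Delta table twice and then reads an earlier snapshot via time travel; the paths and columns are placeholders, and it assumes a Databricks or Spark runtime with Delta Lake available (it is by default on Databricks).

```python
# Minimal sketch: write a Delta table, overwrite it, then read an older
# snapshot via time travel. Paths and columns are placeholders; assumes a
# Databricks/Spark runtime with Delta Lake on the classpath.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("delta_demo").getOrCreate()
path = "/mnt/datalake/curated/customers"   # hypothetical ADLS mount

spark.createDataFrame(
    [("c1", "Asha"), ("c2", "Ravi")], ["customer_id", "name"]
).write.format("delta").mode("overwrite").save(path)          # version 0

spark.createDataFrame(
    [("c1", "Asha"), ("c2", "Ravi K"), ("c3", "Meera")], ["customer_id", "name"]
).write.format("delta").mode("overwrite").save(path)          # version 1

latest = spark.read.format("delta").load(path)
v0 = spark.read.format("delta").option("versionAsOf", 0).load(path)   # time travel
print(latest.count(), v0.count())   # 3 rows now, 2 rows in version 0
```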

Posted 1 month ago

Apply

0.0 - 1.0 years

8 - 10 Lacs

Hyderabad

Work from Office

Google Cloud Platform: GCS, Dataproc, BigQuery, Dataflow. Programming languages: Java, plus scripting languages like Python, Shell Script, and SQL. 5+ years of experience in IT application delivery with proven experience in agile development methodologies. 1 to 2 years of experience in Google Cloud Platform (GCS, Dataproc, BigQuery, Composer, and data processing with Dataflow).

Posted 1 month ago

Apply

4.0 - 8.0 years

22 - 25 Lacs

Hyderabad, Chennai, Bengaluru

Work from Office

Engineer with 3+ years of experience working in a GCP environment and its relevant tools/services (BigQuery, Dataproc, Dataflow, Cloud Storage, Terraform, Tekton, Cloud Run, Cloud Scheduler, Astronomer/Airflow, Pub/Sub, Kafka, Cloud Spanner streaming, etc.). 1 to 2+ years of strong experience in Python development (object-oriented/functional programming, Pandas, PySpark, etc.). 1 to 2+ years of strong experience in SQL (CTEs, window functions, aggregate functions, etc.).

Posted 1 month ago

Apply

7.0 - 12.0 years

30 - 45 Lacs

Bengaluru

Work from Office

Experience required: 7+ years with data governance and the Informatica tool. Key Responsibilities: Data Governance Framework Development: Develop, implement, and maintain data governance frameworks, policies, and standards to ensure high-quality, consistent, and secure data across the organization. Collaborate with business units and stakeholders to define and enforce data governance policies, ensuring alignment with business goals and regulatory requirements. Data Quality Management: Define and enforce data quality standards, monitoring key data quality metrics. Identify, analyze, and resolve data quality issues across various data sources and platforms. Work with cross-functional teams to implement data quality improvement initiatives. Data Lineage & Metadata Management: Implement and maintain data lineage and metadata management solutions to ensure visibility and traceability of data throughout its lifecycle. Work with data architects and engineers to establish and document data flows, transformations, and dependencies. Data Security & Compliance: Ensure that data governance practices comply with relevant regulatory requirements (e.g., GDPR, CCPA, HIPAA). Implement data security controls to protect sensitive data and manage access to sensitive information. Stakeholder Collaboration: Partner with data architects, data engineers, data scientists, and business analysts to ensure alignment between technical and business needs for data governance. Provide training and support for teams on data governance policies, best practices, and tools. Data Governance Tools & Technologies: Lead the implementation and optimization of data governance tools and platforms. Continuously evaluate emerging tools and technologies to improve data governance processes. Reporting & Documentation: Develop and maintain comprehensive data governance documentation and reports. Provide regular updates to senior management on the status of data governance initiatives, risks, and areas of improvement. Requirements: Experience: 7+ years of experience in data governance, data management, or related fields. Proven track record in implementing data governance frameworks and policies at an enterprise level. In-depth knowledge of data governance concepts, including data quality, data lineage, metadata management, and data security. Technical Skills: Experience with data governance tools such as Collibra, Informatica, Alation, or similar. Strong understanding of databases, data warehousing, and big data platforms (e.g., Hadoop, Spark). Familiarity with data integration, ETL processes, and data modeling. Proficiency in SQL and other scripting languages (e.g., Python, Shell). Regulatory Knowledge: Solid understanding of data privacy and compliance regulations (GDPR, CCPA, HIPAA, etc.). Ability to assess and mitigate compliance risks related to data handling. Soft Skills: Excellent communication and interpersonal skills. Strong problem-solving skills and the ability to collaborate across teams. Ability to manage multiple projects and deadlines in a fast-paced environment.

Posted 1 month ago

Apply

7.0 - 12.0 years

25 - 27 Lacs

Hyderabad

Work from Office

Engineer with 3+ years of experience working in a GCP environment and its relevant tools/services (BigQuery, Dataproc, Dataflow, Cloud Storage, Terraform, Tekton, Cloud Run, Cloud Scheduler, Astronomer/Airflow, Pub/Sub, Kafka, Cloud Spanner streaming, etc.). 1 to 2+ years of strong experience in Python development (object-oriented/functional programming, Pandas, PySpark, etc.). 1 to 2+ years of strong experience in SQL (CTEs, window functions, aggregate functions, etc.).

Posted 1 month ago

Apply

5.0 - 9.0 years

7 - 11 Lacs

Mumbai, Delhi / NCR, Bengaluru

Work from Office

5-10 years of experience in database development or a related field. Proven experience with database design, development, and management. Experience working with large-scale databases and complex data environments. Experience with data modelling and database design. Knowledge of database performance tuning and optimization. Architect, develop and maintain tables, views, procedures, functions and packages in the database - MUST HAVE. Performing complex relational database queries using SQL (AWS RDS for PostgreSQL) and Oracle PL/SQL - MUST HAVE. Familiarity with ETL processes and tools (AWS Batch, AWS Glue, etc.) - MUST HAVE. Familiarity with CI/CD pipelines, Jenkins deployment, Git repository - MUST HAVE. Perform performance tuning. Proactively monitor the database systems to ensure secure services with minimum downtime and improve maintenance of the databases, including rollouts, patching, and upgrades. Experience with Aurora's scaling and replication capabilities - MUST HAVE. Proficiency with AWS CloudWatch for monitoring database performance and setting up alerts. Experience with performance tuning and optimization in AWS environments - MUST HAVE. Experience using Confluence for documentation and collaboration. Proficiency in using SmartDraw for creating database diagrams, flowcharts, and other visual representations of data models and processes - MUST HAVE. Proficiency in using libraries such as Pandas and NumPy for data manipulation, analysis, and transformation. Experience with libraries like SQLAlchemy and PyODBC for connecting to and interacting with various databases - MUST HAVE. Python programming language - MUST HAVE. Agile/Scrum, communication (spoken English, clarity of thought). Big Data, data mining, machine learning and natural language processing.
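To illustrate the pandas + SQLAlchemy requirement, here is a minimal sketch that pulls a PostgreSQL query into a DataFrame and derives a column; the connection string, table, and columns are hypothetical placeholders.

```python
# Minimal sketch: query PostgreSQL (e.g. AWS RDS) through SQLAlchemy into a
# pandas DataFrame and add a derived column. DSN, table, and columns are
# hypothetical placeholders.
import pandas as pd
from sqlalchemy import create_engine, text

engine = create_engine("postgresql+psycopg2://user:pass@my-rds-host:5432/appdb")  # placeholder DSN

query = text("""
    SELECT order_id, customer_id, amount, created_at
    FROM orders
    WHERE created_at >= :since
""")

with engine.connect() as conn:
    df = pd.read_sql(query, conn, params={"since": "2024-01-01"})

df["amount_with_tax"] = (df["amount"] * 1.18).round(2)   # illustrative derived column
summary = df.groupby("customer_id", as_index=False)["amount_with_tax"].sum()
print(summary.head())
```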

Posted 1 month ago

Apply

2.0 - 4.0 years

22 - 27 Lacs

Bengaluru

Work from Office

Overview: Annalect is currently seeking a senior frontend engineer to join our technology team remotely. In this role, you will build Annalect products which sit atop our Big Data infrastructure and utilize our componentized design system. In 2019 we adopted Web Components to help us build web applications in a modular and reusable way. We're looking for people who have a shared passion for data and a desire to build cool, maintainable and high-quality applications to use this data. In this role you will participate in implementing our technical architecture, develop software products, and collaborate with frontend engineers from other tracks. Responsibilities: Designing, building, testing and deploying scalable, reusable and maintainable applications that handle large amounts of data. Write new UI components across various applications using Web Components. Perform code reviews and provide leadership and guidance to junior engineers. Possess a strong ability to learn and teach new technologies. Qualifications: ~5 years of solid coding experience with ES6+ JavaScript. Experience with the latest web standards, including HTML5 and CSS3. Experience building applications using modern tooling.

Posted 1 month ago

Apply

2.0 - 7.0 years

4 - 9 Lacs

Noida

Work from Office

With 80,000 customers across 150 countries, UKG is the largest U.S.-based private software company in the world. And we're only getting started. Ready to bring your bold ideas and collaborative mindset to an organization that still has so much more to build and achieve? Read on. Here, we know that you're more than your work. That's why our benefits help you thrive personally and professionally, from wellness programs and tuition reimbursement to U Choose, a customizable expense reimbursement program that can be used for more than 200+ needs that best suit you and your family, from student loan repayment, to childcare, to pet insurance. Our inclusive culture, active and engaged employee resource groups, and caring leaders value every voice and support you in doing the best work of your career. If you're passionate about our purpose, people, then we can't wait to support whatever gives you purpose. We're united by purpose, inspired by you. We are looking for a talented and experienced Sr Software Engineer to join our dynamic team. This role will provide you with the opportunity to work on cutting-edge SaaS technologies and impactful projects that are used by enterprises and users worldwide. As a Software Engineer, you will be involved in the design, development, testing, deployment, and maintenance of software solutions. You will work in a collaborative environment, contributing to the technical foundation behind our flagship products and services. We are seeking engineers with diverse specialties and skills to join our dynamic team to innovate and solve complex challenges. Our team is looking for strong talent with expertise in the following areas: Front End UI Engineer (UI/UX design principles, responsive design, JavaScript frameworks); DevOps Engineer (CI/CD pipelines, IaC proficiency, containerization/orchestration, cloud platforms); Back End Engineer (API development, database management, security practices, message queuing); AI/ML Engineer (machine learning frameworks, data processing, algorithm development, big data technologies, domain knowledge). Responsibilities: Software Development: Write clean, maintainable, and efficient code for various software applications and systems. Design and Architecture: Participate in design reviews with peers and stakeholders. Code Review: Review code developed by other developers, providing feedback and adhering to industry-standard best practices like coding guidelines. Testing: Build testable software, define tests, participate in the testing process, automate tests using tools (e.g., JUnit, Selenium) and design patterns, leveraging the test automation pyramid as the guide. Debugging and Troubleshooting: Triage defects or customer-reported issues, debug and resolve them in a timely and efficient manner. Service Health and Quality: Contribute to the health and quality of services and incidents, promptly identifying and escalating issues. Collaborate with the team in utilizing service health indicators and telemetry for action. Assist in conducting root cause analysis and implementing measures to prevent future recurrences. DevOps Model: Understanding of working in a DevOps model. Begin to take ownership of working with product management on requirements to design, develop, test, deploy and maintain the software in production. Documentation: Properly document new features, enhancements or fixes to the product, and also contribute to training materials. Basic Qualifications: Bachelor's degree in Computer Science, Engineering, or a related technical field, or equivalent practical experience.
2+ years of professional software development experience. Proficiency in one or more programming languages such as C, C++, C#, .NET, Python, Java, or JavaScript. Experience with software development practices and design patterns. Familiarity with version control systems like Git/GitHub and bug/work tracking systems like JIRA. Basic understanding of cloud technologies and DevOps principles. Strong analytical and problem-solving skills, with a proven track record of building and shipping successful software products and services. Preferred Qualifications: Experience with cloud platforms like Azure, AWS, or GCP. Experience with test automation frameworks and tools. Knowledge of agile development methodologies. Commitment to continuous learning and professional development. Good communication and interpersonal skills, with the ability to work effectively in a collaborative team environment. Where we're going: UKG is on the cusp of something truly special. Worldwide, we already hold the #1 market share position for workforce management and the #2 position for human capital management. Tens of millions of frontline workers start and end their days with our software, with billions of shifts managed annually through UKG solutions today. Yet it's our AI-powered product portfolio designed to support customers of all sizes, industries, and geographies that will propel us into an even brighter tomorrow! UKGCareers@ukg.com

Posted 1 month ago

Apply

2.0 - 7.0 years

4 - 9 Lacs

Pune

Work from Office

With 80,000 customers across 150 countries, UKG is the largest U.S.-based private software company in the world. And we're only getting started. Ready to bring your bold ideas and collaborative mindset to an organization that still has so much more to build and achieve? Read on. Here, we know that you're more than your work. That's why our benefits help you thrive personally and professionally, from wellness programs and tuition reimbursement to U Choose, a customizable expense reimbursement program that can be used for more than 200+ needs that best suit you and your family, from student loan repayment, to childcare, to pet insurance. Our inclusive culture, active and engaged employee resource groups, and caring leaders value every voice and support you in doing the best work of your career. If you're passionate about our purpose, people, then we can't wait to support whatever gives you purpose. We're united by purpose, inspired by you. We are looking for a talented and experienced Senior Software Engineer to join our dynamic team. This role will provide you with the opportunity to work on cutting-edge SaaS technologies and impactful projects that are used by enterprises and users worldwide. As a Senior Software Engineer, you will be involved in the design, development, testing, deployment, and maintenance of software solutions. You will work in a collaborative environment, contributing to the technical foundation behind our flagship products and services. We are seeking engineers with diverse specialties and skills to join our dynamic team to innovate and solve complex challenges. Our team is looking for strong talent with expertise in the following areas: Front End UI Engineer (UI/UX design principles, responsive design, JavaScript frameworks); DevOps Engineer (CI/CD pipelines, IaC proficiency, containerization/orchestration, cloud platforms); Back End Engineer (API development, database management, security practices, message queuing); AI/ML Engineer (machine learning frameworks, data processing, algorithm development, big data technologies, domain knowledge). Responsibilities: Software Development: Write clean, maintainable, and efficient code for various software applications and systems. Design and Architecture: Participate in design reviews with peers and stakeholders. Code Review: Review code developed by other developers, providing feedback and adhering to industry-standard best practices like coding guidelines. Testing: Build testable software, define tests, participate in the testing process, automate tests using tools (e.g., JUnit, Selenium) and design patterns, leveraging the test automation pyramid as the guide. Debugging and Troubleshooting: Triage defects or customer-reported issues, debug and resolve them in a timely and efficient manner. Service Health and Quality: Contribute to the health and quality of services and incidents, promptly identifying and escalating issues. Collaborate with the team in utilizing service health indicators and telemetry for action. Assist in conducting root cause analysis and implementing measures to prevent future recurrences. DevOps Model: Understanding of working in a DevOps model. Begin to take ownership of working with product management on requirements to design, develop, test, deploy and maintain the software in production. Documentation: Properly document new features, enhancements or fixes to the product, and also contribute to training materials.
Basic Qualifications: Bachelor's degree in Computer Science, Engineering, or a related technical field, or equivalent practical experience. 2+ years of professional software development experience. Proficiency in one or more programming languages such as C, C++, C#, .NET, Python, Java, or JavaScript. Experience with software development practices and design patterns. Familiarity with version control systems like Git/GitHub and bug/work tracking systems like JIRA. Basic understanding of cloud technologies and DevOps principles. Strong analytical and problem-solving skills, with a proven track record of building and shipping successful software products and services. Preferred Qualifications: Experience with cloud platforms like Azure, AWS, or GCP. Experience with test automation frameworks and tools. Knowledge of agile development methodologies. Commitment to continuous learning and professional development. Good communication and interpersonal skills, with the ability to work effectively in a collaborative team environment. Where we're going: UKG is on the cusp of something truly special. Worldwide, we already hold the #1 market share position for workforce management and the #2 position for human capital management. Tens of millions of frontline workers start and end their days with our software, with billions of shifts managed annually through UKG solutions today. Yet it's our AI-powered product portfolio designed to support customers of all sizes, industries, and geographies that will propel us into an even brighter tomorrow! UKGCareers@ukg.com

Posted 1 month ago

Apply
