
6093 Scala Jobs - Page 27

JobPe aggregates listings for easy access; applications are submitted directly on the original job portal.

6.0 - 10.0 years

0 Lacs

Pune, Maharashtra

On-site

As a Senior Software Engineer at Mastercard, you will provide support for application software through programming, analysis, design, development, and delivery of software solutions. You will collaborate with stakeholders to understand business needs and ensure that project implementations and technical deliveries align with the solution architectural design and best practices. Your major accountabilities will include participating in the design of highly scalable, fault-tolerant, and performant systems in the cloud, providing technical guidance to project teams, and analyzing ITSM activities to enhance operational efficiency. You will also maintain services, scale systems sustainably through automation, and practice sustainable incident response strategies.

To excel in this role, you should have a Bachelor's degree in Information Technology, Computer Science, or a relevant field, along with at least 6 years of hands-on software development experience. You should possess strong knowledge of Java 8 or later, experience with both relational and NoSQL databases, and familiarity with DevOps and IT Operations best practices. Additionally, experience with data analytics, ETL, data modeling, and pattern analysis is desired. The ideal candidate will be willing to learn new technology stacks, collaborate with global teams across different time zones, and stay current with new technologies through self-study and continuous learning. Strong written and spoken communication skills are essential for effectively explaining technical issues and solution strategies to stakeholders.

If you are a creative problem-solver, agile thinker, and innovative risk-taker who thrives in a collaborative environment where agility is key, then this role is for you. Join us at Mastercard and be part of a team that is revolutionizing the ecommerce landscape and reshaping the future of fraud prevention through collaboration and innovation.

Posted 1 week ago

Apply

3.0 years

0 Lacs

Bhubaneswar, Odisha, India

On-site

Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must-have skills: Python (Programming Language)
Good-to-have skills: NA
Minimum experience: 3 years
Educational Qualification: 15 years of full-time education

Summary: As an Application Developer, you will engage in the design, construction, and configuration of applications tailored to fulfill specific business processes and application requirements. Your typical day will involve collaborating with team members to understand project needs, developing innovative solutions, and ensuring that applications are optimized for performance and usability. You will also participate in testing and debugging processes to guarantee the quality and functionality of the applications you create, while continuously seeking ways to enhance existing systems and processes.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation and contribution in team discussions is required.
- Contribute to providing solutions for work-related problems.
- Assist in the documentation of application specifications and user guides.
- Engage in code reviews to ensure adherence to best practices and standards.

Professional & Technical Skills:
- Good-to-have skills: PySpark, AWS, Airflow, Databricks, SQL, Scala
- 4+ years of experience in Python
- Candidate must be a strong hands-on senior developer
- Candidate must possess good technical and non-technical communication skills to highlight areas of concern or risk
- Should have good troubleshooting skills for root-cause analysis of production support issues

Additional Information:
- The candidate should have a minimum of 3 years of experience in Python (Programming Language).
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.
- The candidate must be willing to work in Shift B, i.e., daily 9/10 PM IST.

Posted 1 week ago

Apply

3.0 - 7.0 years

0 Lacs

Karnataka

On-site

As an Infoscion, your primary responsibility is to interact with clients to address quality assurance issues and ensure the utmost customer satisfaction. You will be involved in understanding requirements, creating and reviewing designs, validating architecture, and delivering high-quality service offerings in the technology domain. Participating in project estimation, providing inputs for solution delivery, conducting technical risk planning, and performing code reviews and unit test plan reviews are crucial aspects of your role. Leading and guiding your teams towards developing optimized code deliverables, continuous knowledge management, and adherence to organizational guidelines and processes are also key responsibilities. You will play a significant role in building efficient programs and systems. If you believe you have the skills to assist clients in their digital transformation journey, this is the ideal place for you to thrive.

In addition to the primary responsibilities, you are expected to have knowledge of multiple technologies, a basic understanding of architecture and design fundamentals, and familiarity with testing tools and agile methodologies. Understanding project life cycle activities, estimation methodologies, quality processes, and business domains is essential, as are analytical abilities, strong technical skills, good communication skills, and a deep understanding of technology and domains. Furthermore, you should demonstrate a solid understanding of software quality assurance principles, SOLID design principles, and modeling methods. Keeping abreast of the latest technologies and trends, and possessing excellent problem-solving, analytical, and debugging skills, are highly valued.

Preferred Skills: Functional Programming - Scala
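For context on what "Functional Programming - Scala" typically implies in roles like this, here is a minimal, self-contained sketch (illustrative only, not taken from the posting) of the functional style: immutable data, higher-order functions, and expression-oriented code.

```scala
// Illustrative sketch only: immutable case classes plus
// filter/map/sum instead of mutable accumulators.
object OrderStats {
  final case class Order(id: Int, amountInr: BigDecimal)

  // Total value of all orders above a threshold.
  def totalAbove(orders: List[Order], threshold: BigDecimal): BigDecimal =
    orders.filter(_.amountInr > threshold).map(_.amountInr).sum

  def main(args: Array[String]): Unit = {
    val orders = List(Order(1, 1200), Order(2, 80), Order(3, 450))
    println(totalAbove(orders, BigDecimal(100))) // prints 1650
  }
}
```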

Posted 1 week ago

Apply

12.0 - 18.0 years

0 Lacs

Noida, Uttar Pradesh

On-site

As a seasoned Manager - Data Engineering with 12-18 years of total experience in data engineering, including 3-5 years in a leadership/managerial role, you will lead complex data platform implementations using Databricks or the Apache data stack. Your key responsibilities will include leading high-impact data engineering engagements for global clients, delivering scalable solutions, and driving digital transformation. You must have hands-on experience in Databricks or the core Apache stack (Spark, Kafka, Hive, Airflow, NiFi, etc.) and expertise in one or more cloud platforms such as AWS, Azure, or GCP, ideally with Databricks on the cloud. Strong programming skills in Python, Scala, and SQL are essential, along with experience in building scalable data architectures, delta lakehouses, and distributed data processing. Familiarity with modern data governance, cataloging, and data observability tools is also required.

Your role will involve leading the architecture, development, and deployment of modern data platforms using Databricks, Apache Spark, Kafka, Delta Lake, and other big data tools. You will design and implement data pipelines (batch and real-time), data lakehouses, and large-scale ETL frameworks. Additionally, you will own delivery accountability for data engineering programs across BFSI, telecom, healthcare, or manufacturing clients. Collaboration with global stakeholders, product owners, architects, and business teams to understand requirements and deliver data-driven outcomes will be a key part of your responsibilities. Ensuring best practices in DevOps, CI/CD, infrastructure-as-code, data security, and governance is crucial. You will manage and mentor a team of 10-25 engineers, conducting performance reviews, capability building, and coaching. Supporting presales activities, including solutioning, technical proposals, and client workshops, will also be part of your role.

At GlobalLogic, we prioritize a culture of caring and continuous learning and development. You'll have the opportunity to work on interesting and meaningful projects that have a real impact. We offer balance and flexibility, ensuring that you can achieve the right equilibrium between work and life. As a high-trust organization, integrity is key, and you can trust that you are part of a safe, reliable, and ethical global company. GlobalLogic, a Hitachi Group Company, is a trusted digital engineering partner known for creating innovative digital products and experiences. As part of our team, you'll collaborate with forward-thinking companies to transform businesses and redefine industries through intelligent products, platforms, and services.

Posted 1 week ago

Apply

1.0 - 5.0 years

0 Lacs

Karnataka

On-site

As a GenAI Data Scientist at PwC US - Acceleration Center, you will be responsible for developing and implementing machine learning models and algorithms for GenAI projects. Your role will involve collaborating with product, engineering, and domain experts to identify high-impact opportunities; designing and building GenAI and agentic AI solutions; processing structured and unstructured data for LLM workflows; validating and evaluating models; containerizing and deploying production workloads; communicating findings via various mediums; and staying current with GenAI advancements.

You should possess a Bachelor's or Master's degree in Computer Science, Data Science, Statistics, or a related field, along with 1-2 years of hands-on experience delivering GenAI solutions and 3-5 years of experience deploying machine learning solutions in production environments. Proficiency in Python, experience with vector stores and search technologies, familiarity with LLM-backed agent frameworks, expertise in data preprocessing and feature engineering, competence with cloud services, a solid grasp of Git workflows and CI/CD pipelines, and proficiency in data visualization are essential requirements.

Additionally, relevant certifications in GenAI tools, hands-on experience with leading agent orchestration platforms, expertise in chatbot design, practical knowledge of ML/DL frameworks, and proficiency in object-oriented programming with languages like Java, C++, or C# are considered nice-to-have skills. The ideal candidate should possess strong problem-solving skills, a collaborative mindset, and the ability to thrive in a fast-paced environment. If you are passionate about leveraging data to drive insights and make informed business decisions, this role offers an exciting opportunity to contribute to cutting-edge GenAI projects and drive innovation in the field of data science.

Posted 1 week ago

Apply

12.0 - 18.0 years

0 Lacs

Noida, Uttar Pradesh

On-site

We are looking for an experienced Manager - Data Engineering with expertise in Databricks or the Apache data stack to lead complex data platform implementations. In this role, you will spearhead high-impact data engineering projects for global clients, delivering scalable solutions and catalyzing digital transformation. You should have 12-18 years of total experience in data engineering, with at least 3-5 years in a leadership or managerial capacity. Hands-on experience with Databricks or core Apache stack components such as Spark, Kafka, Hive, Airflow, NiFi, etc., is essential. Proficiency in one or more cloud platforms like AWS, Azure, or GCP is preferred, ideally with Databricks on the cloud. Strong programming skills in Python, Scala, and SQL are required, along with experience in building scalable data architectures, delta lakehouses, and distributed data processing. Familiarity with modern data governance, cataloging, and data observability tools is advantageous.

Your responsibilities will include leading the architecture, development, and deployment of modern data platforms utilizing Databricks, Apache Spark, Kafka, Delta Lake, and other big data tools. You will design and implement data pipelines (batch and real-time), data lakehouses, and large-scale ETL frameworks, ensuring delivery accountability for data engineering programs across various industries. Collaboration with global stakeholders, product owners, architects, and business teams to understand requirements and deliver data-driven outcomes will be a key aspect of your role. Additionally, you will be responsible for ensuring best practices in DevOps, CI/CD, infrastructure-as-code, data security, and governance. Managing and mentoring a team of 10-25 engineers, including performance reviews, capability building, and coaching, will also be part of your responsibilities.

At GlobalLogic, we prioritize a culture of caring where people come first. You will have opportunities for continuous learning and development, engaging in interesting and meaningful work that makes an impact. We believe in providing balance and flexibility to help you integrate your work and life effectively. GlobalLogic is a high-trust organization built on integrity and ethical values, providing a safe and reliable environment for your professional growth and success. GlobalLogic, a Hitachi Group Company, is a trusted digital engineering partner known for collaborating with leading companies worldwide to create innovative digital products and experiences. Join us to be a part of transforming businesses and redefining industries through intelligent products, platforms, and services.

Posted 1 week ago

Apply

2.0 - 6.0 years

0 Lacs

Haryana

On-site

As a Data Engineering Specialist, you will be responsible for assessing, capturing, and translating complex business issues into structured technical tasks for the data engineering team. This includes designing, building, launching, optimizing, and extending full-stack data and business intelligence solutions. Your role will involve supporting the build of big data environments, focusing on improving data pipelines and data quality, and working with stakeholders to meet business needs. You will create data access tools for the analytics and data science team, conduct code reviews, assist other developers, and train team members as required. Additionally, you will ensure that developed systems comply with industry standards and best practices while meeting project requirements.

To excel in this role, you should possess a Bachelor's degree in computer science, engineering, or an equivalent field, or relevant experience. Certification in cloud technologies, especially Azure, would be beneficial. You should have 2-3+ years of development experience building and maintaining ETL/ELT pipelines over various sources, along with operational programming tasks. Experience with Apache data projects or cloud platform equivalents and proficiency in programming languages like Python, Scala, R, Java, Golang, Kotlin, C, or C++ is required.

Your work will involve collaborating closely with data scientists, machine learning engineers, and stakeholders to understand requirements and develop data-driven solutions. Troubleshooting, debugging, and resolving issues within generative AI system development, as well as documenting processes, specifications, and training procedures, will be part of your responsibilities. In summary, this role requires a strong background in data engineering, proficiency in cloud technologies, experience with data projects and programming languages, and the ability to collaborate effectively with various stakeholders to deliver high-quality data solutions.

Posted 1 week ago

Apply

5.0 - 9.0 years

0 Lacs

Hyderabad, Telangana

On-site

As a Lead Software Engineer at JPMorgan Chase within the Consumer & Community Banking team, you play a crucial role in an agile team dedicated to enhancing, building, and delivering trusted, market-leading technology products with a focus on security, stability, and scalability. Your responsibilities include executing creative software solutions and designing, developing, and troubleshooting technical issues with an innovative mindset. You are expected to develop high-quality, secure production code, review and debug code written by team members, and identify opportunities to automate remediation processes for enhanced operational stability.

In this role, you will lead evaluation sessions with external vendors, startups, and internal teams to assess architectural designs and technical applicability within existing systems. Additionally, you will drive awareness and adoption of new technologies within software engineering communities, contributing to a diverse, inclusive, and respectful team culture.

To excel in this position, you should possess formal training or certification in software engineering concepts along with at least 5 years of practical experience. Strong proficiency in database systems, both SQL and NoSQL, and programming languages like Python, Java, or Scala is essential. Experience in data architecture, data modeling, data warehousing, and data lakes, as well as implementing complex ETL transformations on big data platforms, will be beneficial. Proficiency in the software development life cycle and agile practices such as CI/CD, application resiliency, and security is required. The ideal candidate will have hands-on experience with software applications and technical processes within a specific discipline (e.g., cloud, artificial intelligence, machine learning) and a background in the financial services industry. Practical experience with cloud-native technologies is highly desirable, and additional Java and data programming experience is considered a plus for this role.

Posted 1 week ago

Apply

10.0 - 14.0 years

0 Lacs

Hyderabad, Telangana

On-site

We are seeking a Senior Staff Engineer to contribute to the development of innovative programs and products tailored to the requirements of Experian's clients, particularly those in the financial services sector. Our focus includes addressing critical questions such as enhancing the robustness and scalability of our loan origination modeling approach, as well as crucial aspects of the lending industry such as bias, fairness, and explainable AI. As an evolving team, we embody a startup mindset within the framework of a larger organization. Our emphasis lies on agility, impact, and transforming the organizational culture around us through our outcomes and operational methodologies.

Your responsibilities will include taking on challenging assignments within the development teams, offering substantial technical expertise across the entire development cycle, and guiding junior team members. You will play a pivotal role in executing technical and business strategies, ensuring the achievement of functional objectives, and comprehensively supporting products by grasping the interconnection of their various components. You will support complex software development projects by contributing to planning and system design and mentoring junior developers, innovate and architect solutions for intricate technical issues or system enhancements, and steer the technical direction for product development, encompassing technology selection and enhancement plans.

Your tasks will involve developing Java and Scala components for our analytics product platforms on AWS, actively engaging with the platform and applications, and collaborating with geographically dispersed cross-functional teams to elevate the value of our analytics offerings. Additionally, you will help enhance the product to optimize cost efficiency while maximizing scalability and stability. You will report to a Senior Manager, with your primary workplace in Hyderabad and a requirement to work from the office two days a week in a hybrid model.

Key Skills Required:
- Proficiency in distributed data processing frameworks like Spark
- Familiarity with public cloud platforms such as AWS, Azure, GCP (preferably AWS)
- Experience with Docker, Kubernetes, CI/CD pipelines, and observability tools
- Hands-on expertise in Scala, Java, and Python

Qualifications:
- Over 10 years of industry experience in object-oriented programming and asynchronous programming
- Bachelor's degree in computer science or a related field

We welcome individuals who are passionate about leveraging their technical expertise to drive innovation and contribute to the growth and success of our dynamic team.

Posted 1 week ago

Apply

3.0 - 7.0 years

0 Lacs

Karnataka

On-site

About GlobalLogic
GlobalLogic, a leader in digital product engineering with over 30,000 employees, helps brands worldwide design and develop innovative products, platforms, and digital experiences. By integrating experience design, complex engineering, and data expertise, GlobalLogic assists clients in envisioning possibilities and accelerating their transition into the digital businesses of tomorrow. Operating design studios and engineering centers globally, GlobalLogic extends its deep expertise to customers in industries such as communications, financial services, automotive, healthcare, technology, media, manufacturing, and semiconductor. GlobalLogic is a Hitachi Group Company.

Requirements

Leadership & Strategy
You will lead and mentor a team of cloud engineers, providing technical guidance and support for career development. You will define cloud architecture standards and best practices across the organization and collaborate with senior leadership to develop a cloud strategy and roadmap aligned with business objectives. Your responsibilities will include driving technical decision-making for complex cloud infrastructure projects and establishing and maintaining cloud governance frameworks and operational procedures.

Leadership Experience
With a minimum of 3 years in technical leadership roles managing engineering teams, you should have a proven track record of successfully delivering large-scale cloud transformation projects. Experience in budget management and resource planning, along with strong presentation and communication skills for executive-level reporting, is essential for this role.

Certifications (Preferred)
Google Cloud Professional Cloud Architect, Google Cloud Professional Data Engineer, and additional relevant cloud or security certifications.

Technical Excellence
You should have over 10 years of experience designing and implementing enterprise-scale cloud solutions using GCP services. As a technical expert, you will architect and oversee the development of sophisticated cloud solutions using Python and advanced GCP services. Your role will involve leading the design and deployment of solutions utilizing Cloud Functions, Docker containers, Dataflow, and other GCP services. Additionally, you will design complex integrations with multiple data sources and systems, implement security best practices, and troubleshoot and resolve technical issues while establishing preventive measures.

Job Responsibilities

Technical Skills
Your expertise should include expert-level proficiency in Python and experience in additional languages such as Java, Go, or Scala. Deep knowledge of GCP services like Dataflow, Compute Engine, BigQuery, and Cloud Functions is required, along with advanced knowledge of Docker, Kubernetes, and container orchestration patterns, and experience in cloud security, infrastructure as code, and CI/CD practices.

Cross-functional Collaboration
Collaborating with C-level executives, senior architects, and product leadership to translate business requirements into technical solutions; leading cross-functional project teams; presenting technical recommendations to executive leadership; and establishing relationships with GCP technical account managers are key aspects of this role.

What We Offer
At GlobalLogic, we prioritize a culture of caring, continuous learning and development, interesting and meaningful work, balance and flexibility, and a high-trust organization. Join us to experience an inclusive culture, opportunities for growth and advancement, impactful projects, work-life balance, and a safe, reliable, and ethical global company.

About GlobalLogic
GlobalLogic, a Hitachi Group Company, is a trusted digital engineering partner known for creating innovative digital products and experiences since 2000. Collaborating with forward-thinking companies globally, GlobalLogic continues to transform businesses and redefine industries through intelligent products, platforms, and services.

Posted 1 week ago

Apply

3.0 - 7.0 years

0 Lacs

Haryana

On-site

As a Java with Hadoop Developer at Airlinq in Gurgaon, India, you will play a vital role in collaborating with the Engineering and Development teams to establish and maintain a robust testing and quality program for Airlinq's products and services. Your responsibilities will include, but are not limited to:
- Being part of a team focused on creating end-to-end IoT solutions using Hadoop to address various industry challenges.
- Building quick prototypes and demonstrations to showcase the value of technologies such as IoT, Machine Learning, Cloud, Micro-Services, DevOps, and AI to management.
- Developing reusable components, frameworks, and accelerators to streamline the development cycle of future IoT projects.
- Operating effectively with minimal supervision and guidance.
- Configuring cloud platforms for specific use cases.

To excel in this role, you should have a minimum of 3 years of IT experience, with at least 2 years dedicated to working with cloud technologies like AWS or Azure. You must possess expertise in designing and implementing highly scalable enterprise applications and establishing continuous integration environments on the targeted cloud platform. Proficiency in Java and the Spring Framework, along with strong knowledge of IoT principles, connectivity, security, and data streams, is essential. Familiarity with emerging technologies such as Big Data, NoSQL, Machine Learning, AI, and Blockchain is also required.

Additionally, you should be adept at utilizing Big Data technologies like Hadoop, Pig, Hive, and Spark, with hands-on experience on any Hadoop platform. Experience in workload migration between on-premise and cloud environments, programming with MapReduce and Spark, and working with Java (core Java), J2EE technologies, Python, Scala, Unix, and Bash scripts is crucial. Strong analytical, problem-solving, and research skills are necessary, along with the ability to think innovatively and independently.

This position requires 3-7 years of relevant work experience and is based in Gurgaon. The ideal educational background is a B.E./B.Tech. or M.E./M.Tech. in Computer Science or Electronics Engineering, or an MCA.

Posted 1 week ago

Apply

2.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Line of Service: Advisory
Industry/Sector: Not Applicable
Specialism: Risk
Management Level: Associate

Job Description & Summary: In-depth knowledge of application development processes and at least one programming and one scripting language (e.g., Java, Scala, C#, JavaScript, Angular, ReactJS, Ruby, Perl, Python, Shell) is required, along with knowledge of OS security (Windows, Unix/Linux systems, macOS, VMware), network security, and cloud security.

Why PwC: At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us.

At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm's growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.

We are seeking a professional to join our Cybersecurity and Privacy services team, where you will have the opportunity to help clients implement effective cybersecurity programs that protect against threats.

Responsibilities:
L1 - Minimum 2 years of relevant experience in SOC/Incident Management/Incident Response/Threat Detection Engineering/Vulnerability Management/SOC Platform Management/Automation/Asset Integration/Threat Intel Management/Threat Hunting.
L2 - Minimum 4 years of relevant experience in the same areas.
- Round-the-clock threat monitoring and detection
- Analysis of suspicious, malicious, and abnormal behavior
- Alert triage, initial assessment, incident validation, and assessment of severity and urgency
- Prioritization of security alerts and creation of incidents as per SOPs
- Reporting and escalation to stakeholders
- Post-incident analysis
- Consistent incident triage and recommendations using playbooks
- Development and maintenance of incident management and incident response policies and procedures
- Preservation of security alert and security incident artefacts for forensic purposes
- Adherence to Service Level Agreements (SLAs) and KPIs
- Reduction in Mean Time to Detection and Response (MTTD and MTTR)

Mandatory skill sets: Certified SOC Analyst (EC-Council), Computer Hacking Forensic Investigator (EC-Council), Certified Ethical Hacker (EC-Council), CompTIA Security+, CompTIA CySA+ (Cybersecurity Analyst), GIAC Certified Incident Handler (GCIH) or equivalent.
Product certifications (preferred): Product certifications on SOC security tools such as SIEM/Vulnerability Management/DAM/UBA/SOAR/NBA, etc.
Preferred skill sets: SOC - Splunk
Years of experience required: 2-5 years
Education qualification: B.Tech/MCA/MBA with IT background, or a Bachelor's degree in Information Technology, Cybersecurity, or Computer Science
Degrees/Field of Study required: Master of Business Administration, Bachelor of Engineering
Required Skills: SOC
Optional Skills: Accepting Feedback, Active Listening, Agile Methodology, Azure Data Factory, Communication, Cybersecurity, Cybersecurity Framework, Cybersecurity Policy, Cybersecurity Requirements, Cybersecurity Strategy, Emotional Regulation, Empathy, Encryption Technologies, Inclusion, Intellectual Curiosity, Managed Services, Optimism, Privacy Compliance, Regulatory Response, Security Architecture, Security Compliance Management, Security Control, Security Incident Management, Security Monitoring {+ 3 more}
Travel Requirements: Not Specified
Available for Work Visa Sponsorship?: No
Government Clearance Required?: No

Posted 1 week ago

Apply

5.0 - 9.0 years

0 Lacs

Indore, Madhya Pradesh

On-site

Are you passionate about building scalable data pipelines and working with real-time streaming platforms? Join our growing team as a Data Engineer and help power next-gen data solutions!

As a Data Engineer, you will be responsible for designing and maintaining real-time data pipelines using Apache Kafka and writing efficient, optimized SQL queries for data extraction and transformation. You will build robust ETL/ELT processes for structured and unstructured data and collaborate with analysts, data scientists, and developers to deliver insights. Ensuring data quality, security, and performance optimization will be a crucial part of your role, as will integration with tools like Spark, Airflow, or Snowflake (as applicable).

We value proficiency in Apache Kafka, Kafka Streams, or Kafka Connect; strong skills in SQL, Python/Scala, and cloud platforms (AWS/GCP/Azure); experience with data lakes, message queues, and large-scale systems; and a problem-solving mindset with a passion for clean, efficient code.

Working with us means exciting projects with global clients, a collaborative and innovation-driven environment, flexible working options, and competitive compensation. If you are excited about this opportunity, apply now at yukta.sengar@in.spundan.com or tag someone perfect for this role!
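To picture the kind of pipeline work this posting describes, here is a minimal Kafka consumer in Scala using the standard org.apache.kafka client (a hypothetical sketch; the broker address, group id, and topic name are placeholders, not details from the posting):

```scala
import java.time.Duration
import java.util.Properties
import org.apache.kafka.clients.consumer.KafkaConsumer
import scala.jdk.CollectionConverters._

// Hypothetical sketch: poll a topic and print each record.
object PipelineConsumer {
  def main(args: Array[String]): Unit = {
    val props = new Properties()
    props.put("bootstrap.servers", "localhost:9092") // placeholder broker
    props.put("group.id", "demo-pipeline")           // placeholder group
    props.put("key.deserializer",
      "org.apache.kafka.common.serialization.StringDeserializer")
    props.put("value.deserializer",
      "org.apache.kafka.common.serialization.StringDeserializer")

    val consumer = new KafkaConsumer[String, String](props)
    consumer.subscribe(List("events").asJava) // placeholder topic
    try {
      while (true) {
        // Each poll returns a batch; a real pipeline would transform
        // and load these records rather than print them.
        for (record <- consumer.poll(Duration.ofMillis(500)).asScala)
          println(s"${record.key}: ${record.value}")
      }
    } finally consumer.close()
  }
}
```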

Posted 1 week ago

Apply

3.0 - 7.0 years

0 Lacs

Noida, Uttar Pradesh

On-site

As a Dynatrace Developer/Consultant, you will be responsible for setting up and maintaining monitoring systems to track the health and performance of data pipelines. Your role will involve configuring alerts and notifications to promptly identify and respond to issues or anomalies in data pipelines. You will develop procedures and playbooks for incident response and resolution, collaborating with data engineers to optimize data flows and processing. Your experience with data, ETL, data warehouse, and BI projects will be invaluable as you continuously monitor and analyze pipeline performance to identify bottlenecks and areas for improvement. Implementing logging mechanisms and error-handling strategies will be crucial for capturing and analyzing pipeline failures for quick detection and troubleshooting.

Working closely with data engineers and data analysts, you will monitor data quality metrics, detect data anomalies, and develop processes to address data quality issues. Forecasting resource requirements based on data growth and usage patterns will ensure that pipelines can handle increasing data volumes without performance degradation. You will develop and maintain dashboards and reports to visualize key pipeline performance metrics, giving stakeholders insight into system health and data flow, and you will automate monitoring tasks and develop tools for streamlined management and observability of data pipelines.

You will ensure that data pipeline observability aligns with security and compliance standards, such as data privacy regulations and industry best practices, and document monitoring processes, best practices, and system configurations, sharing knowledge with team members to improve overall data pipeline reliability and efficiency. Collaborating with cross-functional teams, including data engineers, data scientists, and IT operations, you will troubleshoot issues and implement improvements. Keeping abreast of the latest developments in data pipeline monitoring and observability technologies and practices will enable you to recommend and implement advancements.

Knowledge of AWS Glue, S3, and Athena is nice to have, along with experience with JIRA and knowledge of a programming language such as Python, Java, or Scala. This is a full-time position with a Monday-to-Friday schedule and an in-person work location.

Posted 1 week ago

Apply

8.0 - 12.0 years

0 Lacs

Hyderabad, Telangana

On-site

As a Data Engineer at our organization, you will be responsible for designing, implementing, and maintaining data pipelines and data integration solutions using Azure Synapse. Your role will involve developing and optimizing data models and data storage solutions on Azure. You will collaborate closely with data scientists and analysts to implement data processing and data transformation tasks. Ensuring data quality and integrity through data validation and cleansing methodologies will be a key aspect of your responsibilities. Your duties will also include monitoring and troubleshooting data pipelines to identify and resolve performance issues promptly, and collaborating with cross-functional teams to understand and prioritize data requirements. You are expected to stay up to date with the latest trends and technologies in data engineering and Azure services to contribute effectively to the team.

To be successful in this role, you need a Bachelor's degree in IT, computer science, computer engineering, or a related field, along with a minimum of 8 years of experience in data engineering. Proficiency in Microsoft Azure Synapse Analytics is crucial, including experience with Azure Data Factory, Dedicated SQL Pool, Lake Database, and Azure Storage. Hands-on experience with Spark notebooks (Python or Scala) is mandatory for this position.

Your expertise should also cover end-to-end data warehouse experience: ingestion, ETL, big data pipelines, data architecture, message queuing, BI/reporting, and data security. Advanced SQL and relational database knowledge, as well as demonstrated experience in designing and delivering data platforms for Business Intelligence and Data Warehouse, are required. Strong analytical abilities to handle and analyze complex, high-volume data with attention to detail are essential. Familiarity with data modeling and data warehousing concepts such as Data Vault or 3NF, along with experience in data governance (quality, lineage, data dictionary, and security), is preferred. Knowledge of Agile methodology and working environments is beneficial, and you should be able to work independently with Product Owners, Business Analysts, and Architects.

Join us at NTT DATA Business Solutions, where we empower you to transform SAP solutions into value. If you have any questions regarding this job opportunity, please reach out to our recruiter, Pragya Kalra, at Pragya.Kalra@nttdata.com.
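Since the posting calls for hands-on Spark notebooks (Python or Scala) and data validation, a short Scala sketch of a common validate-and-quarantine pattern may help picture the work (illustrative only; the file paths and column names are hypothetical, not from the posting):

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.col

// Illustrative sketch: split raw rows into valid records and rejects,
// a typical validation/cleansing step in a Spark notebook.
object ValidateOrders {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("validate").getOrCreate()

    // Hypothetical source and columns.
    val raw = spark.read.option("header", "true").csv("/data/orders.csv")

    val valid   = raw.filter(col("order_id").isNotNull && col("amount") > 0)
    val rejects = raw.exceptAll(valid)

    valid.write.mode("overwrite").parquet("/curated/orders")
    rejects.write.mode("overwrite").parquet("/quarantine/orders")
    spark.stop()
  }
}
```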

Posted 1 week ago

Apply

5.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Company Description
Experian is a global data and technology company, powering opportunities for people and businesses around the world. We help to redefine lending practices, uncover and prevent fraud, simplify healthcare, create marketing solutions, and gain deeper insights into the automotive market, all using our unique combination of data, analytics and software. We also assist millions of people to realize their financial goals and help them save time and money. We operate across a range of markets, from financial services to healthcare, automotive, agribusiness, insurance, and many more industry segments. We invest in people and new advanced technologies to unlock the power of data. As a FTSE 100 Index company listed on the London Stock Exchange (EXPN), we have a team of 22,500 people across 32 countries. Our corporate headquarters are in Dublin, Ireland. Learn more at experianplc.com.

Job Description
- Design, develop, and maintain high-quality software solutions.
- Collaborate with cross-functional teams to define, design, and ship new features.
- Strong programming knowledge, including design patterns and debugging, in Java or Scala.
- Design and implement data engineering frameworks on HDFS, Spark, and EMR.
- Implement and manage Kafka streaming and containerized microservices.
- Work with RDBMS (Aurora MySQL) and NoSQL (Cassandra) databases.
- Utilize AWS Cloud services such as S3, EFS, MSK, ECS, EMR, etc.
- Ensure the performance, quality, and responsiveness of applications.
- Troubleshoot and resolve software defects and issues.
- Write clean, maintainable, and efficient code.
- Participate in code reviews and contribute to team knowledge sharing.

You will report to a Senior Manager. This hybrid role requires working from the Hyderabad office two days a week.

Qualifications
- Engineer with 5+ years of hands-on experience and strong coding skills, preferably in Scala and Java.
- Experience with data engineering: big data, EMR, Airflow, Spark, Athena.
- AWS Cloud experience: S3, EFS, MSK, ECS, EMR, etc.
- Experience with Kafka streaming and containerized microservices.
- Knowledge of and experience with RDBMS (Aurora MySQL) and NoSQL (Cassandra) databases.

Additional Information
Our uniqueness is that we truly celebrate yours. Experian's culture and people are important differentiators. We take our people agenda very seriously and focus on what truly matters: DEI, work/life balance, development, authenticity, engagement, collaboration, wellness, reward & recognition, volunteering... the list goes on. Experian's strong people-first approach is award-winning: Great Place To Work™ in 24 countries, FORTUNE Best Companies to Work For, and Glassdoor Best Places to Work (globally 4.4 stars), to name a few. Check out Experian Life on social media or our Careers site to understand why.

Experian is proud to be an Equal Opportunity and Affirmative Action employer. Innovation is a critical part of Experian's DNA and practices, and our diverse workforce drives our success. Everyone can succeed at Experian and bring their whole self to work, irrespective of their gender, ethnicity, religion, color, sexuality, physical ability or age. If you have a disability or special need that requires accommodation, please let us know at the earliest opportunity.

Benefits
Experian cares for employees' work-life balance, health, safety and wellbeing. In support of this endeavor, we offer best-in-class family well-being benefits, enhanced medical benefits and paid time off.

Experian Careers - Creating a better tomorrow together
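As a rough illustration of the HDFS/Spark/Kafka combination this posting names (a hypothetical sketch, not Experian's actual code; the broker, topic, and paths are placeholders), here is a minimal Spark Structured Streaming job in Scala that reads from Kafka and lands raw events in storage:

```scala
import org.apache.spark.sql.SparkSession

// Hypothetical sketch: stream Kafka events into Parquet files.
object StreamIngest {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("stream-ingest").getOrCreate()

    val events = spark.readStream
      .format("kafka")
      .option("kafka.bootstrap.servers", "localhost:9092") // placeholder
      .option("subscribe", "analytics-events")             // placeholder
      .load()
      .selectExpr("CAST(value AS STRING) AS payload")

    // Write raw payloads to storage as they arrive; the checkpoint
    // location lets the job recover after restarts.
    events.writeStream
      .format("parquet")
      .option("path", "/data/raw/events")                  // placeholder
      .option("checkpointLocation", "/data/checkpoints/events")
      .start()
      .awaitTermination()
  }
}
```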

Posted 1 week ago

Apply

2.0 - 4.0 years

0 Lacs

Mysore, Karnataka, India

On-site

Training-related experience

Must have:
- Teaching experience: conducting training sessions in the classroom and dynamically responding to the different capabilities of learners; experience in analyzing feedback from sessions and identifying action areas for self-improvement
- Developing teaching material: experience in developing teaching material, including exercises and assignments
- Good presentation skills and excellent oral/written communication skills

Nice to have:
- Teaching experience: experience in delivering sessions over virtual classrooms
- Instructional design: developing engaging content
- Designing assessments: experience in designing assessments to evaluate the effectiveness of training and gauge the proficiency of the learner
- Participation in software development lifecycle activities such as development, testing, and configuration management

Job Responsibilities:
- Develop teaching materials, including exercises and assignments
- Conduct classroom training / virtual training
- Design assessments
- Enhance course material and course delivery based on feedback to improve training effectiveness

Location: Mysore, Mangalore, Bangalore, Chennai, Pune, Hyderabad, Chandigarh

Description of the Profile
We are looking for trainers with 2 to 4 years of teaching experience and technology know-how in one or more of the following areas:
- Java – Java programming, Spring, Angular / React, Bootstrap
- Microsoft – C# programming, SQL Server, ADO.NET, ASP.NET, MVC design pattern, Azure, MS Power platforms, MS Dynamics 365 CRM, MS Dynamics 365 ERP, SharePoint
- Testing – Selenium, Microfocus UFT, Microfocus ALM tools, SOA testing, SOAPUI, REST Assured, Appium
- Big Data – Python programming, Hadoop, Spark, Scala, MongoDB, NoSQL
- SAP – SAP ABAP programming / SAP MM / SAP SD / SAP BI / SAP S4 HANA
- Oracle – Oracle E-Business Suite (EBS) / PeopleSoft / Siebel CRM / Oracle Cloud / OBIEE / Fusion Middleware
- API and integration – API, Microservices, TIBCO, Apigee, Mule
- Digital Commerce – Salesforce, Adobe Experience Manager
- Digital Process Automation – PEGA, Appian, Camunda, Unqork, UiPath
- MEAN / MERN stacks
- Business Intelligence – SQL Server, ETL using SQL Server, analysis using SQL Server, enterprise reporting using SQL, visualization
- Data Science – Python for data science, machine learning, exploratory data analysis, statistics & probability
- Cloud & Infrastructure Management – network administration / database administration / Windows administration / Linux administration / middleware administration / end-user computing / ServiceNow; cloud platforms like AWS / GCP / Azure / Oracle Cloud; virtualization
- Cybersecurity – infra security / identity & access management / application security / governance & risk compliance / network security
- Mainframe – COBOL, DB2, CICS, JCL
- Open source – Python, PHP, Unix / Linux, MySQL, Apache, HTML5, CSS3, JavaScript
- DBMS – Oracle / SQL Server / MySQL / DB2 / NoSQL
- Design patterns, Agile, DevOps

Posted 1 week ago

Apply


3.0 - 8.0 years

30 - 35 Lacs

Bengaluru

Work from Office

This role is for a Senior Data Engineer - AI/ML, with a strong development background, whose primary objective will be to contribute to developing and operationalizing platform services and large-scale machine learning pipelines at global scale. We are seeking a talented professional with a solid mix of experience in Big Data and AI/ML systems. The ideal candidate will be able to learn quickly and deliver solutions within strict deadlines in a fast-paced environment, with a passion for optimizing existing solutions and making incremental improvements. Strong interpersonal and effective communication skills, both written and verbal, are essential, as is knowledge of Agile methodologies and common scrum practices and tools. This is a hybrid position; the expectation of days in office will be confirmed by your hiring manager.

Basic Qualifications
- 3+ years of relevant work experience and a Bachelor's degree, OR 5+ years of relevant work experience
- Strong development experience in Python and one or more of the following programming languages: Go, Rust, Java/Scala, C/C++

Preferred Qualifications
- Strong knowledge of Big Data technologies such as Hadoop, Spark, Kafka, Redis, Flink, Airflow, and similar
- Hands-on experience with virtualization, containerization, and distributed compute frameworks like Ray
- Knowledge of DR/HA topologies, with hands-on experience implementing them
- Hands-on experience building and maintaining data and model engineering pipelines and feature engineering pipelines; comfortable with core ML concepts and frameworks
- Well versed in common software development tools, DevOps tools, and Agile methodologies
- Strong interpersonal and effective communication skills, both written and verbal
- Payment domain experience with an AI/ML applications background is a plus

Posted 1 week ago

Apply

7.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Immediate need to hire Java Developers for a global FinTech client.
- Java, Spring Boot, and REST
- Oracle DB
- Good knowledge of data structure and algorithm concepts
- At least 7 years of experience in software product development
- Bachelor's or Master's degree in Computer Science, Engineering, or a closely related quantitative discipline
- Expertise in online payments and related domains is a plus

Requirements:
- Strong skills in Java, Scala, Spark & Raptor, and OO-based design and development.
- Strong skills in Spring Boot, Hibernate, REST, Maven, GitHub, and other open-source Java libraries.
- Excellent problem-solving abilities and a strong understanding of the software development/delivery lifecycle.
- Proven track record working with real-world projects and delivering complex software projects from concept to production, with a focus on scalability, reliability, and performance.
- Good knowledge of data structures and algorithm concepts, as well as database design, tuning, and query optimization.
- Strong debugging and problem-resolution skills, with a focus on automation and test-driven development.
- Ability to work in a fast-paced, iterative development environment.
- Hands-on development experience using Java, Spring Core, and Spring Batch.
- Deep understanding of and extensive experience applying advanced object-oriented design and development principles.
- Experience developing data-driven applications using an industry-standard RDBMS (Oracle, etc.), including strong data architecture and SQL development skills.
- Data modelling skills with relational databases, Elasticsearch (Kibana), and Hadoop.
- Experience with REST APIs, web services, JMS, unit testing, and build tools.

Responsibilities:
- Adhere to the SDLC process and interact with the team on a daily basis.
- Develop efficient, elegant, clean, reusable code with no unnecessary complication or abstraction.
- Manage workload and other assignments efficiently while resolving time-critical situations reliably and professionally.
- Work with various PD teams on integration and post-integration (live) issues.
- Engage in the automation of daily activities that drive operational excellence and ensure highly productive operating procedures.
- Provide weekend and after-hours support for BCDC products and applications on the live site, on a rotating schedule.

The interview process includes coding rounds.

Posted 1 week ago

Apply

0 years

0 Lacs

Pune, Maharashtra, India

On-site

About Us
Maersk is a global leader in integrated logistics and has been an industry pioneer for over a century. Through innovation and transformation, we are redefining the boundaries of possibility, continuously setting new standards for efficiency, sustainability, and excellence. With over 100,000 employees across 130 countries, we work together to shape the future of global trade and logistics. Join us as we harness cutting-edge technologies and unlock opportunities on a global scale. Together, let's sail towards a brighter, more sustainable future with Maersk.

Are you passionate about driving innovation and upskilling teams in the world of data analytics and reporting? Join our dynamic team as a Consultant – Reporting and Technology Enablement and play a pivotal role in enhancing our reporting capabilities while adopting cutting-edge technologies like Databricks. This is a unique opportunity to contribute to the development and success of finance reporting solutions for both headquarter and frontline teams.

About The Role
Success in this role will be defined by the ability to deliver impactful results, including increasing the number of automated reports, driving the adoption of innovative technologies, and reducing the time and cost spent on reporting processes. As a consultant, you will focus on strengthening the technical capabilities of our reporting teams, leading impactful projects, and introducing innovative tools and methodologies. You will collaborate closely with the report development teams to deliver high-quality solutions while automating processes and ensuring efficiency across our financial reporting landscape.

Key Responsibilities

Team Upskilling and Mentorship
- Deliver targeted training sessions to enhance the skills of the reporting team in tools such as Power BI, Excel, Power Query, SQL, and Python.
- Mentor team members and share best practices to ensure the team's success in supporting the finance organization.

End-to-End Project Ownership
- Lead the design, development, and delivery of reporting and analytics projects tailored to the needs of HQ and frontline finance teams.
- Manage all phases of project development, including gathering requirements, data modeling, visualization design, testing, and deployment.
- Engage with stakeholders on a project basis to ensure successful outcomes.

Technology Adoption and Innovation
- Drive the adoption and integration of new technologies, such as Databricks, into reporting workflows to enhance data processing and analytics capabilities.
- Evaluate and recommend tools and solutions to improve reporting efficiency and enable advanced financial analytics.
- Serve as a subject matter expert for Power BI, Databricks, SQL, Python, and emerging technologies.

Automation and Maintenance Support
- Collaborate with the maintenance/run teams to automate and streamline the refresh and maintenance of reports, leveraging SQL and Python for optimized processes.
- Develop scalable solutions to improve the sustainability of the reporting infrastructure.
- Troubleshoot and resolve technical issues, ensuring minimal disruption to operations.

What We're Looking For
- Expertise in Power BI, Excel, and Power Query, with a strong focus on financial reporting and Business Intelligence (BI).
- Experience writing scripts in SQL, Python, Scala, R, DAX, and MDX (see the sketch after this posting for a small scripting example).
- Proficiency in using Databricks, Dremio, and other data technology platforms for advanced analytics and reporting.
- Experience in report automation and data pipeline optimization.
- Strong communication, problem-solving, and project management skills.
- A proactive and collaborative mindset, with the ability to work independently and in teams.

Qualifications
- Master's degree in finance, engineering, technology, or a related field.
- Background in finance, data analytics, or business intelligence.
- Prior experience in training or upskilling teams.
- Familiarity with Agile or similar project management methodologies.

What We Offer
- An opportunity to work with a forward-thinking team driving innovation in reporting.
- A supportive environment for professional growth and development.
- A chance to work with advanced technologies and make a tangible impact on financial reporting and performance management processes.

Maersk is committed to a diverse and inclusive workplace, and we embrace different styles of thinking. Maersk is an equal opportunities employer and welcomes applicants without regard to race, colour, gender, sex, age, religion, creed, national origin, ancestry, citizenship, marital status, sexual orientation, physical or mental disability, medical condition, pregnancy or parental leave, veteran status, gender identity, genetic information, or any other characteristic protected by applicable law. We will consider qualified applicants with criminal histories in a manner consistent with all legal requirements.

We are happy to support your need for any adjustments during the application and hiring process. If you need special assistance or an accommodation to use our website, apply for a position, or to perform a job, please contact us by emailing accommodationrequests@maersk.com.
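To make the scripting expectation above concrete, here is a small, self-contained Scala sketch (purely illustrative; the field names and figures are hypothetical, not Maersk data) of the kind of aggregation often scripted around reporting pipelines:

```scala
// Purely illustrative: total expense rows by cost centre.
object CostCentreTotals {
  final case class Expense(costCentre: String, amount: Double)

  def main(args: Array[String]): Unit = {
    val rows = List(
      Expense("FIN-01", 1250.0),
      Expense("FIN-01", 310.5),
      Expense("OPS-02", 980.0)
    )
    // groupMapReduce groups by key, projects the amount, and sums.
    val totals = rows.groupMapReduce(_.costCentre)(_.amount)(_ + _)
    totals.foreach { case (cc, total) => println(f"$cc: $total%.2f") }
  }
}
```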

Posted 1 week ago

Apply

7.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

At eBay, we're more than a global ecommerce leader — we’re changing the way the world shops and sells. Our platform empowers millions of buyers and sellers in more than 190 markets around the world. We’re committed to pushing boundaries and leaving our mark as we reinvent the future of ecommerce for enthusiasts. Our customers are our compass, authenticity thrives, bold ideas are welcome, and everyone can bring their unique selves to work — every day. We're in this together, sustaining the future of our customers, our company, and our planet. Join a team of passionate thinkers, innovators, and dreamers — and help us connect people and build communities to create economic opportunity for all.

Join Us to Redefine the User Experience at eBay!

We are the eBay Payments Experience team, and we’re on a mission to reimagine how our seller community engages with their finances. Our goal is to deliver experiences that help sellers better understand and handle their financial journey, while ensuring seamless, intuitive interactions that boost satisfaction and trust. We’re innovating at a rapid pace, and we encourage individuals who are motivated by solving new challenges and crafting impactful solutions.

About The Role

As a Fullstack Software Engineer, you’ll team up with designers, product managers, and cross-functional engineers (including frontend, backend, native, machine learning, and quality specialists) to transform the payments experience. The ideal candidate brings excellence in both frontend and backend development, cares deeply about user experience, moves with agility in AI-driven environments, and balances urgency with a commitment to quality.

What You’ll Accomplish
- Lead a functional domain in designing cross-data-center, distributed, fault-tolerant, highly available, and high-performance web services and applications (a brief illustrative sketch follows this listing).
- Apply strong software architecture, object-oriented analysis/design, and problem-solving skills.
- Estimate engineering efforts, plan and implement system changes, and proactively identify and mitigate technical risks.
- Participate in design discussions, code reviews, and project-related team meetings.
- Work with other engineers, Architecture, Product Management, QA, and Operations teams to develop innovative solutions that meet business needs with respect to functionality, performance, scalability, reliability, realistic implementation schedules, and adherence to development principles and product goals.
- Develop technical and domain expertise and apply it to solving product challenges.

What You’ll Bring
- Bachelor’s or Master’s degree with 7+ years of professional experience.
- Hands-on experience building large-scale distributed systems.
- Strong proficiency in Node.js, JavaScript, Java/Scala, and relational/NoSQL databases.
- Expertise in React and backend frameworks.
- Deep understanding of data structures, algorithms, and design patterns, with a consistent track record of developing scalable web and RESTful applications.
- Experience in native development (iOS/Android) and building with GraphQL is highly desirable.
- Familiarity with AI technologies such as workflow design or Retrieval-Augmented Generation (RAG) is a plus.

At eBay, we value diversity of thought, collaborative innovation, and inclusive excellence. Join us in shaping the future of payments and empowering sellers across the globe. Please see the Talent Privacy Notice for information regarding how eBay handles your personal data collected when you use the eBay Careers website or apply for a job with eBay.
eBay is an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, national origin, sex, sexual orientation, gender identity, veteran status, and disability, or other legally protected status. If you have a need that requires accommodation, please contact us at talent@ebay.com. We will make every effort to respond to your request for accommodation as soon as possible. View our accessibility statement to learn more about eBay's commitment to ensuring digital accessibility for people with disabilities.
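As a flavor of the fault-tolerance concerns this role touches, here is a minimal sketch in Scala of one common pattern: retrying an idempotent downstream call with exponential backoff. The fetchPayoutStatus call and its failure behavior are hypothetical stand-ins, not an eBay API.

```scala
// Minimal sketch: retry an idempotent call with exponential backoff.
// `fetchPayoutStatus` is a hypothetical placeholder for a downstream service call.
import scala.util.{Failure, Success, Try}

object RetryingClient {

  // Retry `op` up to `maxAttempts` times, doubling the delay after each failure.
  @annotation.tailrec
  def withRetry[A](maxAttempts: Int, delayMs: Long)(op: () => A): A =
    Try(op()) match {
      case Success(result) => result
      case Failure(_) if maxAttempts > 1 =>
        Thread.sleep(delayMs)
        withRetry(maxAttempts - 1, delayMs * 2)(op)
      case Failure(e) => throw e
    }

  // Hypothetical downstream call that fails transiently ~30% of the time.
  def fetchPayoutStatus(sellerId: String): String =
    if (scala.util.Random.nextDouble() < 0.3) throw new RuntimeException("transient failure")
    else s"payout for $sellerId: SETTLED"

  def main(args: Array[String]): Unit = {
    val status = withRetry(maxAttempts = 4, delayMs = 100)(() => fetchPayoutStatus("seller-42"))
    println(status)
  }
}
```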

Posted 1 week ago

Apply

4.0 - 9.0 years

16 - 25 Lacs

Navi Mumbai, Bengaluru, Mumbai (All Areas)

Hybrid

Role & Responsibilities
- Design and implement scalable data pipelines for feature extraction, transformation, and loading (ETL) using technologies such as PySpark, Scala, and relevant big data frameworks (a brief illustrative sketch follows this listing).
- Govern and optimize data pipelines to ensure high reliability, efficiency, and data quality across on-premise and cloud environments.
- Collaborate closely with data scientists, ML engineers, and business stakeholders to understand data requirements and translate them into technical solutions.
- Implement best practices for data governance, metadata management, and compliance with regulatory requirements.
- Lead a team of data engineers, providing technical guidance and mentorship and fostering a culture of innovation and collaboration.
- Stay updated with industry trends and advancements in big data technologies, and contribute to the continuous improvement of our data engineering practices.

Preferred Candidate Profile
- Strong experience in data engineering, with hands-on experience in designing and implementing data pipelines.
- Strong proficiency in programming languages such as PySpark and Scala, with experience in big data technologies (Cloudera, Hadoop ecosystem).
- Proven leadership experience in managing and mentoring a team of data engineers.
- Experience working in a banking or financial services environment is a plus.
- Excellent communication skills, with the ability to collaborate effectively across teams and stakeholders.
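By way of illustration, the core of such a feature-ETL pipeline might look like the following. This is a minimal sketch in Scala/Spark under assumed inputs; the path and columns (transactions.csv, customer_id, txn_amount) are hypothetical.

```scala
// Minimal sketch: extract raw transactions, derive per-customer features, load to Parquet.
// Paths and column names are illustrative assumptions.
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

object FeatureEtl {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("feature-etl").getOrCreate()

    // Extract: raw transaction records.
    val raw = spark.read
      .option("header", "true")
      .option("inferSchema", "true")
      .csv("/data/raw/transactions.csv")

    // Transform: basic cleansing plus per-customer aggregate features.
    val features = raw
      .filter(col("txn_amount").isNotNull && col("txn_amount") > 0)
      .groupBy("customer_id")
      .agg(
        count(lit(1)).as("txn_count"),
        avg("txn_amount").as("avg_txn_amount"),
        max("txn_amount").as("max_txn_amount")
      )

    // Load: persist features for downstream ML consumers.
    features.write.mode("overwrite").parquet("/data/features/customer_txn_features")

    spark.stop()
  }
}
```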

Posted 1 week ago

Apply

6.0 - 9.0 years

25 - 32 Lacs

Bangalore/Bengaluru

Work from Office

Full time with top German MNC for location Bangalore - Experience in Scala/Java is a must

Job Description

As a Data Engineer on our team, you will work with large-scale manufacturing data coming from our globally distributed plants. You will focus on building efficient, scalable, data-driven applications. The data sets produced by these applications, whether data streams or data at rest, need to be highly available, reliable, consistent, and quality-assured so that they can serve as input to a wide range of other use cases and downstream applications. We run these applications on Azure Databricks; besides building applications, you will also contribute to scaling the platform, including topics such as automation and observability. Finally, you are expected to interact with customers and other technical teams, e.g. for requirements clarification and definition of data models.

Primary Responsibilities
- Be a key contributor to the organization's hybrid cloud data platform (on-prem and cloud).
- Design and build data pipelines on a global scale, ranging from small to huge datasets.
- Design applications and data models based on deep business understanding and customer requirements.
- Work directly with architects and technical leadership to design and implement applications and/or architectural components.
- Provide architectural proposals and estimations for the application, and technical leadership to the team.
- Coordinate and collaborate with central teams on tasks and standards.
- Develop data integration workflows in Azure.
- Develop streaming applications using Scala (a brief illustrative sketch follows this listing).
- Integrate the end-to-end Azure Databricks pipeline to take data from source systems to target systems, ensuring the quality and consistency of data.
- Define and implement data quality and validation checks.
- Configure data processing and transformation.
- Write unit test cases for data pipelines.
- Tune pipeline configurations for optimal performance.
- Participate in peer reviews and PR reviews of code written by team members.

Qualifications

Bachelor's degree in Computer Science, Computer Engineering, a relevant technical field, or equivalent; Master's degree preferred.

Additional Information

Skills
- Based on deep technical expertise, capable of working directly with architects and technical leadership.
- Able to guide junior team members in technical questions related to architecture or software and system design.
- Self-starter and empowered professional with strong execution and communication capabilities.
- Proactive mindset: identifies and starts work independently, challenges the status quo, accepts being challenged.
- Outstanding written and verbal communication skills.

Key Competencies
- 6+ years' experience in data engineering, ETL tools, and working with large data sets.
- Minimum 5 years of working experience with distributed clusters.
- At least 5 years of experience in Scala/Java software development.
- At least 2-3 years of Azure Databricks cloud experience in data engineering.
- Experience with Delta tables, ADLS, DBFS, and ADF.
- Deep understanding of distributed systems for data storage and processing (e.g. Kafka, Spark, Azure Cloud).
- Experience with cloud-based SQL databases: Azure SQL Editor.
- Excellent software engineering skills (i.e., data structures, algorithms, software design).
- Excellent problem-solving, investigative, and troubleshooting skills.
- Experience with CI/CD tools such as Jenkins and GitHub.
- Ability to work independently.

Soft Skills
- Good communication skills.
- Ability to coach and guide junior data engineers.
- Decent level of English as a business language.
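To make the streaming responsibility concrete, here is a minimal sketch of a Structured Streaming job in Scala, roughly in the shape this listing describes: Kafka in, a simple quality gate, Delta table out. The broker address, topic name, and storage paths are hypothetical assumptions.

```scala
// Minimal sketch: Kafka -> quality check -> Delta, as a Spark Structured Streaming job.
// Broker, topic, and paths are illustrative assumptions.
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

object PlantTelemetryStream {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("plant-telemetry-stream").getOrCreate()

    // Read raw events from Kafka as a streaming DataFrame.
    val events = spark.readStream
      .format("kafka")
      .option("kafka.bootstrap.servers", "broker-1:9092")
      .option("subscribe", "plant-telemetry")
      .load()
      .selectExpr(
        "CAST(key AS STRING) AS plant_id",
        "CAST(value AS STRING) AS payload",
        "timestamp"
      )

    // Simple data-quality gate: drop records with empty payloads.
    val validated = events.filter(col("payload").isNotNull && length(col("payload")) > 0)

    // Write continuously to a Delta table; the checkpoint makes the job restartable.
    val query = validated.writeStream
      .format("delta")
      .option("checkpointLocation", "/mnt/checkpoints/plant-telemetry")
      .outputMode("append")
      .start("/mnt/delta/plant_telemetry")

    query.awaitTermination()
  }
}
```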

Posted 1 week ago

Apply

3.0 - 8.0 years

7 - 10 Lacs

Kolkata

Work from Office

Job Title: Data Engineer
Department: Information Technology
Reporting To: AVP IT
Location: Kolkata

Position Summary

We are seeking a highly skilled and motivated Data Engineer to design and drive data-centric initiatives, contributing to strategic business decision-making. The ideal candidate will have a robust understanding of Big Data, Machine Learning, Cloud Computing, and modern data pipeline architectures. This role is critical in transforming raw data into meaningful insights and supporting a scalable analytics ecosystem.

Key Responsibilities
- Design, develop, and implement robust data pipelines for seamless data ingestion, transformation, and storage.
- Build and deploy Machine Learning models to deliver actionable insights and predictive analytics.
- Conduct Exploratory Data Analysis (EDA) on structured and unstructured data.
- Optimize and manage big data environments, ensuring performance, scalability, and reliability.
- Collaborate closely with cross-functional teams (engineering, product, business) to deliver data-driven solutions aligned with business goals.
- Uphold industry standards in data governance, security, and compliance.
- Tune performance and optimize SQL/NoSQL databases and cloud-based storage systems.
- Lead data visualization initiatives, transforming analytical outputs into digestible and actionable dashboards or reports.
- Collect, clean, and analyze large datasets from multiple sources.
- Develop and maintain dashboards, visualizations, and reports for business tracking.
- Extract, manipulate, and analyze data using SQL, Python, or R.
- Identify trends, anomalies, and insights that impact business performance (a brief illustrative sketch follows this listing).
- Conduct market research, customer behavior analysis, and performance tracking.
- Ensure data integrity and maintain compliance with data quality standards and policies.

Required Qualifications & Experience
- Bachelor's or Master's degree in Computer Science, Data Science, Statistics, or a related field.
- 5+ years of hands-on experience in Data Engineering or related domains.
- Proficiency in programming languages such as Python, SQL, Scala, or Java.
- Deep knowledge of Big Data tools: Apache Spark, Hadoop, and frameworks like Apache Airflow and Kafka.
- Experience with cloud platforms: AWS, Azure, or Google Cloud Platform (GCP).
- Practical experience with Machine Learning libraries/frameworks: TensorFlow, PyTorch, Scikit-learn.
- Skilled in data visualization tools: Tableau, Power BI, or similar platforms.
- Strong analytical mindset with excellent problem-solving and communication skills.
- Immediate joiners preferred.

Preferred Skills (Good to Have)
- Familiarity with DevOps practices and containerization tools such as Docker and Kubernetes.
- Exposure to real-time data streaming and analytics.
- Knowledge in advanced fields like Deep Learning, Natural Language Processing (NLP), or Computer Vision.

If interested, please call +91-6292121331 or mail your updated CV to sushmita.dr@vgmconsult.co.in for further discussion and interview.
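As an illustration of the trend and anomaly analysis this listing calls for, here is a minimal sketch in Scala/Spark using a window function. The dataset path and columns (daily_sales.parquet, region, sale_date, revenue) are hypothetical.

```scala
// Minimal sketch: 7-day moving average per region, with a naive anomaly flag.
// Dataset path and column names are illustrative assumptions.
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.expressions.Window
import org.apache.spark.sql.functions._

object SalesTrends {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("sales-trends").getOrCreate()

    val sales = spark.read.parquet("/data/daily_sales.parquet")

    // 7-day moving average of revenue per region (current row plus 6 preceding).
    val w = Window.partitionBy("region").orderBy("sale_date").rowsBetween(-6, 0)
    val withTrend = sales.withColumn("revenue_ma7", avg(col("revenue")).over(w))

    // Flag days deviating more than 50% from the moving average as anomalies.
    val flagged = withTrend.withColumn(
      "is_anomaly",
      abs(col("revenue") - col("revenue_ma7")) > col("revenue_ma7") * 0.5
    )

    flagged.filter(col("is_anomaly")).show(20, truncate = false)
    spark.stop()
  }
}
```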

Posted 1 week ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies