7.0 - 9.0 years
8 - 14 Lacs
Mysuru
Work from Office
Job Summary: We are looking for a seasoned Tech Anchor with deep expertise in Big Data technologies and Python to lead technical design, development, and mentoring across data-driven projects. This role demands a strong grasp of scalable data architecture, problem-solving capabilities, and hands-on experience with distributed systems and modern data frameworks. Key Responsibilities: Provide technical leadership across Big Data and Python-based projects Architect, design, and implement scalable data pipelines and processing systems Guide teams on best practices in data modeling, ETL/ELT development, and performance optimization Collaborate with data scientists, analysts, and stakeholders to ensure effective data solutions Conduct code reviews and mentor junior engineers to improve code quality and skills Evaluate and implement new tools and frameworks to enhance data capabilities Troubleshoot complex data-related issues and support production deployments Ensure compliance with data security and governance standards
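For a concrete sense of the pipeline work this role describes, here is a minimal PySpark ETL sketch. The paths, column names, and aggregation are illustrative assumptions, not details taken from the posting.

```python
from pyspark.sql import SparkSession, functions as F

# Minimal ETL sketch: read raw events, clean them, aggregate, write partitioned output.
# All paths and column names (event_id, event_ts, amount, country) are illustrative assumptions.
spark = SparkSession.builder.appName("daily-revenue-etl").getOrCreate()

raw = spark.read.parquet("s3a://example-bucket/raw/events/")            # extract
clean = (raw.dropDuplicates(["event_id"])                                # transform
            .filter(F.col("amount").isNotNull())
            .withColumn("event_date", F.to_date("event_ts")))
daily = clean.groupBy("event_date", "country").agg(F.sum("amount").alias("revenue"))
(daily.write.mode("overwrite")                                           # load
      .partitionBy("event_date")
      .parquet("s3a://example-bucket/curated/daily_revenue/"))
```

A production pipeline of the kind the posting describes would add schema enforcement, data-quality checks, and performance tuning on top of a skeleton like this.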
Posted 1 week ago
7.0 - 9.0 years
8 - 14 Lacs
Kochi
Work from Office
Job Summary: We are looking for a seasoned Tech Anchor with deep expertise in Big Data technologies and Python to lead technical design, development, and mentoring across data-driven projects. This role demands a strong grasp of scalable data architecture, problem-solving capabilities, and hands-on experience with distributed systems and modern data frameworks. Key Responsibilities: Provide technical leadership across Big Data and Python-based projects Architect, design, and implement scalable data pipelines and processing systems Guide teams on best practices in data modeling, ETL/ELT development, and performance optimization Collaborate with data scientists, analysts, and stakeholders to ensure effective data solutions Conduct code reviews and mentor junior engineers to improve code quality and skills Evaluate and implement new tools and frameworks to enhance data capabilities Troubleshoot complex data-related issues and support production deployments Ensure compliance with data security and governance standards
Posted 1 week ago
7.0 - 8.0 years
32 - 45 Lacs
Pune
Work from Office
We are looking to add an experienced and enthusiastic Lead Data Scientist to our Jet2 Data Science team in India. Reporting to the Data Science Delivery Manager, the Lead Data Scientist is a key appointment to the Data Science Team, with responsibility for executing the data science strategy and realising the benefits we can bring to the business by combining insights gained from multiple large data sources with the contextual understanding and experience of our colleagues across the business. In this exciting role, you will be joining an established team of 40+ Data Science professionals, based across our UK and India bases, who are using data science to understand, automate and optimise key manual business processes, inform our marketing strategy, assess product development and revenue opportunities, and optimise operational costs. As Lead Data Scientist, you will have strong experience in leading data science projects and creating machine learning models, and be able to confidently communicate with and enthuse key business stakeholders. Roles and Responsibilities A typical day in your role at Jet2TT: You will lead a team of data scientists and be responsible for delivering and managing day-to-day activities. The successful candidate will be highly numerate with a statistical background, experienced in using R, Python or a similar statistical analysis package. You will be expected to work with internal teams across the business to identify and collaborate with stakeholders across the wider group. Leading and coaching a group of Data Scientists, you will plan and execute the use of machine learning and statistical modelling tools suited to the identified initiative delivery or discovery problem. You will have a strong ability to analyse the created algorithms and models to understand how changes in metrics in one area of the business could impact other areas, and be able to communicate those analyses to key business stakeholders. You will identify efficiencies in the use of data across its lifecycle, reducing data redundancy, structuring data to ensure efficient use of time, and ensuring retained data/information provides value to the organisation and remains in line with legitimate business and/or regulatory requirements. Your ability to rise above group think and see beyond the here and now is matched only by your intellectual curiosity. Strong SQL skills and the ability to create clear data visualisations in tools such as Tableau or Power BI will be essential. You will also have experience in developing and deploying predictive models using machine learning frameworks and will have worked with big data technologies. As we aim to realise the benefits of cloud technologies, some familiarity with cloud platforms like AWS for data science and storage would be desirable. You will be skilled in gathering data from multiple sources and in multiple formats, with knowledge of data warehouse design, logical and physical database design, and the challenges posed by data quality.
Qualifications, Skills and Experience (Candidate Requirements): Experience in leading a small to mid-size data science team Minimum 7 years of experience in the industry & 4+ years of experience in data science Experience in building & deploying machine learning algorithms & detailed knowledge of applied statistics Good understanding of various data architectures: RDBMS, data warehouse & Big Data Experience of working with regions such as US, UK, Europe or Australia is a plus Liaise with Data Engineers, Technology Leaders & Business Stakeholders Working knowledge of the Agile framework is good to have Demonstrates willingness to learn Mentoring and coaching team members Strong delivery performance, working on complex solutions in a fast-paced environment
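As an illustration of the predictive-modelling work described above, a minimal scikit-learn sketch is shown below. The data is synthetic and the model choice is an assumption; it is not Jet2's actual workflow.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

# Synthetic data standing in for a business dataset (illustrative only).
rng = np.random.default_rng(42)
X = rng.normal(size=(1_000, 5))
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=1_000) > 0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

# A simple, deployable pipeline: scaling plus a baseline classifier.
model = Pipeline([("scale", StandardScaler()), ("clf", LogisticRegression())])
model.fit(X_train, y_train)
print("test AUC:", roc_auc_score(y_test, model.predict_proba(X_test)[:, 1]))
```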
Posted 1 week ago
3.0 - 6.0 years
8 - 13 Lacs
Bengaluru
Work from Office
KPMG India is looking for an Azure Data Engineer - Assistant Manager to join our dynamic team and embark on a rewarding career journey. Liaising with coworkers and clients to elucidate the requirements for each task. Conceptualizing and generating infrastructure that allows big data to be accessed and analyzed. Reformulating existing frameworks to optimize their functioning. Testing such structures to ensure that they are fit for use. Preparing raw data for manipulation by data scientists. Detecting and correcting errors in your work. Ensuring that your work remains backed up and readily accessible to relevant coworkers. Remaining up-to-date with industry standards and technological advancements that will improve the quality of your outputs.
Posted 1 week ago
3.0 - 4.0 years
17 - 18 Lacs
Bengaluru
Work from Office
KPMG India is looking for an Azure Data Engineer - Consultant to join our dynamic team and embark on a rewarding career journey. Assure that data is cleansed, mapped, transformed, and otherwise optimised for storage and use according to business and technical requirements. Solution design using Microsoft Azure services and other tools. The ability to automate tasks and deploy production-standard code (with unit testing, continuous integration, versioning etc.). Load transformed data into storage and reporting structures in destinations including data warehouses, high-speed indexes, real-time reporting systems and analytics applications. Build data pipelines to collectively bring together data. Other responsibilities include extracting data, troubleshooting and maintaining the data warehouse.
Posted 1 week ago
19.0 - 24.0 years
11 - 15 Lacs
Noida
Work from Office
We are looking for a highly skilled and experienced AI Leader to join our team at Opkey, with 19 years of experience in the field. The ideal candidate will have a strong background in artificial intelligence and machine learning. Roles and Responsibility Develop and implement AI and ML models to solve complex problems. Lead a team of data scientists and engineers to design and deploy AI solutions. Collaborate with cross-functional teams to identify business needs and develop AI solutions. Stay updated with industry trends and advancements in AI and ML. Design and implement data pipelines and architectures to support AI model development. Work closely with stakeholders to understand requirements and deliver high-quality results. Job Requirements Strong knowledge of AI and ML concepts, including machine learning algorithms and deep learning techniques. Experience with programming languages such as Python, Java, or C++. Strong understanding of software development principles and methodologies. Excellent leadership and communication skills. Ability to work in a fast-paced environment and adapt to changing priorities. Strong problem-solving skills and attention to detail.
Posted 1 week ago
4.0 - 8.0 years
6 - 11 Lacs
Pune
Work from Office
Key Responsibilities: Design, develop, and maintain advanced dashboards and reports in Tableau that provide insights into user engagement, product performance, and business metrics. Utilize Databricks to process, clean, and analyze large volumes of data efficiently. Develop complex SQL queries and optimize data pipelines for analytics workflows. Collaborate with cross-functional teams to define key metrics, data requirements, and analytical approaches. Conduct deep-dive analyses and present findings with clear data storytelling to influence product and business strategies. Automate routine reporting and data validation tasks to ensure accuracy and timeliness. Mentor junior analysts and promote best practices in data visualization and analytics. Stay current with latest trends and tools in data analytics, visualization, and big data platforms.
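To illustrate the kind of metric query that might feed such a Tableau dashboard, here is a hedged Spark SQL sketch (Databricks runs Spark SQL). The events table and its columns are assumptions, not from the posting.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("engagement-metrics").getOrCreate()

# Illustrative query: weekly active users and average events per user.
# Table and column names (events, user_id, event_ts) are assumptions, not from the posting.
weekly_engagement = spark.sql("""
    SELECT date_trunc('week', event_ts)        AS week_start,
           COUNT(DISTINCT user_id)             AS weekly_active_users,
           COUNT(*) / COUNT(DISTINCT user_id)  AS avg_events_per_user
    FROM events
    GROUP BY date_trunc('week', event_ts)
    ORDER BY week_start
""")
weekly_engagement.show()
```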
Posted 1 week ago
3.0 - 8.0 years
20 - 25 Lacs
Bengaluru
Work from Office
The Amazon Currency Converter team is responsible for developing the platform and applications used to introduce new and innovative payment methods to customers. We help Amazon expand globally by providing a platform for FX (Foreign Exchange) and enabling payments in multiple currencies. The technology we build and operate varies widely, ranging from large-scale distributed engineering incorporating the latest from Machine Learning in the Big Data space to customer- and mobile-friendly user experiences. We are an agile team, moving quickly in collaboration with our business to bring new features to millions of Amazon customers while having fun and filing new inventions along the way. If you can think big and want to join a fast-moving team breaking new ground at Amazon, we would like to speak with you! Collaborate with experienced cross-disciplinary Amazonians to develop, design, and bring to market innovative devices and services Design and build innovative technologies in a large distributed computing environment and help lead fundamental changes in the industry Create solutions to run predictions on distributed systems with exposure to technologies at incredible scale and speed Build distributed storage, index, and query systems that are scalable, fault-tolerant, low cost, and easy to manage/use 3+ years of non-internship professional software development experience 2+ years of non-internship design or architecture (design patterns, reliability and scaling) experience with new and existing systems Experience programming with at least one software programming language 3+ years of full software development life cycle experience, including coding standards, code reviews, source control management, build processes, testing, and operations Bachelor's degree in computer science or equivalent
Posted 1 week ago
4.0 - 8.0 years
7 - 12 Lacs
Noida
Work from Office
Technical Expertise: Solid Python programming (OOPS, REST, API), SQL & Linux experience, handling ODBC/JDBC/Arrow Flight connections. Has delivered projects handling large volumes of data, has implemented MPP tools, and has strong knowledge of storage solutions (NAS, S3, HDFS) and data formats (Parquet, Avro, Iceberg). Strong knowledge of Kubernetes, containerization, Helm charts, templates and overlays, Vault, SSL/TLS, KB stores. Prior knowledge of key libraries e.g. S3, MongoDB, Elastic, Trident NAS, Grafana etc. will be a big plus. Has used Git and built CI/CD pipelines in recent projects. Additional Criteria: Proactive in asking questions, engages with the team in group conversations, actively participates in issue discussions, shares inputs/ideas/suggestions. The candidate needs to be responsive and actively pick up new tasks/issues without much mentoring/monitoring. No remotely located candidates. Mandatory Competencies DevOps/Configuration Mgmt - DevOps/Configuration Mgmt - Docker Beh - Communication Programming Language - Python - Django Big Data - Big Data - Flask DevOps/Configuration Mgmt - DevOps/Configuration Mgmt - Containerization (Docker, Kubernetes) Data Science and Machine Learning - Data Science and Machine Learning - Python Middleware - API Middleware - API (SOAP, REST) Cloud - AWS - AWS S3, S3 glacier, AWS EBS Cloud - AWS - Amazon Elastic Container Registry (ECR), AWS Elastic Kubernetes Service (EKS)
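As a small illustration of the S3 and Parquet handling this role lists, here is a minimal Python sketch using boto3 and pandas. Bucket, key, and file names are placeholders; credentials are assumed to come from the environment.

```python
import boto3
import pandas as pd

# Illustrative sketch: pull a Parquet object from S3 and load it with pandas.
# Bucket, key, and local path are placeholders; credentials come from the environment.
s3 = boto3.client("s3")
s3.download_file("example-bucket", "data/part-0000.parquet", "/tmp/part-0000.parquet")

df = pd.read_parquet("/tmp/part-0000.parquet")   # requires pyarrow or fastparquet
print(df.dtypes)
print(df.head())
```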
Posted 1 week ago
5.0 - 9.0 years
12 - 17 Lacs
Noida
Work from Office
Spark/PySpark: hands-on technical experience in data processing. Table design knowledge using Hive, similar to RDBMS knowledge. SQL knowledge for data retrieval and transformation queries such as joins (full, left, right), ranking, and group by. Good communication skills. Additional skills: GitHub, Jenkins, and shell scripting would be an added advantage. Mandatory Competencies Big Data - Big Data - Pyspark Big Data - Big Data - SPARK Big Data - Big Data - Hadoop Big Data - Big Data - HIVE DevOps/Configuration Mgmt - DevOps/Configuration Mgmt - Jenkins Beh - Communication and collaboration Database - Database Programming - SQL DevOps/Configuration Mgmt - DevOps/Configuration Mgmt - GitLab, Github, Bitbucket DevOps/Configuration Mgmt - DevOps/Configuration Mgmt - Basic Bash/Shell script writing
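For reference, a brief PySpark sketch of the transformation patterns named above (a left join, ranking, and group by) against Hive tables. The database, table, and column names are assumptions.

```python
from pyspark.sql import SparkSession, functions as F, Window

# Illustrative transformations: a left join, a ranking window, and a group-by aggregation.
# Hive database, table, and column names are assumptions, not from the posting.
spark = SparkSession.builder.appName("hive-transforms").enableHiveSupport().getOrCreate()

orders = spark.table("sales.orders")
customers = spark.table("sales.customers")

joined = orders.join(customers, on="customer_id", how="left")

w = Window.partitionBy("customer_id").orderBy(F.col("order_amount").desc())
ranked = joined.withColumn("order_rank", F.rank().over(w))
ranked.show(5)

per_customer = joined.groupBy("customer_id").agg(F.sum("order_amount").alias("total_spend"))
per_customer.show(5)
```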
Posted 1 week ago
2.0 - 5.0 years
9 - 14 Lacs
Bengaluru
Work from Office
Analyze large datasets to identify trends and shape strategic decisions. Develop statistical models to solve complex problems with user-centric solutions. Create impactful dashboards and reports to track key performance indicators (KPIs). Collaborate with cross-functional teams to align analytical support with product innovation. Perform exploratory data analysis (EDA) to uncover insights for decision-making. Present findings using compelling storytelling to simplify complex data. Ensure data integrity through validation and cleaning processes. Design and optimize ETL workflows for seamless data processing. Utilize PySpark for big data analysis and leverage cloud platforms (AWS, GCP, Azure) for storage and computation.
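A minimal pandas sketch of the exploratory data analysis and KPI reporting described above; the file name and columns are illustrative assumptions.

```python
import pandas as pd

# Minimal exploratory-data-analysis sketch; the CSV path and columns are placeholders.
df = pd.read_csv("user_activity.csv", parse_dates=["signup_date"])

print(df.shape)                      # size of the dataset
print(df.isna().mean().round(3))     # share of missing values per column
print(df.describe(include="all"))    # summary statistics

# Example KPI: monthly signups, a typical input for a dashboard.
monthly_signups = df.set_index("signup_date").resample("MS").size()
print(monthly_signups.tail())
```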
Posted 1 week ago
4.0 - 8.0 years
10 - 14 Lacs
Gurugram
Work from Office
The ideal candidate will have a strong background in data engineering and excellent problem-solving skills. Roles and Responsibility Design, develop, and implement large-scale data pipelines and architectures. Collaborate with cross-functional teams to identify and prioritize project requirements. Develop and maintain complex data systems and databases. Ensure data quality, integrity, and security. Optimize data processing workflows for improved performance and efficiency. Troubleshoot and resolve technical issues related to data engineering. Job Requirements Strong knowledge of data engineering principles and practices. Experience with data modeling, database design, and data warehousing. Proficiency in programming languages such as Python, Java, or C++. Excellent problem-solving skills and attention to detail. Ability to work collaboratively in a team environment. Strong communication and interpersonal skills.
Posted 1 week ago
5.0 - 10.0 years
11 - 15 Lacs
Chennai
Work from Office
Project description You'll be working in the GM Business Analytics team located in Pune. The successful candidate will be a member of the global Distribution team, which has team members in London and Pune. We work as part of a global team providing analytical solutions for IB distribution/sales people. Solutions deployed should be extensible globally with minimal localization. Responsibilities Are you passionate about data and analytics? Are you keen to be part of the journey to modernize a data warehouse/analytics suite of applications? Do you take pride in the quality of software delivered for each development iteration? We're looking for someone like that to join us and be a part of a high-performing team on a high-profile project. solve challenging problems in an elegant way master state-of-the-art technologies build a highly responsive and fast-updating application in an Agile & Lean environment apply best development practices and effectively utilize technologies work across the full delivery cycle to ensure high-quality delivery write high-quality code and adhere to coding standards work collaboratively with diverse team(s) of technologists You are Curious and collaborative, comfortable working independently, as well as in a team Focused on delivery to the business Strong in analytical skills. For example, the candidate must understand the key dependencies among existing systems in terms of the flow of data among them. It is essential that the candidate learns to understand the 'big picture' of how the IB industry/business functions. Able to quickly absorb new terminology and business requirements Already strong in analytical tools, technologies, platforms, etc. The candidate must also demonstrate a strong desire for learning and self-improvement. Open to learning home-grown technologies, supporting current-state infrastructure and helping drive future-state migrations. Imaginative and creative with newer technologies Able to accurately and pragmatically estimate the development effort required for specific objectives You will have the opportunity to work under minimal supervision to understand local and global system requirements, and design and implement the required functionality/bug fixes/enhancements. You will be responsible for components that are developed across the whole team and deployed globally. You will also have the opportunity to provide third-line support to the application's global user community, which will include assisting dedicated support staff and liaising with the members of other development teams directly, some of which will be local and some remote. Skills Must have A bachelor's or master's degree, preferably in Information Technology or a related field (computer science, mathematics, etc.), focusing on data engineering. 5+ years of relevant experience as a data engineer in Big Data is required. Strong knowledge of programming languages (Python / Scala) and Big Data technologies (Spark, Databricks or equivalent) is required. Strong experience in executing complex data analysis and running complex SQL/Spark queries. Strong experience in building complex data transformations in SQL/Spark. Strong knowledge of database technologies is required. Strong knowledge of Azure Cloud is advantageous. Good understanding of and experience with Agile methodologies and delivery. Strong communication skills with the ability to build partnerships with stakeholders. Strong analytical, data management and problem-solving skills.
Nice to have Experience working on the QlikView tool Understanding of QlikView scripting and data model
Posted 1 week ago
14.0 - 16.0 years
25 - 30 Lacs
Pune
Work from Office
Role Description: Advanced AI and GenAI for Problem Discovery, RCA and Action Recommendation: Lead the data scientists, hands-on, in developing and deploying advanced machine learning models, including anomaly detection, predictive analytics, causal inference and action recommendation models, to autonomously discover problems, identify root causes and recommend actions in real-time and batch processing scenarios. Gen-AI-Powered Analysis and Insights Generation: Lead the data scientists, hands-on, in creating and fine-tuning Generative AI models to assist in analysis creation for problem discovery, translating complex data patterns and root cause findings into actionable, natural language insights for business stakeholders. Implement prompt engineering and fine-tuning techniques to enhance the relevance and accuracy of insights. Reference Architectures: Work with Technical Architects to evolve frameworks for augmenting and evolving batch processing data architectures with streaming architectures to support connected data and ML pipeline executions in real time. Client Deployments of the Platform: Consult and provide guidance for creating automated data pipelines for raw data and engineered features, ensuring data quality, integrity, and accessibility for model training and inference. Development of Use Cases: Lead and support development of use cases on the platform for various vertical-specific problem statements. Leadership and Collaboration: Lead and mentor a team of data scientists and machine learning engineers. Pre-sales: Support pre-sales, marketing, hyper-scaler partnerships and other sales activities such as RFPs by providing subject matter expertise during scoping and designing of the engagements. Technical Skills: In-depth conceptual understanding of Statistics, Classical Machine Learning, Deep Learning and GenAI Able to understand the nuances of business problems in various domains, quickly grasp the problem discovery analysis and modelling imperatives and translate those into requirements for AI models Significant exposure to various proprietary & open-source cloud-based Machine Learning platforms such as Amazon SageMaker / Azure Machine Learning Studio / Google Datalab ML Engine AutoML / H2O etc. Experience in leveraging open-source LLMs, prompt engineering and RAG-based SLM model development Hands-on with excellent ML programming skills in Python using open-source libraries Qualifications Experience in working with RDBMS and NoSQL databases; exposure to Big Data technologies 14-16 years of Machine Learning model development and application experience Experience in building cloud-based products that implement ML modelling as a service would be ideal
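To make the anomaly-detection piece of this role concrete, here is a minimal scikit-learn sketch on a synthetic KPI stream. It is an illustration only; the platform's actual models and features are not described in the posting.

```python
import numpy as np
from sklearn.ensemble import IsolationForest

# Minimal anomaly-detection sketch for the "problem discovery" theme above.
# The metric stream is synthetic; a real deployment would read engineered features
# from the platform's data pipelines.
rng = np.random.default_rng(0)
normal_traffic = rng.normal(loc=100.0, scale=5.0, size=(500, 1))   # healthy KPI values
spikes = rng.normal(loc=160.0, scale=5.0, size=(5, 1))             # injected anomalies
series = np.vstack([normal_traffic, spikes])

detector = IsolationForest(contamination=0.01, random_state=0).fit(series)
flags = detector.predict(series)            # -1 = anomaly, 1 = normal
print("anomalous points:", np.where(flags == -1)[0])
```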
Posted 1 week ago
4.0 - 8.0 years
8 - 12 Lacs
Pune
Work from Office
Piller Soft Technology is looking for a Lead Data Engineer to join our dynamic team and embark on a rewarding career journey. Designing and developing data pipelines: Lead data engineers are responsible for designing and developing data pipelines that move data from various sources to storage and processing systems. Building and maintaining data infrastructure: Lead data engineers are responsible for building and maintaining data infrastructure, such as data warehouses, data lakes, and data marts. Ensuring data quality and integrity: Lead data engineers are responsible for ensuring data quality and integrity, by setting up data validation processes and implementing data quality checks. Managing data storage and retrieval: Lead data engineers are responsible for managing data storage and retrieval, by designing and implementing data storage systems, such as NoSQL databases or Hadoop clusters. Developing and maintaining data models: Lead data engineers are responsible for developing and maintaining data models, such as data dictionaries and entity-relationship diagrams, to ensure consistency in data architecture. Managing data security and privacy: Lead data engineers are responsible for managing data security and privacy, by implementing security measures, such as access controls and encryption, to protect sensitive data. Leading and managing a team: Lead data engineers may be responsible for leading and managing a team of data engineers, providing guidance and support for their work.
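As an illustration of the data-quality checks mentioned above, a small pandas-based sketch follows. Column names and rules are assumptions, not part of the role description.

```python
import pandas as pd

# Illustrative data-quality checks of the kind described above.
# Column names and thresholds are assumptions, not from the posting.
def run_quality_checks(df: pd.DataFrame) -> dict:
    """Return a mapping of check name -> pass/fail for a batch of records."""
    return {
        "no_duplicate_ids": df["record_id"].is_unique,
        "no_null_amounts": df["amount"].notna().all(),
        "amounts_non_negative": (df["amount"] >= 0).all(),
        "dates_parseable": pd.to_datetime(df["event_date"], errors="coerce").notna().all(),
    }

batch = pd.DataFrame({
    "record_id": [1, 2, 3],
    "amount": [10.0, 0.0, 25.5],
    "event_date": ["2024-01-01", "2024-01-02", "2024-01-03"],
})
print(run_quality_checks(batch))
```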
Posted 1 week ago
4.0 - 7.0 years
12 - 16 Lacs
Bengaluru
Work from Office
A Software Engineer delivers features for a Product in a business chain. As a member of a Feature Team, they work with autonomy and in a continuous improvement approach. Generic Skills: Requirement analysis; should understand user stories. Development: Develop and write unit tests under the supervision of a Senior Developer. Follow coding standards, follow clean coding and craftsmanship practices, and know how to adopt tools in projects. Full Stack Development Skills: the developer understands at least one backend and one front-end technology to work on, with development defined by leads. Core skills: -Proficiency in core Java, knowledge of libraries and frameworks (Spring, Hibernate, Spring Boot), -Core speciality in front-end development (React, Angular, etc.), familiarity with backend architecture, databases, microservices and UI design patterns -Strong experience in designing and building scalable backend systems and reliable APIs, -Working knowledge of performing software upgrades - JDK, Tomcat -Database knowledge (SQL, big data etc.), -Understanding of cloud platforms (Azure, SG tools), -Knowledge of containerization and container platforms like Docker and Kubernetes, CI/CD/DevOps experience, -Perform IUT and E2E testing and be able to support Production -Conversant in Agile methodologies -Should come up with ideas and proposals on tech solutions - Test automation knowledge is desirable Soft skills: - Good problem-solving skills, - Proficient in communication and accountability for tasks
Posted 1 week ago
10.0 - 15.0 years
14 - 19 Lacs
Bengaluru
Work from Office
Should have a good level of expertise in development on the Java stack (J2EE, Core Java, Spring, Spring Boot, Hibernate) Should have a good level of expertise in Big Data technologies Should have a good level of expertise in working on REST services Strong knowledge of OOA (Object Oriented Analysis), OOD (Object Oriented Design) and J2EE Design Patterns Strong experience in Agile methodologies Expertise with Oracle and exposure to PostgreSQL Profile required 10+ Years of Experience in Design Principles & Patterns, Development, Review Must have good experience in the Java stack (J2EE, Core Java, Spring, Spring Boot, Hibernate) Must have experience in working with Big Data technologies Must have good experience working on REST services Strong knowledge of OOA (Object Oriented Analysis), OOD (Object Oriented Design) and J2EE Design Patterns Strong experience in Agile methodologies Expertise with Oracle and exposure to PostgreSQL Experience in test-driven development (unit testing, mocking, BDD)
Posted 1 week ago
10.0 - 15.0 years
14 - 19 Lacs
Bengaluru
Work from Office
Should have a good level of expertise in development on the Java stack (J2EE, Core Java, Spring, Spring Boot, Hibernate) Should have a good level of expertise in working with UNIX and shell scripting Should have a good level of expertise in working on REST services Strong experience of working on Big Data technologies Strong knowledge of OOA (Object Oriented Analysis), OOD (Object Oriented Design) and J2EE Design Patterns Strong experience in Agile methodologies Expertise with Oracle and exposure to PostgreSQL Profile required 10+ Years of Experience in Design Principles & Patterns, Development, Review Must have good experience in the Java stack (J2EE, Core Java, Spring, Spring Boot, Hibernate) Must have experience in Big Data technologies Must have good experience working on REST services Strong knowledge of OOA (Object Oriented Analysis), OOD (Object Oriented Design) and J2EE Design Patterns Strong experience in Agile methodologies Expertise with Oracle and exposure to PostgreSQL Experience in test-driven development (unit testing, mocking, BDD)
Posted 1 week ago
4.0 - 9.0 years
9 - 14 Lacs
Bengaluru
Work from Office
Minimum of 4+ years of software development experience with demonstrated expertise in standard development best-practice methodologies SKILLS REQUIRED: Spark, Scala, Python, HDFS, Hive, Scheduler (Oozie, Airflow), Kafka Spark/Scala SQL RDBMS DOCKER KUBERNETES RABBITMQ/KAFKA MONITORING TOOLS - SPLUNK OR ELK Profile required Integrate test frameworks in the development process Refactor existing solutions to make them reusable and scalable - Work with operations to get the solutions deployed Take ownership of production deployment of code Collaborate with and/or lead cross-functional teams, build and launch applications and data platforms at scale, either for revenue-generating or operational purposes *Come up with coding and design best practices *Thrive in a self-motivated, internal-innovation driven environment Adapt fast to new application knowledge and changes
Posted 1 week ago
4.0 - 6.0 years
13 - 18 Lacs
Bengaluru
Work from Office
Design, development and testing of components/modules in TOP (Trade Open Platform) involving Spark, Java, Hive and related big-data technologies in a datalake architecture Contribute to the design, development and deployment of new features and new components in Azure public cloud Contribute to the evolution of REST APIs in TOP: enhancement, development and testing of new APIs Ensure the processes in TOP provide optimal performance and assist in performance tuning and optimization Release Deployment: deploy using CI/CD practices and tools in various environments (development, UAT and production) and follow production processes. Ensure Craftsmanship practices are followed Follow the Agile at Scale process in terms of participation in PI Planning and follow-up, Sprint planning, and backlog maintenance in Jira. Organize training sessions on the core platform and related technologies for the Tribe / Business line to ensure the platform evolution is continuously communicated to relevant stakeholders Profile required Around 4-6 years of experience in the IT industry, preferably in the banking domain Expertise and experience in Java (Java 1.8: building APIs, Java threads, collections, streaming, dependency injection/inversion), JUnit, Big Data (Spark, Oozie, Hive) and Azure (AKS, CLI, Event, Key Vault), and should have been part of digital transformation initiatives with knowledge of Unix, SQL/RDBMS and monitoring Development experience in REST APIs Experience in managing tools: Git/Bitbucket, Jenkins, NPM, Docker/Kubernetes, Jira, Sonar Knowledge of Agile practices and Agile at Scale Good communication / collaboration skills
Posted 1 week ago
4.0 - 7.0 years
13 - 18 Lacs
Bengaluru
Work from Office
Design, development and testing of components/modules in TOP (Trade Open Platform) involving Spark, Java, Hive and related big-data technologies in a datalake architecture Contribute to the design, development and deployment of new features and new components in Azure public cloud Contribute to the evolution of REST APIs in TOP: enhancement, development and testing of new APIs Ensure the processes in TOP provide optimal performance and assist in performance tuning and optimization Release Deployment: deploy using CI/CD practices and tools in various environments (development, UAT and production) and follow production processes. Ensure Craftsmanship practices are followed Follow the Agile at Scale process in terms of participation in PI Planning and follow-up, Sprint planning, and backlog maintenance in Jira. Organize training sessions on the core platform and related technologies for the Tribe / Business line to ensure the platform evolution is continuously communicated to relevant stakeholders
Posted 1 week ago
4.0 - 6.0 years
13 - 18 Lacs
Bengaluru
Work from Office
Design, development and testing of components/modules in TOP (Trade Open Platform) involving Spark, Java, Hive and related big-data technologies in a datalake architecture Contribute to the design, development and deployment of new features and new components in Azure public cloud Contribute to the evolution of REST APIs in TOP: enhancement, development and testing of new APIs Ensure the processes in TOP provide optimal performance and assist in performance tuning and optimization Release Deployment: deploy using CI/CD practices and tools in various environments (development, UAT and production) and follow production processes. Ensure Craftsmanship practices are followed Follow the Agile at Scale process in terms of participation in PI Planning and follow-up, Sprint planning, and backlog maintenance in Jira. Organize training sessions on the core platform and related technologies for the Tribe / Business line to ensure the platform evolution is continuously communicated to relevant stakeholders Around 4-6 years of experience in the IT industry, preferably in the banking domain Expertise and experience in Java (Java 1.8: building APIs, Java threads, collections, streaming, dependency injection/inversion), JUnit, Big Data (Spark, Oozie, Hive) and Azure (AKS, CLI, Event, Key Vault), and should have been part of digital transformation initiatives with knowledge of Unix, SQL/RDBMS and monitoring Development experience in REST APIs Experience in managing tools – Git/Bitbucket, Jenkins, NPM, Docker/Kubernetes, Jira, Sonar Knowledge of Agile practices and Agile@Scale Good communication / collaboration skills
Posted 1 week ago
4.0 - 6.0 years
13 - 18 Lacs
Bengaluru
Work from Office
Design, development and testing of components/modules in TOP (Trade Open Platform) involving Spark, Java, Hive and related big-data technologies in a datalake architecture Contribute to the design, development and deployment of new features and new components in Azure public cloud Contribute to the evolution of REST APIs in TOP: enhancement, development and testing of new APIs Ensure the processes in TOP provide optimal performance and assist in performance tuning and optimization Release Deployment: deploy using CI/CD practices and tools in various environments (development, UAT and production) and follow production processes. Ensure Craftsmanship practices are followed Follow the Agile at Scale process in terms of participation in PI Planning and follow-up, Sprint planning, and backlog maintenance in Jira. Organize training sessions on the core platform and related technologies for the Tribe / Business line to ensure the platform evolution is continuously communicated to relevant stakeholders Profile required Around 4-6 years of experience in the IT industry, preferably in the banking domain Expertise and experience in Java (Java 1.8: building APIs, Java threads, collections, streaming, dependency injection/inversion), JUnit, Big Data (Spark, Oozie, Hive) and Azure (AKS, CLI, Event, Key Vault), and should have been part of digital transformation initiatives with knowledge of Unix, SQL/RDBMS and monitoring Development experience in REST APIs Experience in managing tools: Git/Bitbucket, Jenkins, NPM, Docker/Kubernetes, Jira, Sonar Knowledge of Agile practices and Agile at Scale Good communication / collaboration skills
Posted 1 week ago
5.0 - 10.0 years
1 - 1 Lacs
Chennai, Bengaluru
Work from Office
Mid: 4-6 Yrs - 11 L; Senior: 8+ Yrs - 17 L Key Responsibilities: * Design, develop, and maintain robust data pipelines using big data technologies. * Optimize data processing and storage frameworks to ensure high performance and scalability. * Work with large-scale datasets using tools such as Hive, Spark, and HDFS. * Develop ETL processes to ingest and transform data from diverse sources. * Ensure data quality and integrity across all stages of the data pipeline. * Collaborate with cross-functional teams to understand data needs and deliver solutions. * Implement best practices for data engineering, security, and compliance. Required Skills: * Strong experience in Big Data ecosystems (Hadoop, HDFS, Hive, etc.) * Proficient in SQL for data querying and transformation. * Hands-on experience with Apache Hive for managing structured data in Hadoop. * Strong working knowledge of PySpark for distributed data processing. * Familiarity with data modeling, data warehousing concepts, and ETL processes. * Proficient in scripting languages (e.g., Python)
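For context, a minimal PySpark sketch of the Hive/HDFS ingestion work this role involves; paths, database, and column names are illustrative assumptions.

```python
from pyspark.sql import SparkSession, functions as F

# Illustrative ingest step: load raw CSV from HDFS, standardise columns, and save
# as a partitioned Hive table. Paths, database, and column names are assumptions.
spark = SparkSession.builder.appName("ingest-transactions").enableHiveSupport().getOrCreate()

raw = (spark.read
       .option("header", "true")
       .option("inferSchema", "true")
       .csv("hdfs:///data/raw/transactions/"))

cleaned = (raw.withColumn("txn_date", F.to_date("txn_ts"))
              .filter(F.col("txn_amount") > 0))

(cleaned.write
        .mode("overwrite")
        .partitionBy("txn_date")
        .saveAsTable("analytics.transactions_clean"))
```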
Posted 1 week ago
5.0 - 9.0 years
5 - 9 Lacs
Noida
Work from Office
5-9 years in Data Engineering and software development such as ELT/ETL, data extraction and manipulation in Data Lake/Data Warehouse environments Expert-level hands-on experience with the following: Python, SQL, PySpark DBT and Apache Airflow Postgres/other RDBMS DevOps, Jenkins, CI/CD Data Governance and Data Quality frameworks Data Lakes, Data Warehouses AWS services including S3, SNS, SQS, Lambda, EMR, Glue, Athena, EC2, VPC etc. Source code control - GitHub, VSTS etc. Mandatory Competencies Data on Cloud - Azure Data Lake (ADL) Python - Python Big Data - PySpark DevOps - CI/CD DevOps - Jenkins Beh - Communication Data on Cloud - AWS S3 Database - PostgreSQL DevOps - Github
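As a small illustration of the Airflow orchestration skills listed above, here is a minimal DAG sketch (assuming an Airflow 2.4+ style `schedule` argument). Task bodies are placeholders; a real pipeline would invoke PySpark or DBT jobs and AWS services.

```python
from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

# Minimal Airflow DAG sketch for an ELT pipeline. Task bodies are placeholders;
# a production DAG would call PySpark/DBT jobs and AWS services (S3, Glue, EMR).
def extract():
    print("pull source data (e.g. from S3)")

def transform():
    print("run transformations (e.g. PySpark or DBT models)")

def load():
    print("publish curated tables to the warehouse / data lake")

with DAG(
    dag_id="example_elt_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    t1 = PythonOperator(task_id="extract", python_callable=extract)
    t2 = PythonOperator(task_id="transform", python_callable=transform)
    t3 = PythonOperator(task_id="load", python_callable=load)
    t1 >> t2 >> t3
```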
Posted 1 week ago