Get alerts for new jobs matching your selected skills, preferred locations, and experience range.
2 - 5 years
14 - 17 Lacs
Bengaluru
Work from Office
As a Big Data Engineer, you will develop, maintain, evaluate, and test big data solutions. You will be involved in data engineering activities such as creating source-to-target pipelines and workflows and implementing solutions that address the client's needs.

Your primary responsibilities include:
Design, build, optimize and support new and existing data models and ETL processes based on our clients' business requirements.
Build, deploy and manage data infrastructure that can adequately handle the needs of a rapidly growing data-driven organization.
Coordinate data access and security so that data scientists and analysts can easily access data whenever they need to.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
Must have 3-5 years of experience in Big Data: Hadoop, Spark, Scala, Python, HBase, Hive.
Good to have: AWS (S3, Athena, DynamoDB, Lambda), Jenkins, Git.
Developed Python and PySpark programs for data analysis.
Good working experience using Python to develop a custom framework for generating rules (much like a rules engine).
Developed Python code to gather data from HBase and designed solutions implemented with PySpark.
Used Apache Spark DataFrames/RDDs to apply business transformations and Hive context objects to perform read/write operations.

Preferred technical and professional experience:
Understanding of DevOps.
Experience in building scalable end-to-end data ingestion and processing solutions.
Experience with object-oriented and/or functional programming languages, such as Python, Java and Scala.
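The posting mentions building a custom Python framework for generating rules, much like a rules engine. A minimal sketch of that pattern is below; the rule names, fields, and thresholds are invented for illustration, not taken from the posting.

```python
# Minimal rules-engine sketch: each rule pairs a name with a predicate
# over a record (a dict); applying the rules tags the record with every
# rule that matches.

def make_rule(name, predicate):
    """Bundle a rule name with a predicate function over a record."""
    return {"name": name, "predicate": predicate}

def apply_rules(record, rules):
    """Return the names of all rules whose predicate matches the record."""
    return [r["name"] for r in rules if r["predicate"](record)]

# Illustrative rule set (names and fields are hypothetical).
rules = [
    make_rule("high_value", lambda rec: rec.get("amount", 0) > 1000),
    make_rule("missing_id", lambda rec: not rec.get("customer_id")),
]

record = {"customer_id": "C42", "amount": 2500}
print(apply_rules(record, rules))  # ['high_value']
```

New rules are added as data rather than code branches, which is the property that makes this style of framework easy to extend.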
Posted 2 months ago
2 - 5 years
14 - 17 Lacs
Mysore
Work from Office
As an Application Developer, you will lead IBM into the future by translating system requirements into the design and development of customized systems in an agile environment. The success of IBM is in your hands as you transform vital business needs into code and drive innovation. Your work will power IBM and its clients globally, collaborating and integrating code into enterprise systems. You will have access to the latest education, tools and technology, and a limitless career path with the world's technology leader. Come to IBM and make a global impact.

Responsibilities:
Manage end-to-end feature development and resolve challenges faced in implementing it.
Learn new technologies and apply them in feature development within the time frame provided.
Manage debugging, root cause analysis, and fixing of issues reported on the Content Management back-end software system.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
Overall, more than 6 years of experience, with 4+ years of strong hands-on experience in Python and Spark.
Strong technical ability to understand, design, write and debug applications in Python and PySpark.
Strong problem-solving skills.

Preferred technical and professional experience:
Good to have: hands-on experience with cloud technology (AWS/GCP/Azure).
Posted 2 months ago
2 - 7 years
4 - 8 Lacs
Chennai
Work from Office
Responsibilities:
Understand the requirements and design of the product/software.
Develop software solutions by studying information needs, systems flow, data usage and work processes.
Investigate problem areas, following the software development life cycle.
Facilitate root cause analysis of system issues and the problem statement.
Identify ideas to improve system performance and availability.
Analyze client requirements and convert them into a feasible design.
Collaborate with functional teams or systems analysts who carry out the detailed investigation into software requirements.
Confer with project managers to obtain information on software capabilities.
Posted 2 months ago
5 - 10 years
8 - 12 Lacs
Pune
Work from Office
About The Role
Job Title: UI Developer
Location: Pune, India

Role Description
As an Assistant Vice President for Technology in the Cash Manager team, you will partner with business, technology managers and the Risk and Control team to create and maintain the application stack and execute strategic programs. You'll be an integral part of the bank's infrastructure, guiding the Corporate Bank Technology team through its engineering practices. You will enable the digital environment that helps our people share their knowledge, expertise and real passion for our business. Deutsche Bank is investing heavily in technology, which means we are investing in you. Join us here, and you'll constantly be looking ahead.

Deutsche Bank's Corporate Bank division is a leading provider of cash management, trade finance and securities finance. We complete green-field projects that deliver the best Corporate Bank - Securities Services products in the world. Our team is diverse, international, and driven by a shared focus on clean code and valued delivery. At every level, agile minds are rewarded with competitive pay, support, and opportunities to excel.

You will work as part of a cross-functional agile delivery team. You will bring an innovative approach to software development, focusing on the latest technologies and practices as part of a relentless focus on business value. You will be someone who sees engineering as a team activity, with a predisposition to open code, open discussion and creating a supportive, collaborative environment. You will be ready to contribute to all stages of software delivery, from initial analysis right through to production support.

What we'll offer you
As part of our flexible scheme, here are just some of the benefits that you'll enjoy.
Best-in-class leave policy
Gender-neutral parental leave
100% reimbursement under the childcare assistance benefit (gender neutral)
Sponsorship for industry-relevant certifications and education
Employee Assistance Program for you and your family members
Comprehensive hospitalization insurance for you and your dependents
Accident and term life insurance
Complimentary health screening for those 35 years and above

Your key responsibilities
Be accountable for engineering and leading functional deliveries in Corporate Bank Technology.
Deliver state-of-the-art solutions and be a hands-on technologist.
Implement leading technology advancements in the industry.
Maintain hygiene, risk, control, and stability at the core of every delivery.
Manage the software development life cycle (SDLC) of software components all the way to production, including helping support the application to resolve production issues with appropriate triaging.
Partner with the UI/UX design team, architects, business analysts, and stakeholders situated in multiple regions and time zones, at varying capacities, to understand requirements.

Your skills and experience
Minimum of 5 years of front-end development experience using React JS, TypeScript, CSS, Node.js and Redux.
Minimum of 3 years of experience developing Java-based microservices (preferably using the Spring Boot framework).
Minimum of 3 years of experience working with relational datastores such as Oracle and non-relational datastores such as MongoDB, HBase, etc.
Minimum of 3 years of experience in cloud distributed computing, especially with demonstrable use of Kubernetes, Docker, Git, Maven.
Experience designing and evolving RESTful APIs.
Full understanding of the SDLC for software delivery, including engaging with production support personnel for defect triaging and resolution.
Sound understanding of data structures and algorithms.
Demonstrated ability to perform third-level support as the developed components move to production.
Ability to shape and support the UI strategy and implement best practices.
Experience in stakeholder management, driving priorities, proactive readiness for quarterly delivery, and managing delivery end to end, from design interactions through development and production implementation.
Application development experience using Agile methodology.
Experience with test-driven development (TDD) and writing automated tests.
Experience with application performance management software such as New Relic and/or Splunk is preferred.
Understanding of J2EE platform principles, with the ability to review a codebase and partner with product/functional teams to extract business rules for transformational initiatives, is preferred.
Experience partnering with UI/UX design teams, architects, business analysts and stakeholders situated in multiple regions and time zones, at varying capacities, to understand requirements.
Experience managing the SDLC of software components all the way to production, including helping support the application to resolve production issues.
Timely collaboration with other leads, building the franchise and people engagement to develop a positive work culture.

How we'll support you
Training and development to help you excel in your career
Coaching and support from experts in your team
A culture of continuous learning to aid progression
A range of flexible benefits that you can tailor to suit your needs
Posted 2 months ago
6 - 11 years
9 - 14 Lacs
Hyderabad
Work from Office
Carelon Global Solutions India is seeking a Senior Software Engineer who will be responsible for all design requirements for our leader communications. The incumbent will report to the Manager - Communications and must understand the design process, liaising with multiple stakeholders to understand the brief and deliver all in-house design requirements, apart from coordinating with external agencies to ensure brand guidelines are followed.

JOB RESPONSIBILITIES
Experience with Big Data technologies (Hadoop, Spark, Kafka, HBase, etc.) is a plus.
Write SQL queries to validate the dashboard output.
Work in a database environment: understand relational database structure and apply hands-on SQL knowledge to extract and manipulate data for variance testing.
Perform code reviews and pair programming.
Support and enhance current applications.
Design, develop, test, and implement the application; investigate and resolve complex issues while supporting existing applications.

QUALIFICATION
B.Tech/B.E/MCA

EXPERIENCE
6+ years' experience with AWS services (RDS, AWS Lambda, AWS Glue) and Apache Spark, Kafka, Spark Streaming, Scala, Hive, etc.
6+ years' experience with SQL and NoSQL databases such as MySQL, Postgres, Elasticsearch.
6+ years' experience with Spark programming paradigms (batch and stream processing).
6+ years' experience in Java and Scala, with familiarity with a scripting language like Python as well as Unix/Linux shells.
6+ years' experience demonstrating strong analytical skills and advanced SQL knowledge, indexing, and query-optimization techniques.

SKILLS AND COMPETENCIES
Profound understanding of Big Data core concepts and technologies: Apache Spark, Kafka, Spark Streaming, Scala, Hive, AWS, etc.
Solid experience with and understanding of core AWS services such as IAM, CloudFormation, EC2, S3, EMR, Glue, Lambda, Athena, and Redshift.
Data engineering with Big Data, Kafka, Spark Streaming, Spark, Scala and AWS.
Experience in system analysis, design, development, and implementation of data ingestion pipelines in AWS.
Programming experience with Python/Scala and shell scripting.
Experience with DevOps and continuous integration/delivery (CI/CD) concepts and tools such as Bitbucket and Bamboo.
Good understanding of business and operational processes.
Capable of problem/issue resolution and of thinking out of the box.
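One responsibility above is writing SQL to validate dashboard output against the underlying data, for variance testing. A self-contained sketch of that kind of check using Python's built-in sqlite3 module is shown below; the table and column names are invented for illustration.

```python
import sqlite3

# In-memory database standing in for the reporting environment.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE source_orders (region TEXT, amount REAL);
    INSERT INTO source_orders VALUES ('east', 100), ('east', 50), ('west', 75);
    CREATE TABLE dashboard_totals (region TEXT, total REAL);
    INSERT INTO dashboard_totals VALUES ('east', 150), ('west', 80);
""")

# Variance check: recompute each region's total from the source table and
# keep only the rows where the dashboard disagrees with the source.
rows = conn.execute("""
    SELECT d.region, d.total, SUM(s.amount) AS recomputed
    FROM dashboard_totals d
    JOIN source_orders s ON s.region = d.region
    GROUP BY d.region
    HAVING d.total <> SUM(s.amount)
""").fetchall()

print(rows)  # [('west', 80.0, 75.0)] -> the 'west' total disagrees with source
```

An empty result set means the dashboard reconciles with the source; any returned row is a variance to investigate.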
Posted 2 months ago
5 - 7 years
11 - 13 Lacs
Nasik, Pune, Nagpur
Work from Office
Euclid Innovations Pvt Ltd is looking for Data Engineer Drive to join our dynamic team and embark on a rewarding career journey. Liaising with coworkers and clients to elucidate the requirements for each task. Conceptualizing and generating infrastructure that allows big data to be accessed and analyzed. Reformulating existing frameworks to optimize their functioning. Testing such structures to ensure that they are fit for use. Preparing raw data for manipulation by data scientists. Detecting and correcting errors in your work. Ensuring that your work remains backed up and readily accessible to relevant coworkers. Remaining up-to-date with industry standards and technological advancements that will improve the quality of your outputs.
Posted 2 months ago
8 - 13 years
30 - 35 Lacs
Pune
Work from Office
Data Engineer
Job Description
Common skills: SQL, GCP BigQuery, ETL pipelines using Python/Airflow, experience with Spark/Hive/HDFS, data modeling for data conversion. Resources: 4.
Prior experience working on a conversion/migration HR project is an additional skill needed along with the skills mentioned above.
Data Engineer with HR domain knowledge; all other requirements for the functional area are given by the customer.
Customer Name: Uber
Posted 2 months ago
2 - 5 years
3 - 7 Lacs
Karnataka
Work from Office
Experience: 4 to 6 yrs
Location: Any PSL location
Rate: below 14$

JD - DBT/AWS Glue/Python/PySpark
Hands-on experience in data engineering, with expertise in DBT/AWS Glue/Python/PySpark.
Strong knowledge of data engineering concepts, data pipelines, ETL/ELT processes, and cloud data environments (AWS).
Technology: DBT, AWS Glue, Athena, SQL, Spark, PySpark.
Good understanding of Spark internals and how it works; good skills in PySpark.
Good understanding of DBT; in particular, should understand DBT's limitations and when it will end up in model explosion.
Good hands-on experience in AWS Glue.
AWS expertise: should know the different services, know how to configure them, and have infrastructure-as-code experience.
Basic understanding of the different open data formats: Delta, Iceberg, Hudi.
Ability to engage in technical conversations and suggest enhancements to the current architecture and design.
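The stack above is SQL-first ELT: in DBT, a model is essentially a SELECT statement that materializes a derived table inside the warehouse. As a rough, warehouse-agnostic illustration of that pattern (not DBT itself), here is a sketch using Python's built-in sqlite3; the schema and names are invented.

```python
import sqlite3

# In-memory database standing in for the warehouse.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE raw_events (user_id TEXT, event TEXT);
    INSERT INTO raw_events VALUES ('u1','click'), ('u1','click'), ('u2','view');
""")

# A DBT-style "model": the derived table is defined entirely as a SELECT
# over data already loaded into the warehouse (the "T" of ELT).
conn.execute("""
    CREATE TABLE events_per_user AS
    SELECT user_id, COUNT(*) AS n_events
    FROM raw_events
    GROUP BY user_id
""")

print(conn.execute(
    "SELECT * FROM events_per_user ORDER BY user_id").fetchall())
# [('u1', 2), ('u2', 1)]
```

The "model explosion" caveat in the posting refers to what happens when every small variation of such a SELECT becomes its own materialized model; keeping models few and well-layered avoids it.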
Posted 2 months ago
2 - 5 years
3 - 7 Lacs
Maharashtra
Work from Office
Description
Overall 10+ years of experience in Python and Shell.
Knowledge of distributed systems such as Hadoop and Spark, as well as cloud computing platforms such as Azure and AWS.

Additional Details
Named Job Posting (if yes, needs approval by SCSC): No
Global Grade: B
Remote work possibility: No
Local Skills: Ruby; automation; Python
Languages Required: English
Posted 2 months ago
2 - 5 years
3 - 6 Lacs
Karnataka
Work from Office
SQL
Budget: 13$/hour (18 LPA)
Location: Bangalore
Experience: 3-5 years

The primary skill is SQL: we are looking for someone with expertise in SQL, including advanced SQL, a basic working knowledge of Python and shell scripting, and, as a plus, Airflow and MDM concepts. The role requires working from the customer's Bangalore office a few days a week and remotely the rest.
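"Advanced SQL" in postings like this usually means analytic features such as window functions. A small, runnable illustration using Python's built-in sqlite3 (SQLite has supported window functions since 3.25) is below; the table and data are invented.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE sales (rep TEXT, month TEXT, amount REAL);
    INSERT INTO sales VALUES
      ('a','jan',10), ('a','feb',20), ('b','jan',5), ('b','feb',30);
""")

# Window-function example: rank reps within each month by amount,
# without collapsing the rows the way GROUP BY would.
rows = conn.execute("""
    SELECT month, rep, amount,
           RANK() OVER (PARTITION BY month ORDER BY amount DESC) AS rnk
    FROM sales
    ORDER BY month, rnk
""").fetchall()
print(rows)
```

Each row keeps its detail columns while gaining a per-partition rank, which is the key difference from an aggregate query.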
Posted 2 months ago
3 - 7 years
1 - 5 Lacs
Telangana
Work from Office
Location: Hyderabad / Chennai (Chennai and Hyderabad preferred, but the customer is willing to take resources from Hyderabad)
Experience: 5 to 8 yrs (U3); 5-10 yrs

Proven experience as a development data engineer or in a similar role, with an ETL background.
Experience with data integration / ETL best practices and data quality principles.
Play a crucial role in ensuring the quality and reliability of the data by designing, implementing, and executing comprehensive testing.
Going over the user stories, build the comprehensive code base and business rules for testing and validation of the data.
Knowledge of continuous integration and continuous deployment (CI/CD) pipelines.
Familiarity with Agile/Scrum development methodologies.
Excellent analytical and problem-solving skills.
Strong communication and collaboration skills.
Experience with big data technologies (Hadoop, Spark, Hive).
Posted 2 months ago
2 - 6 years
5 - 9 Lacs
Uttar Pradesh
Work from Office
Proven experience as a development data engineer or in a similar role, with an ETL background. Experience with data integration / ETL best practices and data quality principles. Play a crucial role in ensuring the quality and reliability of the data by designing, implementing, and executing comprehensive testing. Going over the user stories, build the comprehensive code base and business rules for testing and validation of the data. Knowledge of continuous integration and continuous deployment (CI/CD) pipelines. Familiarity with Agile/Scrum development methodologies. Excellent analytical and problem-solving skills. Strong communication and collaboration skills. Experience with big data technologies (Hadoop, Spark, Hive).
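The role above centers on turning business rules from user stories into executable data-quality checks. A minimal sketch of that idea in plain Python is below; the rules, field names, and bounds are invented for illustration.

```python
# Each check scans a batch of records (dicts) and returns a list of
# human-readable failure messages; an empty list means the check passed.

def check_not_null(records, field):
    """Flag records where a required field is missing."""
    return [f"row {i}: {field} is null"
            for i, rec in enumerate(records) if rec.get(field) is None]

def check_range(records, field, lo, hi):
    """Flag records whose numeric field falls outside [lo, hi]."""
    return [f"row {i}: {field}={rec[field]} outside [{lo}, {hi}]"
            for i, rec in enumerate(records)
            if rec.get(field) is not None and not lo <= rec[field] <= hi]

batch = [
    {"order_id": 1, "qty": 3},
    {"order_id": None, "qty": 250},
]
failures = check_not_null(batch, "order_id") + check_range(batch, "qty", 1, 100)
print(failures)
# ['row 1: order_id is null', 'row 1: qty=250 outside [1, 100]']
```

In a CI/CD pipeline, such checks would run as a test stage and fail the build when the failure list is non-empty.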
Posted 2 months ago
3 - 6 years
3 - 6 Lacs
Hyderabad
Work from Office
Description
Strong knowledge of ETL testing using Big Data/DWH.
Core hands-on testing skills in the Big Data ecosystem, i.e. HBase, Hive, Solr.
Good knowledge of the test data management process and of test management tools such as QC/ALM and qTest.
Working knowledge of FRD/BRD mapping documents based on requirements.
Understanding of Agile methodology.
Working knowledge of Azure DevOps boards and Jira.
Strong knowledge of complex queries in SQL and HQL.
Knowledge of UNIX commands.
Knowledge of TOSCA is a plus.
Knowledge of PuTTY and IBM Data Studio.
Automation knowledge.

Additional Details
Named Job Posting (if yes, needs approval by SCSC): No
Global Grade: C
Remote work possibility: No
Global Role Family: 60236 (P) Software Engineering
Local Role Name: 6362 Software Developer
Local Skills: 5151 ETL Testing
Languages Required: English
Posted 2 months ago
3 - 6 years
3 - 6 Lacs
Hyderabad
Work from Office
Description
Strong knowledge of ETL testing using Big Data/DWH.
Core hands-on testing skills in the Big Data ecosystem, i.e. HBase, Hive, Solr.
Good knowledge of the test data management process and of test management tools such as QC/ALM and qTest.
Working knowledge of FRD/BRD mapping documents based on requirements.
Understanding of Agile methodology.
Working knowledge of Azure DevOps boards and Jira.
Strong knowledge of complex queries in SQL and HQL.
Knowledge of UNIX commands.
Knowledge of TOSCA is a plus.
Knowledge of PuTTY and IBM Data Studio.
Automation knowledge.

Additional Details
Named Job Posting (if yes, needs approval by SCSC): No
Global Grade: B
Remote work possibility: No
Global Role Family: 60236 (P) Software Engineering
Local Role Name: 6362 Software Developer
Local Skills: 5151 ETL Testing
Languages Required: English
Posted 2 months ago
5 - 7 years
4 - 8 Lacs
Bengaluru
Work from Office
Job Title: Spark Python Scala Developer

Responsibilities
A day in the life of an Infoscion: as part of the Infosys consulting team, your primary role would be to get to the heart of customer issues, diagnose problem areas, design innovative solutions and facilitate deployment, resulting in client delight. You will develop a proposal by owning parts of the proposal document and by giving inputs on solution design based on your areas of expertise. You will plan the activities of configuration, configure the product as per the design, conduct conference room pilots and assist in resolving any queries related to requirements and solution design. You will conduct solution/product demonstrations and POC/Proof of Technology workshops, and prepare effort estimates that suit the customer's budgetary requirements and are in line with the organization's financial guidelines. You will actively lead small projects and contribute to unit-level and organizational initiatives with the objective of providing high-quality, value-adding solutions to customers. If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!
Technical and Professional Requirements
Primary skills: Technology -> Big Data - Data Processing -> Spark
Preferred skills: Technology -> Big Data - Data Processing -> Spark

Additional Responsibilities
Ability to develop value-creating strategies and models that enable clients to innovate, drive growth and increase their business profitability.
Good knowledge of software configuration management systems.
Awareness of the latest technologies and industry trends.
Logical thinking and problem-solving skills, along with an ability to collaborate.
Understanding of the financial processes for various types of projects and the various pricing models available.
Ability to assess the current processes, identify improvement areas and suggest technology solutions.
Knowledge of one or two industry domains.
Client-interfacing skills.
Project and team management.

Educational Requirements: Bachelor of Engineering
Service Line: Data & Analytics Unit
* Location of posting is subject to business requirements
Posted 2 months ago
5 - 8 years
5 - 9 Lacs
Bengaluru
Work from Office
Job Title: Big Data Analyst

Responsibilities
A day in the life of an Infoscion: as part of the Infosys delivery team, your primary role would be to interface with the client for quality assurance and issue resolution, and to ensure high customer satisfaction. You will understand requirements, create and review designs, validate the architecture and ensure high levels of service offerings to clients in the technology domain. You will participate in project estimation, provide inputs for solution delivery, conduct technical risk planning, and perform code reviews and unit test plan reviews. You will lead and guide your teams towards developing optimized, high-quality code deliverables, continual knowledge management, and adherence to organizational guidelines and processes. You would be a key contributor to building efficient programs/systems, and if you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!
Technical and Professional Requirements
Primary skills: Technology -> Big Data - NoSQL -> MongoDB
Preferred skills: Technology -> Big Data - NoSQL -> MongoDB

Additional Responsibilities
Knowledge of more than one technology.
Basics of architecture and design fundamentals.
Knowledge of testing tools.
Knowledge of agile methodologies.
Understanding of project life cycle activities on development and maintenance projects.
Understanding of one or more estimation methodologies; knowledge of quality processes.
Basics of the business domain, to understand the business requirements.
Analytical abilities, strong technical skills, good communication skills.
Good understanding of the technology and domain.
Ability to demonstrate a sound understanding of software quality assurance principles, SOLID design principles and modelling methods.
Awareness of the latest technologies and trends.
Excellent problem-solving, analytical and debugging skills.

Educational Requirements: Bachelor of Engineering
Service Line: Cloud & Infrastructure Services
* Location of posting is subject to business requirements
Posted 2 months ago
3 - 5 years
5 - 9 Lacs
Bengaluru
Work from Office
Job Title: Big Data Analyst

Responsibilities
A day in the life of an Infoscion: as part of the Infosys delivery team, your primary role would be to interface with the client for quality assurance and issue resolution, and to ensure high customer satisfaction. You will understand requirements, create and review designs, validate the architecture and ensure high levels of service offerings to clients in the technology domain. You will participate in project estimation, provide inputs for solution delivery, conduct technical risk planning, and perform code reviews and unit test plan reviews. You will lead and guide your teams towards developing optimized, high-quality code deliverables, continual knowledge management, and adherence to organizational guidelines and processes. You would be a key contributor to building efficient programs/systems, and if you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!

Preferred skills: Technology -> Big Data -> Oracle BigData Appliance
Educational Requirements: Bachelor of Engineering
Service Line: Cloud & Infrastructure Services
* Location of posting is subject to business requirements
Posted 2 months ago
5 - 7 years
4 - 8 Lacs
Bengaluru
Work from Office
Job Title: Hadoop Admin

Responsibilities
A day in the life of an Infoscion: as part of the Infosys consulting team, your primary role would be to get to the heart of customer issues, diagnose problem areas, design innovative solutions and facilitate deployment, resulting in client delight. You will develop a proposal by owning parts of the proposal document and by giving inputs on solution design based on your areas of expertise. You will plan the activities of configuration, configure the product as per the design, conduct conference room pilots and assist in resolving any queries related to requirements and solution design. You will conduct solution/product demonstrations and POC/Proof of Technology workshops, and prepare effort estimates that suit the customer's budgetary requirements and are in line with the organization's financial guidelines. You will actively lead small projects and contribute to unit-level and organizational initiatives with the objective of providing high-quality, value-adding solutions to customers. If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!
Technical and Professional Requirements
Primary skills: Technology -> Big Data - Hadoop -> Hadoop Administration
Preferred skills: Technology -> Big Data - Hadoop -> Hadoop Administration

Additional Responsibilities
Ability to develop value-creating strategies and models that enable clients to innovate, drive growth and increase their business profitability.
Good knowledge of software configuration management systems.
Awareness of the latest technologies and industry trends.
Logical thinking and problem-solving skills, along with an ability to collaborate.
Understanding of the financial processes for various types of projects and the various pricing models available.
Ability to assess the current processes, identify improvement areas and suggest technology solutions.
Knowledge of one or two industry domains.
Client-interfacing skills.
Project and team management.

Educational Requirements: Bachelor of Engineering
Service Line: Data & Analytics Unit
* Location of posting is subject to business requirements
Posted 2 months ago
6 - 10 years
10 - 12 Lacs
Mysore
Work from Office
As an Application Developer, you will lead IBM into the future by translating system requirements into the design and development of customized systems in an agile environment. The success of IBM is in your hands as you transform vital business needs into code and drive innovation. Your work will power IBM and its clients globally, collaborating and integrating code into enterprise systems. You will have access to the latest education, tools and technology, and a limitless career path with the world's technology leader. Come to IBM and make a global impact.

Responsibilities:
Manage end-to-end feature development and resolve challenges faced in implementing it.
Learn new technologies and apply them in feature development within the time frame provided.
Manage debugging, root cause analysis, and fixing of issues reported on the Content Management back-end software system.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
Overall, more than 6 years of experience, with 4+ years of strong hands-on experience in Python and Spark.
Strong technical ability to understand, design, write and debug applications in Python and PySpark.
Strong problem-solving skills.

Preferred technical and professional experience:
Good to have: hands-on experience with cloud technology (AWS/GCP/Azure).
Posted 2 months ago
2 - 5 years
7 - 11 Lacs
Hyderabad
Work from Office
Broad knowledge and experience in:
Big Data engineering/processing, business intelligence and advanced analytics.
Developing ETL/ELT processes.
Databases and data warehouse modeling.
Cloud-based data engineering and machine learning models.
Building APIs for application integration.
Various frameworks and processes, such as Agile.
Determining transformation requirements and developing processes to bring structured and unstructured data from the source to a new physical data model.
Working with data scientists to implement strategies for cleaning and preparing data for analysis, to develop data imputation algorithms, and to optimize the performance of big data and machine learning systems.

Above-average skills in:
Big Data engineering and processing using the Hadoop stack (Hadoop, Hive, HDFS, Spark, HBase, etc.).
Developing ETL/ELT processing using Apache NiFi.
Strong background in SQL and databases.
Programming in Python or Scala.
Data analysis and validation.

Demonstrated ability to:
Work in a dynamic, fast-paced work environment.
Be self-motivated, with the ability to work under minimal direction.
Adapt to new technologies and learn quickly.
Bring a passion for data and information, with strong analytical, problem-solving, and organizational skills.
Work in multi-functional groups, with diverse interests and requirements, towards a common objective.
Communicate very well with distributed teams (written, verbal and presentation).
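One of the responsibilities above is working with data scientists on cleaning strategies and data-imputation algorithms. A toy mean-imputation sketch in plain Python is below; the field name and values are invented, and real pipelines would use library implementations rather than this.

```python
from statistics import mean

def impute_mean(rows, field):
    """Replace missing (None) values in `field` with the mean of the
    observed values for that field, leaving other rows untouched."""
    observed = [r[field] for r in rows if r[field] is not None]
    fill = mean(observed)
    # Build new dicts rather than mutating the input rows.
    return [{**r, field: fill if r[field] is None else r[field]} for r in rows]

rows = [{"age": 20}, {"age": None}, {"age": 40}]
print(impute_mean(rows, "age"))
# the None is replaced by the mean of 20 and 40, i.e. 30
```

Mean imputation is the simplest strategy; the same structure accommodates median, mode, or model-based fills by swapping the `fill` computation.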
Posted 2 months ago
2 - 6 years
9 - 13 Lacs
Bengaluru
Work from Office
Telecom - Service Assurance domain experience, with strong knowledge of networking concepts.
Experience in requirements gathering, with data analysis skills.
Good understanding of technical components such as Kafka, NiFi, HBase, graph databases and Postgres.
Ability to understand the functional design and design the technical solution using the components involved in the platform.
Experience in documenting detailed designs such as FDDs and TDDs.
Experience in using tools such as Jira and Confluence.
Good communication and stakeholder management skills.
Posted 2 months ago
7 - 10 years
25 - 32 Lacs
Chennai, Pune, Greater Noida
Hybrid
Good in: Spark, Scala, Hadoop, Hive, HBase, Python, Kafka.
Deep understanding of the distributed architecture of Spark and Scala.
Good in: core Spark APIs and Spark Streaming APIs, ETL pipelines, HDFS, Hive, S3, MongoDB and SQL.
A 79-year-old reputed MNC company.
Posted 2 months ago
8 - 13 years
50 - 55 Lacs
Bengaluru
Work from Office
Lead Engineer for customer experience platforms is a hands-on development role that own the architecture, technical design, realization for a complex micro services ecosystem and build a greenfield open stack platform for a large-scale ecommerce initiative. Modernize the technology stack for the ecommerce cart/checkout and/or post purchase and/or customer service platform based on MACH architecture principles with controlled data duplication. Model the use of modern software engineering practices (i.e.,TDD, BDD, CI/CD, Shift left, 12 factor applications etc.), API design, and architecture to support integration with existing Ford software products as well as external cloud-based services. Collaborate with Ford s Enterprise Architecture organization to rationalize technologies, further leverage current technology offerings already in use, and identify gaps/opportunities. Overall responsibility and accountability for API design and backend Spring boot microservices suite on Google cloud platform Strong hands-on experience of at least 10+ years in Java/J2EE/Spring framework/Spring boot Experience with the following: Microservices architectures (using Spring boot/Micronaut/Ratpack/Quarkus), Cloud-Native architectures, Event-driven architectures, APIs, Domain-Driven Design, Public Cloud (Google Cloud), Serverless, Kubernetes, Docker, DevOps, building scalable, reliable, available solutions, and/or performance testing. Strong technical background with the capability of being hands-on Conversant in multiple programming languages. Thorough knowledge of multi-threading, concurrency, and parallel processing concepts including scalability, performance, and consistency characteristics of a microservices driven eCommerce architecture Good expertise in REST, Messaging (KAFKA, RABBITMQ, cloud pub/sub etc.), stream processing (SPARK, STORM etc.), NoSQL as well as database Systems (RDBMS, NO SQL Stores like Cassandra, HBase, Mongo, Memcached etc.) 
Experience in cloud-native systems, transactional systems, multi-tenancy, five-nines availability, and containerization technologies.
Experience in collaborating and partnering with other technical domain experts such as cloud, security, SRE, and DevOps.
Experience in building structured, semi-structured, and unstructured data stores, with a good understanding of RDBMS and NoSQL databases and strong exposure to data modelling, data access patterns, data replication, and active-active polyglot persistence setup.
Experience in implementation of CQRS and staged event-driven applications on Spring Integration/Apache Camel/MuleSoft platforms.
Leverage LLMs and AI-powered coding assistants like GitHub Copilot to enhance productivity and code quality.
Responsible for the overall design and evolution of one or more modules (microservices) in one of the eCommerce products.
Ideal candidates will research the existing application footprint and recommend solutions to run application workloads in a futuristic architecture landscape.
Bring commerce platform engineering expertise and experience to significantly improve Ford's current capabilities and ensure these platforms can grow to meet increasing demands.
Design and build POCs on the latest cutting-edge technologies, and contribute to constructing and deploying highly scalable and robust cloud-based intelligent solutions.
Contribute to Ford's Product Driven Organization (PDO) model by identifying improvements and areas that help reduce dependencies and increase autonomy for teams to deliver.
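The CQRS and event-driven patterns named in the listing can be made concrete with a minimal in-memory sketch: a command handler mutates the write side and emits events, and a projection consumes those events to maintain a separate read model. All names here (carts, SKUs) are illustrative, and in a real system the event list would be a Kafka topic or pub/sub channel rather than a Python list.

```python
from dataclasses import dataclass

# Hypothetical in-memory sketch of CQRS: commands produce events on the
# write side; a projection folds events into a denormalized read model.

@dataclass
class CartEvent:
    cart_id: str
    sku: str
    qty: int

class CartWriteModel:
    """Handles commands and emits events (the 'C' side of CQRS)."""
    def __init__(self):
        self.events = []

    def add_item(self, cart_id: str, sku: str, qty: int) -> CartEvent:
        event = CartEvent(cart_id, sku, qty)
        self.events.append(event)  # in production: publish to Kafka/pub-sub
        return event

class CartReadModel:
    """Consumes events to maintain a query-optimized view (the 'Q' side)."""
    def __init__(self):
        self.item_counts = {}

    def apply(self, event: CartEvent) -> None:
        self.item_counts[event.cart_id] = (
            self.item_counts.get(event.cart_id, 0) + event.qty
        )

write_side = CartWriteModel()
read_side = CartReadModel()
read_side.apply(write_side.add_item("cart-1", "SKU-42", 2))
read_side.apply(write_side.add_item("cart-1", "SKU-7", 1))
print(read_side.item_counts["cart-1"])  # → 3
```

The key design point is that the read model never queries the write model directly; it is rebuilt purely from the event stream, which is what allows controlled data duplication across services.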
Posted 2 months ago
4 - 9 years
6 - 11 Lacs
Pune
Work from Office
As a Data Engineer, you will develop, maintain, evaluate, and test big data solutions. You will be involved in the development of data solutions using the Spark framework with Python or Scala on Hadoop and the Azure Cloud Data Platform.
Responsibilities:
Experienced in building data pipelines to ingest, process, and transform data from files, streams, and databases. Process the data with Spark, Python, PySpark, and Hive, HBase, or other NoSQL databases on the Azure Cloud Data Platform or HDFS.
Experienced in developing efficient software code for multiple use cases leveraging the Spark framework with Python or Scala and big data technologies for various use cases built on the platform.
Experience in developing streaming pipelines.
Experience working with Hadoop/Azure ecosystem components to implement scalable solutions that meet ever-increasing data volumes, using big data/cloud technologies such as Apache Spark and Kafka.
Required education
Bachelor's Degree
Preferred education
Master's Degree
Required technical and professional expertise
Minimum 4+ years of experience in big data technologies with extensive data engineering experience in Spark with Python or Scala; minimum 3 years of experience on cloud data platforms on Azure.
Experience in Databricks/Azure HDInsight/Azure Data Factory, Synapse, and SQL Server DB.
Good to excellent SQL skills.
Exposure to streaming solutions and message brokers like Kafka.
Preferred technical and professional experience
Certification in Azure and Databricks, or Cloudera certified Spark developers.
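The ingest → transform → filter pipeline shape this role describes can be sketched framework-free; the field names, sample rows, and exchange rate below are illustrative assumptions, and in PySpark each stage would map onto DataFrame operations (`select`, `withColumn`, `filter`) instead of generators.

```python
# Minimal sketch of an ingest/transform/filter data pipeline, assuming a
# toy CSV-like input. Not the actual client pipeline; illustrative only.

def parse(line: str) -> dict:
    """Ingest: turn a raw comma-separated line into a record."""
    user, amount = line.split(",")
    return {"user": user.strip(), "amount": float(amount)}

def transform(record: dict) -> dict:
    """Transform: derive a normalized field (assumed fixed FX rate)."""
    return {**record, "amount_inr": record["amount"] * 83.0}

def pipeline(lines):
    records = (parse(l) for l in lines)        # lazy, like an RDD/DataFrame
    enriched = (transform(r) for r in records)
    return [r for r in enriched if r["amount"] > 0]  # drop bad rows

result = pipeline(["alice, 10.5", "bob, -1", "carol, 3"])
print([r["user"] for r in result])  # → ['alice', 'carol']
```

Keeping each stage a pure function over records is what lets the same logic move between batch and streaming executions with minimal change.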
Posted 2 months ago
3 - 7 years
10 - 14 Lacs
Pune
Work from Office
Project Role: Application Lead
Project Role Description: Lead the effort to design, build, and configure applications, acting as the primary point of contact.
Must have skills: Microsoft Azure Databricks
Good to have skills: Microsoft Azure Modern Data Platform, Apache Spark
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years full time education
Summary: As an Application Lead for Custom Software Engineering, you will be responsible for designing, building, and configuring applications using Microsoft Azure Databricks. Your typical day will involve leading the effort to deliver high-quality solutions, collaborating with cross-functional teams, and ensuring timely delivery of projects.
Roles & Responsibilities:
Lead the effort to design, build, and configure applications using Microsoft Azure Databricks.
Act as the primary point of contact for the project, collaborating with cross-functional teams to ensure timely delivery of high-quality solutions.
Utilize your expertise in the Scala programming language, Apache Spark, and the Microsoft Azure Modern Data Platform to develop and implement efficient and scalable solutions.
Ensure adherence to best practices and standards for software development, including code reviews, testing, and documentation.
Build performance-oriented Scala code, optimized for Databricks/Spark execution.
Provide peer support to other team members on Azure Databricks/Spark/Scala best practices.
Improve the performance of the calculation engine.
Develop proofs of concept using new technologies.
Develop new applications to meet regulatory commitments (e.g., FRTB).
Professional & Technical Skills:
Proficiency in the Scala programming language.
Experience with Apache Spark and the Microsoft Azure Modern Data Platform.
Strong understanding of software development best practices and standards.
Experience with designing, building, and configuring applications using Microsoft Azure Databricks.
Experience with data processing and analysis using big data technologies.
Excellent problem-solving and analytical skills.
Posted 2 months ago
HBase is a distributed, scalable NoSQL database that is commonly used in big data applications. As the demand for big data solutions continues to grow, so does the demand for professionals with HBase skills in India. Job seekers looking to explore opportunities in this field can find a variety of roles across different industries and sectors.
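HBase organizes data as a sorted map of row key → column family:qualifier → value, which is what makes it scale horizontally by row-key range. This toy in-memory sketch mimics that layout with plain dicts to make the model concrete; it is not the real HBase client API, and the table and family names are made up.

```python
# Toy model of HBase's storage layout: rows keyed by a sortable row key,
# cells addressed as "family:qualifier". Illustrative only.

class ToyHBaseTable:
    def __init__(self, families):
        self.families = set(families)  # column families are fixed at creation
        self.rows = {}                 # row_key -> {"family:qualifier": value}

    def put(self, row_key, family, qualifier, value):
        if family not in self.families:
            raise KeyError(f"unknown column family: {family}")
        self.rows.setdefault(row_key, {})[f"{family}:{qualifier}"] = value

    def get(self, row_key, family, qualifier):
        return self.rows.get(row_key, {}).get(f"{family}:{qualifier}")

    def scan(self):
        # HBase scans return rows ordered by row key
        return sorted(self.rows.items())

table = ToyHBaseTable(families=["info"])
table.put("user#001", "info", "name", "Asha")
table.put("user#001", "info", "city", "Pune")
print(table.get("user#001", "info", "name"))  # → Asha
```

Because scans are ordered by row key, real HBase schemas put the most common query dimension at the front of the key, a design habit interviewers in this space often probe.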
These cities are known for their strong presence in the IT industry and are actively hiring professionals with HBase skills.
The salary range for HBase professionals in India can vary based on experience and location. Entry-level positions may start at around INR 4-6 lakhs per annum, while experienced professionals can earn upwards of INR 15-20 lakhs per annum.
In the HBase domain, a typical career progression may look like:
- Junior HBase Developer
- HBase Developer
- Senior HBase Developer
- HBase Architect
- HBase Administrator
- HBase Consultant
- HBase Team Lead
In addition to HBase expertise, professionals in this field are often expected to have knowledge of:
- Apache Hadoop
- Apache Spark
- Data Modeling
- Java programming
- Database design
- Linux/Unix
As you prepare for HBase job opportunities in India, make sure to brush up on your technical skills, practice coding exercises, and be ready to showcase your expertise in interviews. With the right preparation and confidence, you can land a rewarding career in the exciting field of HBase. Good luck!