3311 Big Data Jobs - Page 26

Set up a Job Alert
JobPe aggregates listings for easy access, but you apply directly on the original job portal.

2.0 - 4.0 years

5 - 9 Lacs

Kolkata

Remote

We're hiring a Python Backend Developer to build reliable APIs and services. Key Responsibilities: Design and build scalable backend services. Create APIs and handle database interactions. Ensure code performance and maintainability. Work with DevOps and deployment tools. Required Qualifications: 2+ years in backend development with Python. Experience with Flask/Django and REST APIs. Familiarity with SQL/NoSQL databases.
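
To illustrate the kind of work this listing describes, here is a minimal sketch of a Flask REST endpoint with a simple database interaction. It is not part of the posting; the routes, the items table, and the database file name are purely illustrative assumptions.

from flask import Flask, jsonify, request
import sqlite3

app = Flask(__name__)
DB_PATH = "app.db"  # illustrative local database file; the items table is assumed to exist

def get_db():
    # A fresh connection per call keeps the sketch simple; a real service would pool connections.
    conn = sqlite3.connect(DB_PATH)
    conn.row_factory = sqlite3.Row
    return conn

@app.route("/api/items", methods=["GET"])
def list_items():
    rows = get_db().execute("SELECT id, name FROM items").fetchall()
    return jsonify([dict(row) for row in rows])

@app.route("/api/items", methods=["POST"])
def create_item():
    payload = request.get_json(force=True)
    db = get_db()
    cursor = db.execute("INSERT INTO items (name) VALUES (?)", (payload["name"],))
    db.commit()
    return jsonify({"id": cursor.lastrowid, "name": payload["name"]}), 201

if __name__ == "__main__":
    app.run(debug=True)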

Posted 2 weeks ago

Apply

2.0 - 4.0 years

5 - 9 Lacs

Mumbai

Remote

We're hiring a Python Backend Developer to build reliable APIs and services. Key Responsibilities: Design and build scalable backend services. Create APIs and handle database interactions. Ensure code performance and maintainability. Work with DevOps and deployment tools. Required Qualifications: 2+ years in backend development with Python. Experience with Flask/Django and REST APIs. Familiarity with SQL/NoSQL databases.

Posted 2 weeks ago

Apply

2.0 - 4.0 years

5 - 9 Lacs

Hyderabad

Remote

We're hiring a Python Backend Developer to build reliable APIs and services. Key Responsibilities: Design and build scalable backend services. Create APIs and handle database interactions. Ensure code performance and maintainability. Work with DevOps and deployment tools. Required Qualifications: 2+ years in backend development with Python. Experience with Flask/Django and REST APIs. Familiarity with SQL/NoSQL databases.

Posted 2 weeks ago

Apply

2.0 - 4.0 years

5 - 9 Lacs

Bengaluru

Remote

We're hiring a Python Backend Developer to build reliable APIs and services. Key Responsibilities: Design and build scalable backend services. Create APIs and handle database interactions. Ensure code performance and maintainability. Work with DevOps and deployment tools. Required Qualifications: 2+ years in backend development with Python. Experience with Flask/Django and REST APIs. Familiarity with SQL/NoSQL databases.

Posted 2 weeks ago

Apply

2.0 - 4.0 years

7 - 11 Lacs

Bengaluru

Remote

About Us: Soul AI, built by IIT/IIM founders, is a fast-growing AI company creating meaningful tech solutions. With teams in SF and Hyderabad, we blend engineering excellence with real-world applications. We are looking for a Python Backend Software Engineer to help build scalable backend systems. Key Responsibilities: Develop robust backend services and APIs. Design scalable architecture and handle data modeling. Write clean, maintainable, and testable Python code. Collaborate with frontend, DevOps, and data teams. Required Qualifications: 2+ years in backend development using Python. Experience with Django, Flask, or FastAPI. Strong understanding of REST APIs and SQL/NoSQL databases.
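
As a rough sketch of the stack named here (FastAPI being one of the frameworks listed), the snippet below exposes two REST endpoints over an in-memory store. The model fields and routes are illustrative assumptions, not part of the listing.

from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()

class User(BaseModel):
    # Illustrative fields; a real service would persist these in SQL/NoSQL storage.
    id: int
    name: str

users: dict[int, User] = {}  # in-memory stand-in for a database

@app.post("/users", status_code=201)
def create_user(user: User) -> User:
    users[user.id] = user
    return user

@app.get("/users/{user_id}")
def read_user(user_id: int) -> User:
    return users[user_id]

# Run locally with: uvicorn main:app --reload  (assuming the file is named main.py)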

Posted 2 weeks ago

Apply

2.0 - 4.0 years

7 - 11 Lacs

Mumbai

Remote

About Us: Soul AI, built by IIT/IIM founders, is a fast-growing AI company creating meaningful tech solutions. With teams in SF and Hyderabad, we blend engineering excellence with real-world applications. We are looking for a Python Backend Software Engineer to help build scalable backend systems. Key Responsibilities: Develop robust backend services and APIs. Design scalable architecture and handle data modeling. Write clean, maintainable, and testable Python code. Collaborate with frontend, DevOps, and data teams. Required Qualifications: 2+ years in backend development using Python. Experience with Django, Flask, or FastAPI. Strong understanding of REST APIs and SQL/NoSQL databases.

Posted 2 weeks ago

Apply

2.0 - 4.0 years

7 - 11 Lacs

Kolkata

Remote

About Us: Soul AI, built by IIT/IIM founders, is a fast-growing AI company creating meaningful tech solutions. With teams in SF and Hyderabad, we blend engineering excellence with real-world applications. We are looking for a Python Backend Software Engineer to help build scalable backend systems. Key Responsibilities: Develop robust backend services and APIs. Design scalable architecture and handle data modeling. Write clean, maintainable, and testable Python code. Collaborate with frontend, DevOps, and data teams. Required Qualifications: 2+ years in backend development using Python. Experience with Django, Flask, or FastAPI. Strong understanding of REST APIs and SQL/NoSQL databases.

Posted 2 weeks ago

Apply

2.0 - 4.0 years

7 - 11 Lacs

Hyderabad

Remote

About Us: Soul AI, built by IIT/IIM founders, is a fast-growing AI company creating meaningful tech solutions. With teams in SF and Hyderabad, we blend engineering excellence with real-world applications. We are looking for a Python Backend Software Engineer to help build scalable backend systems. Key Responsibilities: Develop robust backend services and APIs. Design scalable architecture and handle data modeling. Write clean, maintainable, and testable Python code. Collaborate with frontend, DevOps, and data teams. Required Qualifications: 2+ years in backend development using Python. Experience with Django, Flask, or FastAPI. Strong understanding of REST APIs and SQL/NoSQL databases.

Posted 2 weeks ago

Apply

2.0 - 4.0 years

7 - 11 Lacs

Hyderabad

Remote

We are hiring a Python-based Data Engineer to develop ETL processes and data pipelines. Key Responsibilities: Build and optimize ETL/ELT data pipelines. Integrate APIs and large-scale data ingestion systems. Automate data workflows using Python and cloud tools. Collaborate with data science and analytics teams. Required Qualifications: 2+ years in data engineering using Python. Familiar with tools like Airflow, Pandas, and SQL. Experience with cloud data services (AWS/GCP/Azure).
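
For a sense of the pipeline work described above, here is a minimal extract-transform-load sketch in pandas. The source file, columns, and target table are hypothetical, and SQLAlchemy is assumed to be available.

import pandas as pd
from sqlalchemy import create_engine

# Extract: hypothetical raw file dropped by an upstream system.
raw = pd.read_csv("raw_orders.csv", parse_dates=["order_date"])

# Transform: basic cleansing plus a derived reporting column.
clean = (
    raw.dropna(subset=["order_id", "amount"])
       .assign(order_month=lambda df: df["order_date"].dt.to_period("M").astype(str))
)

# Load: write to a warehouse table (SQLite used here purely for illustration).
engine = create_engine("sqlite:///warehouse.db")
clean.to_sql("orders_clean", engine, if_exists="replace", index=False)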

Posted 2 weeks ago

Apply

2.0 - 4.0 years

7 - 11 Lacs

Mumbai

Remote

We are hiring a Python-based Data Engineer to develop ETL processes and data pipelines. Key Responsibilities: Build and optimize ETL/ELT data pipelines. Integrate APIs and large-scale data ingestion systems. Automate data workflows using Python and cloud tools. Collaborate with data science and analytics teams. Required Qualifications: 2+ years in data engineering using Python. Familiar with tools like Airflow, Pandas, and SQL. Experience with cloud data services (AWS/GCP/Azure).

Posted 2 weeks ago

Apply

2.0 - 4.0 years

7 - 11 Lacs

Kolkata

Remote

We are hiring a Python-based Data Engineer to develop ETL processes and data pipelines. Key Responsibilities: Build and optimize ETL/ELT data pipelines. Integrate APIs and large-scale data ingestion systems. Automate data workflows using Python and cloud tools. Collaborate with data science and analytics teams. Required Qualifications: 2+ years in data engineering using Python. Familiar with tools like Airflow, Pandas, and SQL. Experience with cloud data services (AWS/GCP/Azure).

Posted 2 weeks ago

Apply

2.0 - 4.0 years

7 - 11 Lacs

Bengaluru

Remote

We are hiring a Python-based Data Engineer to develop ETL processes and data pipelines. Key Responsibilities: Build and optimize ETL/ELT data pipelines. Integrate APIs and large-scale data ingestion systems. Automate data workflows using Python and cloud tools. Collaborate with data science and analytics teams. Required Qualifications: 2+ years in data engineering using Python. Familiar with tools like Airflow, Pandas, and SQL. Experience with cloud data services (AWS/GCP/Azure).

Posted 2 weeks ago

Apply

2.0 - 6.0 years

30 - 35 Lacs

Bengaluru

Work from Office

Function: Software Engineering, Backend Development. Responsibilities: You will work on building the biggest neo-banking app in India. You will own the design process and the implementation of standard software engineering methodologies while improving performance, scalability and maintainability. You will translate functional and technical requirements into detailed design and architecture. You will collaborate with UX designers and product owners on detailed product requirements. You will be part of a fast-growing engineering group. You will be responsible for mentoring other engineers, defining our tech culture and helping build a fast-growing team. Requirements: 2-6 years of experience in product development, design and architecture. Hands-on expertise in at least one of the following programming languages: Java, Python, NodeJS or Go. Hands-on expertise in SQL and NoSQL databases. Expertise in problem solving, data structures and algorithms. Deep understanding of and experience in object-oriented design. Ability to design and architect horizontally scalable software systems. Drive to constantly learn and improve yourself and the processes around you. Mentoring, collaborating and knowledge sharing with other engineers in the team. Self-starter. Strive to write the optimal code possible day in, day out. What you will get:

Posted 2 weeks ago

Apply

3.0 - 5.0 years

1 - 5 Lacs

Mumbai

Work from Office

Grade: M2/M3. Role: HO Product Team - TPP. Ensure timely and accurate reporting of all business and product MIS. Collect information from different stakeholders, prepare MIS on transactions and perform reconciliation activities. Design dashboards depicting business insights for management. Analyze and highlight trends, anomalies and issues observed in different processes. Implement systems that promote automation and reduce manual activities for both reporting and operational work. Handle ad-hoc MIS requests from the Business / CS team. Prepare analytical presentations and slide shows for reviews. Suggest process improvements and system enhancements. Incorporate periodic process changes into the SOP. Coordinate with various teams to streamline reports. Adhere to information security and quality process norms. Stay aware of and comply with any updates to the process. Streamline and ensure uniformity of logic across various products and processes. Act on feedback from the Team Leader/Team Coach or Quality, and on coaching provided to the team, as guidelines for improving performance. Job Requirements: Graduate with decent English communication. Proficiency in MS Excel, including advanced formulas and functions. Experience in reporting, MIS and data analysis: handling big data, using pivot tables and crunching numbers. Proficiency in other MS Office applications (Word and PowerPoint). Knowledge of Power BI, Power Query, Python, VBA, SQL or other analytics tools and languages will be an added advantage. Education and experience required for this role: Graduate with 3-5 years of work experience; a management degree will be an added advantage. The candidate should have a pleasing personality and be presentable, should be dedicated and display integrity, and should be willing to learn with an attitude of continuous improvement.
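
As an example of the pivot-table style MIS reporting this role mentions, a short pandas script along the following lines could automate a monthly summary. The file and column names are illustrative assumptions, and openpyxl is assumed for the Excel export.

import pandas as pd

# Hypothetical transaction extract; column names are illustrative only.
transactions = pd.read_csv("transactions.csv", parse_dates=["txn_date"])
transactions["txn_month"] = transactions["txn_date"].dt.to_period("M").astype(str)

# Monthly MIS view: transaction count and value by product and status.
mis_report = pd.pivot_table(
    transactions,
    index=["product", "status"],
    columns="txn_month",
    values="amount",
    aggfunc=["count", "sum"],
    fill_value=0,
)

# Export for circulation to stakeholders (requires openpyxl).
mis_report.to_excel("monthly_mis.xlsx")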

Posted 2 weeks ago

Apply

5.0 - 10.0 years

5 - 11 Lacs

Pune

Work from Office

Job Title: Java Developer. Experience Required: 3+ Years. Location: Pune. Employment Type: Full Time, Permanent. Job Summary: We are looking for a passionate and skilled Java Developer with 3 to 5 years of hands-on experience to join our high-performance engineering team. The ideal candidate should have strong fundamentals in Core Java, especially multithreading, collections, and performance optimization. This role involves working with modern build tools and writing Java-based applications over Big Data technologies like Apache Spark and Apache Flink. Key Responsibilities: Design, develop, and maintain high-performance, scalable Java applications. Write clean, maintainable, and efficient Java code. Implement multithreading and concurrent programming in back-end systems. Use Java APIs and the Collections Framework effectively. Manage builds and dependencies using Maven and/or Gradle. Debug and resolve complex issues using tools like jstack, jcmd, etc. Perform performance tuning and JVM optimization. Collaborate with cross-functional teams for feature delivery. Write unit tests and follow TDD practices. Use Git for version control and code integration. Desired Skills: Strong proficiency in Core Java and Advanced Java. Solid understanding of multithreading, concurrency, and the Java Memory Model. Expertise in Spring Boot, REST APIs, and microservices architecture. Hands-on with SQL and NoSQL databases and caching tools like Redis and Aerospike. Experience with Maven and Gradle for build management. Skilled in debugging with JVM tools (heap/thread dumps, jcmd, jstack). Exposure to JVM internals, garbage collection, and performance profiling. Bonus: experience with Go, Python, or React; familiarity with cloud platforms like AWS, GCP, or Azure; working knowledge of Docker, Kubernetes, and CI/CD pipelines. Good understanding of OOP principles, design patterns, and clean code practices. Comfortable working in Agile/Scrum teams. Preferred Skills (Value Add): Experience with logging frameworks (Log4j, SLF4J). Knowledge of monitoring tools (Prometheus, Grafana, New Relic). Exposure to Linux environments and basic shell scripting. Hands-on experience with microservices and RESTful APIs. How to Apply: Interested candidates can share their updated resume to anurag.yadav@softenger.com or WhatsApp: +91 73855 56898. Please include the following in your message/email: Total Experience, Relevant Experience, Current CTC, Expected CTC, Notice Period, Current Location, Willingness to Relocate to [e.g., Pune].

Posted 2 weeks ago

Apply

2.0 - 7.0 years

3 - 8 Lacs

Pune

Work from Office

Job Title: Java Developer. Experience: 2 to 7 Years. Location: Pune. Employment Type: Full Time. Job Summary: We are looking for a passionate and skilled Java Developer with 2-7 years of experience to join our high-performance engineering team. The ideal candidate should have strong command over Core Java, particularly multithreading, collections, performance optimization, and JVM tuning. You will also develop applications that work with Big Data frameworks like Spark and Flink. Key Responsibilities: Design, develop, and maintain high-performance, scalable Java applications. Write clean, maintainable, and efficient Java code. Build scalable backends using multithreading and concurrency. Use the Collections Framework and Java APIs efficiently. Manage project builds using Maven or Gradle. Debug production issues using jstack, jcmd, and JVM analysis tools. Identify and fix performance bottlenecks. Work closely with cross-functional teams to deliver features. Follow best practices including unit testing and TDD. Use Git for version control, commit, and merge tracking. Desired Skills: Proficient in Core and Advanced Java. Strong in multithreading, concurrency, and the Java Memory Model. Experience with Spring Boot, REST APIs, microservices. Hands-on with SQL/NoSQL databases and caching (Redis/Aerospike). Solid grasp of collections and Java internals. Proficient with Maven/Gradle build tools. Debugging experience using JVM tools (heap/thread dumps, GC tuning). Experience with Big Data tech like Spark/Flink. Familiarity with cloud platforms (AWS/GCP/Azure). Exposure to Docker, Kubernetes, CI/CD pipelines. Bonus: experience with Go, Python, React. Strong understanding of OOP, design patterns, and clean code principles. Agile/Scrum team experience. Preferred / Value-Add Skills: Logging frameworks (Log4j, SLF4J). Monitoring/observability tools (Prometheus, Grafana, New Relic). REST API design and microservices architecture. Linux environment and shell scripting. Application Instructions: Interested candidates are requested to share their updated resume, along with the details below, to anurag.yadav@softenger.com or WhatsApp: +91 73855 56898. Please include the following in your message/email: Total Experience, Relevant Experience, Current CTC, Expected CTC, Notice Period, Current Location, Willingness to Relocate to Pune.

Posted 2 weeks ago

Apply

5.0 - 7.0 years

14 - 17 Lacs

Pune

Work from Office

A career in IBM Consulting is rooted in long-term relationships and close collaboration with clients across the globe. You'll work with visionaries across multiple industries to improve the hybrid cloud and AI journey for the most innovative and valuable companies in the world. Your ability to accelerate impact and make meaningful change for your clients is enabled by our strategic partner ecosystem and our robust technology platforms across the IBM portfolio, including Software and Red Hat. In your role, you will be responsible for: Skilled in multiple GCP services - GCS, BigQuery, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflow, Composer, Error Reporting, Log Explorer, etc. Must have Python and SQL work experience; proactive, collaborative, and able to respond to critical situations. Ability to analyse data for functional business requirements and be customer-facing. Required education: Bachelor's Degree. Preferred education: Master's Degree. Required technical and professional expertise: 5 to 7 years of relevant experience working as a technical analyst with BigQuery on the GCP platform. Skilled in multiple GCP services - GCS, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflow, Composer, Error Reporting, Log Explorer. An ambitious individual who can work under their own direction towards agreed targets/goals and with a creative approach to work. You love collaborative environments that use agile methodologies to encourage creative design thinking and find innovative ways to develop with cutting-edge technologies. End-to-end functional knowledge of the data pipeline/transformation implementations the candidate has delivered; should understand the purpose/KPIs for which each data transformation was done. Preferred technical and professional experience: Experience with AEM core technologies (OSGi services, Apache Sling, Granite framework, Java Content Repository API, Java 8+, localization). Familiarity with build tools such as Jenkins and Maven. Knowledge of version control tools, especially Git. Knowledge of patterns and good practices to design and develop quality, clean code. Knowledge of HTML, CSS, JavaScript and jQuery. Familiarity with task management, bug tracking, and collaboration tools like JIRA and Confluence.
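
To give a flavour of the BigQuery work listed above, a query from Python might look like the following sketch. The project, dataset, and table names are hypothetical, and the google-cloud-bigquery package plus application default credentials are assumed.

from google.cloud import bigquery

client = bigquery.Client()  # picks up project and credentials from the environment

# Hypothetical aggregation over a BigQuery table.
sql = """
    SELECT event_date, COUNT(*) AS events
    FROM `my-project.analytics.events`
    GROUP BY event_date
    ORDER BY event_date
"""

for row in client.query(sql).result():
    print(row.event_date, row.events)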

Posted 2 weeks ago

Apply

5.0 - 10.0 years

14 - 17 Lacs

Pune

Work from Office

As a Big Data Engineer, you will develop, maintain, evaluate, and test big data solutions. You will be involved in data engineering activities like creating pipelines/workflows from source to target and implementing solutions that tackle the client's needs. Your primary responsibilities include: Design, build, optimize and support new and existing data models and ETL processes based on our clients' business requirements. Build, deploy and manage data infrastructure that can adequately handle the needs of a rapidly growing, data-driven organization. Coordinate data access and security to enable data scientists and analysts to easily access data whenever they need to. Required education: Bachelor's Degree. Preferred education: Master's Degree. Required technical and professional expertise: Must have 5+ years of experience in Big Data - Hadoop, Spark, Scala, Python, HBase, Hive. Good to have: AWS - S3, Athena, DynamoDB, Lambda - Jenkins, Git. Experience developing Python and PySpark programs for data analysis. Good working experience with Python to develop custom frameworks for generating rules (like a rules engine). Experience developing Python code to gather data from HBase and designing solutions implemented using PySpark. Experience using Apache Spark DataFrames/RDDs to apply business transformations and Hive context objects to perform read/write operations. Preferred technical and professional experience: Understanding of DevOps. Experience in building scalable end-to-end data ingestion and processing solutions. Experience with object-oriented and/or functional programming languages, such as Python, Java and Scala.
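
As a small illustration of the Spark/Hive work this role describes, the PySpark sketch below reads a Hive table, applies a business transformation, and writes the result back. The database and table names are hypothetical.

from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# Hive support lets spark.read.table and saveAsTable see warehouse tables.
spark = (
    SparkSession.builder.appName("customer-spend")
    .enableHiveSupport()
    .getOrCreate()
)

orders = spark.read.table("sales.orders")  # hypothetical Hive table

# Business transformation expressed with DataFrame operations.
summary = (
    orders.filter(F.col("status") == "COMPLETED")
          .groupBy("customer_id")
          .agg(F.sum("amount").alias("total_spend"),
               F.count("*").alias("order_count"))
)

# Persist the result for downstream consumers.
summary.write.mode("overwrite").saveAsTable("sales.customer_spend")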

Posted 2 weeks ago

Apply

3.0 - 7.0 years

14 - 18 Lacs

Pune

Work from Office

As an Associate Data Scientist at IBM, you will work to solve business problems using leading-edge and open-source tools such as Python, R, and TensorFlow, combined with IBM tools and our AI application suites. You will prepare, analyze, and understand data to deliver insight, predict emerging trends, and provide recommendations to stakeholders. In your role, you may be responsible for: Implementing and validating predictive and prescriptive models, and creating and maintaining statistical models with a focus on big data, incorporating machine learning techniques in your projects. Writing programs to cleanse and integrate data in an efficient and reusable manner. Working in an Agile, collaborative environment, partnering with other scientists, engineers, consultants and database administrators of all backgrounds and disciplines to bring analytical rigor and statistical methods to the challenges of predicting behaviors. Communicating with internal and external clients to understand and define business needs and appropriate modelling techniques to provide analytical solutions. Evaluating modelling results and communicating the results to technical and non-technical audiences. Required education: Bachelor's Degree. Preferred education: Master's Degree. Required technical and professional expertise: Proof of Concept (POC) development - develop POCs to validate and showcase the feasibility and effectiveness of the proposed AI solutions. Collaborate with development teams to implement and iterate on POCs, ensuring alignment with customer requirements and expectations. Help showcase the ability of a Gen AI code assistant to refactor/rewrite and document code from one language to another, particularly COBOL to Java, through rapid prototypes/POCs. Document solution architectures, design decisions, implementation details, and lessons learned. Create technical documentation, white papers, and best practice guides. Preferred technical and professional experience: Strong programming skills, with proficiency in Python and experience with AI frameworks such as TensorFlow, PyTorch, Keras or Hugging Face. Understanding of libraries such as scikit-learn, Pandas, Matplotlib, etc. Familiarity with cloud platforms. Experience and working knowledge in COBOL and Java would be preferred. Experience in Python and PySpark will be an added advantage.
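
By way of example, a predictive model of the kind described above could start from a short scikit-learn script like the sketch below. The dataset, feature names, and target column are hypothetical.

import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import classification_report
from sklearn.model_selection import train_test_split

# Hypothetical behavioural dataset; columns are illustrative only.
df = pd.read_csv("customer_events.csv")
X = df[["tenure_months", "monthly_spend", "support_tickets"]]
y = df["churned"]

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

model = RandomForestClassifier(n_estimators=200, random_state=42)
model.fit(X_train, y_train)

# Evaluation summary suitable for communicating results to stakeholders.
print(classification_report(y_test, model.predict(X_test)))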

Posted 2 weeks ago

Apply

6.0 - 7.0 years

14 - 18 Lacs

Bengaluru

Work from Office

As an Associate Software Developer at IBM, you will harness the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and both batch and real-time processing. Collaborating closely with diverse teams, you'll play an important role in deciding the most suitable data management systems and identifying the crucial data required for insightful analysis. As a Data Engineer, you'll tackle obstacles related to database integration and untangle complex, unstructured data sets. In this role, your responsibilities may include: Implementing and validating predictive models as well as creating and maintaining statistical models with a focus on big data, incorporating a variety of statistical and machine learning techniques. Designing and implementing various enterprise search applications such as Elasticsearch and Splunk for client requirements. Working in an Agile, collaborative environment, partnering with other scientists, engineers, consultants and database administrators of all backgrounds and disciplines to bring analytical rigor and statistical methods to the challenges of predicting behaviours. Building teams or writing programs to cleanse and integrate data in an efficient and reusable manner, developing predictive or prescriptive models, and evaluating modelling results. Required education: Bachelor's Degree. Preferred education: Master's Degree. Required technical and professional expertise: Total experience 6-7 years (relevant 4-5 years). Mandatory skills: Azure Databricks, Python/PySpark, SQL, GitHub, Azure DevOps, Azure Blob. Ability to use programming languages like Java, Python, Scala, etc., to build pipelines to extract and transform data from a repository to a data consumer. Ability to use Extract, Transform, and Load (ETL) tools and/or data integration or federation tools to prepare and transform data as needed. Ability to use leading-edge tools such as Linux, SQL, Python, Spark, Hadoop and Java. Preferred technical and professional experience: You thrive on teamwork and have excellent verbal and written communication skills. Ability to communicate with internal and external clients to understand and define business needs, providing analytical solutions. Ability to communicate results to technical and non-technical audiences.
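
Since the listing mentions enterprise search applications such as Elasticsearch, here is a minimal indexing-and-query sketch using the official Python client. The cluster URL, index name, and document contents are illustrative assumptions.

from elasticsearch import Elasticsearch

# Hypothetical local cluster; in practice the URL and credentials come from configuration.
es = Elasticsearch("http://localhost:9200")

doc = {
    "title": "Q3 revenue summary",
    "team": "finance",
    "body": "Quarterly revenue grew, driven by subscription renewals.",
}

# Index one document, make it searchable, then run a simple full-text query.
es.index(index="reports", id="q3-report", document=doc)
es.indices.refresh(index="reports")

results = es.search(index="reports", query={"match": {"body": "revenue"}})
for hit in results["hits"]["hits"]:
    print(hit["_id"], hit["_score"])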

Posted 2 weeks ago

Apply

2.0 - 5.0 years

4 - 8 Lacs

Pune

Work from Office

The ability to be a team player. The ability and skill to train other people in procedural and technical topics. Strong communication and collaboration skills. Required education: Bachelor's Degree. Preferred education: Master's Degree. Required technical and professional expertise: Able to write complex SQL queries; experience with Azure Databricks. Preferred technical and professional experience: Excellent communication and stakeholder management skills.

Posted 2 weeks ago

Apply

3.0 - 7.0 years

14 - 18 Lacs

Hyderabad

Work from Office

As a Data Engineer at IBM, you will harness the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and both batch and real-time processing. Collaborating closely with diverse teams, you'll play an important role in deciding the most suitable data management systems and identifying the crucial data required for insightful analysis. As a Data Engineer, you'll tackle obstacles related to database integration and untangle complex, unstructured data sets. In this role, your responsibilities may include: Implementing and validating predictive models as well as creating and maintaining statistical models with a focus on big data, incorporating a variety of statistical and machine learning techniques. Designing and implementing various enterprise search applications such as Elasticsearch and Splunk for client requirements. Working in an Agile, collaborative environment, partnering with other scientists, engineers, consultants and database administrators of all backgrounds and disciplines to bring analytical rigor and statistical methods to the challenges of predicting behaviours. Building teams or writing programs to cleanse and integrate data in an efficient and reusable manner, developing predictive or prescriptive models, and evaluating modelling results. Required education: Bachelor's Degree. Preferred education: Master's Degree. Required technical and professional expertise: Proof of Concept (POC) development - develop POCs to validate and showcase the feasibility and effectiveness of the proposed AI solutions. Help showcase the ability of a Gen AI code assistant to refactor/rewrite and document code from one language to another. Document solution architectures, design decisions, implementation details, and lessons learned. Stay up to date with the latest trends and advancements in AI, foundation models, and large language models. Evaluate emerging technologies, tools, and frameworks to assess their potential impact on solution design and implementation. Preferred technical and professional experience: Experience and working knowledge in COBOL and Java would be preferred. Experience in code generation, code matching and code translation leveraging LLM capabilities would be a big plus. Demonstrate a growth mindset to understand clients' business processes and challenges.
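
The listing does not name a specific LLM stack; as a rough sketch of the code-translation POC idea it describes, the snippet below calls a generic chat-completion endpoint. The OpenAI Python client is used purely as a stand-in, and the model name and prompt are illustrative assumptions.

from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

cobol_snippet = """
IDENTIFICATION DIVISION.
PROGRAM-ID. HELLO.
PROCEDURE DIVISION.
    DISPLAY 'HELLO, WORLD'.
    STOP RUN.
"""

# Ask the model to translate and document the COBOL routine in Java.
response = client.chat.completions.create(
    model="gpt-4o-mini",  # illustrative model name
    messages=[
        {"role": "system", "content": "You translate COBOL programs into well-documented Java."},
        {"role": "user", "content": f"Translate this COBOL program to Java:\n{cobol_snippet}"},
    ],
)

print(response.choices[0].message.content)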

Posted 2 weeks ago

Apply

5.0 - 7.0 years

14 - 18 Lacs

Mumbai

Work from Office

As a Data Engineer, you will develop, maintain, evaluate and test big data solutions. You will be involved in the development of data solutions using the Spark framework with Python or Scala on Hadoop and the Azure Cloud Data Platform. Responsibilities: Experienced in building data pipelines to ingest, process, and transform data from files, streams and databases. Process the data with Spark, Python, PySpark and Hive, HBase or other NoSQL databases on the Azure Cloud Data Platform or HDFS. Experienced in developing efficient software code for multiple use cases leveraging the Spark framework with Python or Scala and Big Data technologies for various use cases built on the platform. Experience in developing streaming pipelines. Experience working with Hadoop / Azure ecosystem components to implement scalable solutions that meet ever-increasing data volumes, using big data/cloud technologies such as Apache Spark, Kafka and cloud computing services. Required education: Bachelor's Degree. Preferred education: Master's Degree. Required technical and professional expertise: Total 5-7+ years of experience in data management (DW, DL, data platform, lakehouse) and data engineering skills. Minimum 4+ years of experience in Big Data technologies with extensive data engineering experience in Spark with Python or Scala. Minimum 3 years of experience on cloud data platforms on Azure. Experience in Databricks / Azure HDInsight / Azure Data Factory, Synapse, SQL Server DB. Exposure to streaming solutions and message brokers like Kafka. Experience with Unix / Linux commands and basic work experience in shell scripting. Preferred technical and professional experience: Certification in Azure and Databricks, or Cloudera Spark certified developers.
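
To illustrate the streaming pipelines mentioned above, the Structured Streaming sketch below reads from a Kafka topic and writes to the console. The broker address and topic are hypothetical, and the spark-sql-kafka connector package is assumed to be on the classpath.

from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("events-stream").getOrCreate()

# Hypothetical Kafka source; broker and topic names are illustrative.
events = (
    spark.readStream.format("kafka")
         .option("kafka.bootstrap.servers", "broker:9092")
         .option("subscribe", "events")
         .load()
)

# Kafka values arrive as bytes; cast to string before any downstream parsing.
parsed = events.select(F.col("value").cast("string").alias("payload"))

# Console sink is used purely for illustration; production jobs would write to storage or Kafka.
query = parsed.writeStream.outputMode("append").format("console").start()
query.awaitTermination()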

Posted 2 weeks ago

Apply

3.0 - 7.0 years

12 - 16 Lacs

Kolkata

Work from Office

As a Data Engineer at IBM, you will harness the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and both batch and real-time processing. Collaborating closely with diverse teams, you'll play an important role in deciding the most suitable data management systems and identifying the crucial data required for insightful analysis. As a Data Engineer, you'll tackle obstacles related to database integration and untangle complex, unstructured data sets. In this role, your responsibilities may include: Implementing and validating predictive models as well as creating and maintaining statistical models with a focus on big data, incorporating a variety of statistical and machine learning techniques. Designing and implementing various enterprise search applications such as Elasticsearch and Splunk for client requirements. Working in an Agile, collaborative environment, partnering with other scientists, engineers, consultants and database administrators of all backgrounds and disciplines to bring analytical rigor and statistical methods to the challenges of predicting behaviours. Building teams or writing programs to cleanse and integrate data in an efficient and reusable manner, developing predictive or prescriptive models, and evaluating modelling results. Required education: Bachelor's Degree. Preferred education: Master's Degree. Required technical and professional expertise: Proof of Concept (POC) development - develop POCs to validate and showcase the feasibility and effectiveness of the proposed AI solutions. Help showcase the ability of a Gen AI code assistant to refactor/rewrite and document code from one language to another. Document solution architectures, design decisions, implementation details, and lessons learned. Stay up to date with the latest trends and advancements in AI, foundation models, and large language models. Evaluate emerging technologies, tools, and frameworks to assess their potential impact on solution design and implementation. Preferred technical and professional experience: Experience and working knowledge in COBOL and Java would be preferred. Experience in code generation, code matching and code translation leveraging LLM capabilities would be a big plus. Demonstrate a growth mindset to understand clients' business processes and challenges.

Posted 2 weeks ago

Apply