5 - 10 years
14 - 17 Lacs
Kochi
Work from Office
As a Big Data Engineer, you will develop, maintain, evaluate, and test big data solutions. You will be involved in data engineering activities such as creating pipelines/workflows from source to target and implementing solutions that address clients' needs. Your primary responsibilities include: designing, building, optimizing, and supporting new and existing data models and ETL processes based on our clients' business requirements; building, deploying, and managing data infrastructure that can adequately handle the needs of a rapidly growing data-driven organization; and coordinating data access and security to enable data scientists and analysts to easily access data whenever they need to.
Required education: Bachelor's Degree. Preferred education: Master's Degree.
Required technical and professional expertise: Must have 5+ years of experience in Big Data - Hadoop, Spark, Scala, Python, HBase, Hive. Good to have: AWS (S3, Athena, DynamoDB, Lambda), Jenkins, Git. Developed Python and PySpark programs for data analysis. Good working experience using Python to develop a custom framework for generating rules (similar to a rules engine). Developed Python code to gather data from HBase and designed the solution for implementation using PySpark. Used Apache Spark DataFrames/RDDs to apply business transformations and Hive context objects to perform read/write operations.
Preferred technical and professional experience: Understanding of DevOps. Experience building scalable end-to-end data ingestion and processing solutions. Experience with object-oriented and/or functional programming languages such as Python, Java, and Scala.
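For illustration only, a minimal sketch of the PySpark-with-Hive pattern this posting describes (read a Hive table, apply a business transformation, write the result back). It uses the modern SparkSession API rather than the older HiveContext, and the database, table, and column names (sales_db.orders, order_amount, orders_enriched) are hypothetical placeholders, not taken from the posting.

# Minimal PySpark sketch: read a Hive table, transform it, write it back.
# Table and column names are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = (
    SparkSession.builder
    .appName("hive-read-transform-write")
    .enableHiveSupport()           # exposes Hive tables to Spark SQL
    .getOrCreate()
)

# Read from an existing Hive table
orders = spark.table("sales_db.orders")

# Apply a simple business transformation: flag high-value orders
enriched = orders.withColumn("is_high_value", F.col("order_amount") > 10000)

# Write the result back to Hive as a managed table
enriched.write.mode("overwrite").saveAsTable("sales_db.orders_enriched")

spark.stop()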
Posted 2 months ago
2 - 5 years
14 - 17 Lacs
Pune
Work from Office
Design, build, optimize, and support new and existing data models and ETL processes based on our clients' business requirements. Build, deploy, and manage data infrastructure that can adequately handle the needs of a rapidly growing data-driven organization. Coordinate data access and security to enable data scientists and analysts to easily access data whenever they need to.
Required education: Bachelor's Degree. Preferred education: Master's Degree.
Required technical and professional expertise: Must have 3-5 years of experience in Big Data - Hadoop, Spark, Scala, Python, HBase, Hive. Good to have: AWS (S3, Athena, DynamoDB, Lambda), Jenkins, Git. Developed Python and PySpark programs for data analysis. Good working experience using Python to develop a custom framework for generating rules (similar to a rules engine). Developed Python code to gather data from HBase and designed the solution for implementation using PySpark. Used Apache Spark DataFrames/RDDs to apply business transformations and Hive context objects to perform read/write operations.
Preferred technical and professional experience: Understanding of DevOps. Experience building scalable end-to-end data ingestion and processing solutions. Experience with object-oriented and/or functional programming languages such as Python, Java, and Scala.
Posted 2 months ago
2 - 5 years
14 - 17 Lacs
Bengaluru
Work from Office
As a Big Data Engineer, you will develop, maintain, evaluate, and test big data solutions. You will be involved in data engineering activities such as creating pipelines/workflows from source to target and implementing solutions that address clients' needs. Your primary responsibilities include: designing, building, optimizing, and supporting new and existing data models and ETL processes based on our clients' business requirements; building, deploying, and managing data infrastructure that can adequately handle the needs of a rapidly growing data-driven organization; and coordinating data access and security to enable data scientists and analysts to easily access data whenever they need to.
Required education: Bachelor's Degree. Preferred education: Master's Degree.
Required technical and professional expertise: Must have 3-5 years of experience in Big Data - Hadoop, Spark, Scala, Python, HBase, Hive. Good to have: AWS (S3, Athena, DynamoDB, Lambda), Jenkins, Git. Developed Python and PySpark programs for data analysis. Good working experience using Python to develop a custom framework for generating rules (similar to a rules engine). Developed Python code to gather data from HBase and designed the solution for implementation using PySpark. Used Apache Spark DataFrames/RDDs to apply business transformations and Hive context objects to perform read/write operations.
Preferred technical and professional experience: Understanding of DevOps. Experience building scalable end-to-end data ingestion and processing solutions. Experience with object-oriented and/or functional programming languages such as Python, Java, and Scala.
Posted 2 months ago
2 - 5 years
14 - 17 Lacs
Mysore
Work from Office
As an Application Developer, you will lead IBM into the future by translating system requirements into the design and development of customized systems in an agile environment. The success of IBM is in your hands as you transform vital business needs into code and drive innovation. Your work will power IBM and its clients globally, collaborating and integrating code into enterprise systems. You will have access to the latest education, tools, and technology, and a limitless career path with the world's technology leader. Come to IBM and make a global impact.
Responsibilities: Manage end-to-end feature development and resolve challenges faced in implementing it. Learn new technologies and apply them to feature development within the provided time frame. Manage debugging, root cause analysis, and fixing of issues reported on the Content Management back-end software system.
Required education: Bachelor's Degree. Preferred education: Master's Degree.
Required technical and professional expertise: Overall, more than 6 years of experience, with 4+ years of strong hands-on experience in Python and Spark. Strong technical ability to understand, design, write, and debug applications in Python and PySpark. Strong problem-solving skills. Good to have: hands-on experience with cloud technology (AWS/GCP/Azure).
Preferred technical and professional experience: Good to have: hands-on experience with cloud technology (AWS/GCP/Azure).
Posted 2 months ago
2 - 7 years
4 - 8 Lacs
Chennai
Work from Office
Be instrumental in understanding the requirements and design of the product/software. Develop software solutions by studying information needs, systems flow, data usage, and work processes, and by investigating problem areas, following the software development life cycle. Facilitate root cause analysis of system issues and problem statements. Identify ideas to improve system performance and availability. Analyze client requirements and convert them into feasible designs. Collaborate with functional teams or systems analysts who carry out detailed investigation into software requirements. Confer with project managers to obtain information on software capabilities.
Posted 2 months ago
5 - 7 years
11 - 13 Lacs
Nasik, Pune, Nagpur
Work from Office
Euclid Innovations Pvt Ltd is looking for Data Engineer Drive to join our dynamic team and embark on a rewarding career journey. Liaising with coworkers and clients to elucidate the requirements for each task. Conceptualizing and generating infrastructure that allows big data to be accessed and analyzed. Reformulating existing frameworks to optimize their functioning. Testing such structures to ensure that they are fit for use. Preparing raw data for manipulation by data scientists. Detecting and correcting errors in your work. Ensuring that your work remains backed up and readily accessible to relevant coworkers. Remaining up-to-date with industry standards and technological advancements that will improve the quality of your outputs.
Posted 2 months ago
8 - 13 years
30 - 35 Lacs
Pune
Work from Office
Data Engineer - Job Description: Common skills - SQL, GCP BigQuery, ETL pipelines using Python/Airflow, experience with Spark/Hive/HDFS, data modeling for data conversion. Resources (4). Prior experience working on a conversion/migration HR project is an additional skill needed along with the skills mentioned above. Data Engineer - knows HR; all other requirements for the functional area are provided by the customer. Customer name: Uber
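As a hedged illustration of the "ETL pipelines using Python/Airflow" into BigQuery that this posting lists, here is a minimal Airflow DAG with a Python task that loads a CSV from GCS into a BigQuery table via the google-cloud-bigquery client. It assumes an Airflow 2.x environment with GCP credentials available; the bucket, project, dataset, and table names are placeholders, not details from the posting.

# Minimal Airflow DAG sketch: load a CSV from GCS into BigQuery.
# Bucket/project/dataset/table names are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator
from google.cloud import bigquery


def load_gcs_to_bq():
    client = bigquery.Client()
    job_config = bigquery.LoadJobConfig(
        source_format=bigquery.SourceFormat.CSV,
        skip_leading_rows=1,
        autodetect=True,
        write_disposition="WRITE_TRUNCATE",
    )
    load_job = client.load_table_from_uri(
        "gs://example-bucket/hr/employees.csv",   # placeholder source file
        "example_project.hr_dataset.employees",   # placeholder target table
        job_config=job_config,
    )
    load_job.result()  # wait for the load job to finish


with DAG(
    dag_id="gcs_to_bigquery_daily",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    PythonOperator(task_id="load_employees", python_callable=load_gcs_to_bq)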
Posted 2 months ago
3 - 7 years
3 - 7 Lacs
Karnataka
Work from Office
Description / Detailed JD: RTIM, Pega CDH 8.8 Multi-App, Infinity 24.1, Java, RESTful API, OAuth. 1. Understanding the NBA requirements and the complete CDH architecture. 2. Reviewing the conceptual design, detailed design, and estimations. 3. Reviewing and contributing to deployment activities and practices. 4. Contributing to the overall technical solution and putting it into practice. 5. Contributing to requirement discussions with subject matter experts in relation to Pega CDH. 6. Experience in Pega CDH v8.8 multi-app or 24.1 and the retail banking domain is preferred. 7. Conducting peer code reviews. 8. Excellent communication skills.
Additional details: Named job posting (requires SCSC approval if yes): No. Global Grade: B. Level: to be defined. Remote work possibility: No. Global Role Family: 60236 (P) Software Engineering. Local Role Name: 6362 Software Developer. Local Skills: 5700 Pega. Languages required: English. Role Rarity: to be defined.
Posted 2 months ago
2 - 5 years
3 - 7 Lacs
Karnataka
Work from Office
Experience: 4 to 6 years. Location: any PSL location. Rate: below $14. JD - DBT / AWS Glue / Python / PySpark. Hands-on experience in data engineering, with expertise in DBT, AWS Glue, Python, and PySpark. Strong knowledge of data engineering concepts, data pipelines, ETL/ELT processes, and cloud data environments (AWS). Technology: DBT, AWS Glue, Athena, SQL, Spark, PySpark. Good understanding of Spark internals and how Spark works; good skills in PySpark. Good understanding of DBT - should understand DBT's limitations and when it will end up in model explosion. Good hands-on experience in AWS Glue. AWS expertise - should know the different services, how to configure them, and have infrastructure-as-code experience. Basic understanding of the open data formats Delta, Iceberg, and Hudi. Ability to engage in technical conversations and suggest enhancements to the current architecture and design.
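For a rough idea of the hands-on AWS Glue/PySpark work this role asks for, a minimal Glue job sketch that reads a table from the Glue Data Catalog, filters it with Spark, and writes Parquet to S3. The catalog database, table, and S3 path (raw_db, transactions, example-curated-bucket) are assumed placeholders, not values from the posting.

# Minimal AWS Glue PySpark job sketch.
# Catalog database/table and S3 path are hypothetical placeholders.
import sys

from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext
from pyspark.sql import functions as F

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext.getOrCreate())
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Read a table registered in the Glue Data Catalog
dyf = glue_context.create_dynamic_frame.from_catalog(
    database="raw_db", table_name="transactions"
)

# Switch to a Spark DataFrame for the transformation step
df = dyf.toDF().filter(F.col("amount") > 0)

# Write curated output back to S3 as Parquet
df.write.mode("overwrite").parquet("s3://example-curated-bucket/transactions/")

job.commit()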
Posted 2 months ago
2 - 5 years
3 - 7 Lacs
Maharashtra
Work from Office
Description: Overall 10+ years of experience in Python and shell scripting. Knowledge of distributed systems such as Hadoop and Spark, as well as cloud computing platforms such as Azure and AWS.
Additional details: Named job posting (requires SCSC approval if yes): No. Global Grade: B. Level: to be defined. Remote work possibility: No. Global Role Family: to be defined. Local Role Name: to be defined. Local Skills: Ruby, automation, Python. Languages required: English. Role Rarity: to be defined.
Posted 2 months ago
3 - 7 years
1 - 5 Lacs
Telangana
Work from Office
Location: Chennai and Hyderabad preferred, but the customer is willing to take resources from Hyderabad. Experience: 5 to 8 years (U3); 5-10 years acceptable. Location: Hyderabad / Chennai. Proven experience as a development data engineer or in a similar role, with an ETL background. Experience with data integration / ETL best practices and data quality principles. Play a crucial role in ensuring the quality and reliability of the data by designing, implementing, and executing comprehensive testing. Going over the user stories, build the comprehensive code base and business rules for testing and validation of the data. Knowledge of continuous integration and continuous deployment (CI/CD) pipelines. Familiarity with Agile/Scrum development methodologies. Excellent analytical and problem-solving skills. Strong communication and collaboration skills. Experience with big data technologies (Hadoop, Spark, Hive).
Posted 2 months ago
2 - 6 years
5 - 9 Lacs
Uttar Pradesh
Work from Office
Proven experience as a development data engineer or in a similar role, with an ETL background. Experience with data integration / ETL best practices and data quality principles. Play a crucial role in ensuring the quality and reliability of the data by designing, implementing, and executing comprehensive testing. Going over the user stories, build the comprehensive code base and business rules for testing and validation of the data. Knowledge of continuous integration and continuous deployment (CI/CD) pipelines. Familiarity with Agile/Scrum development methodologies. Excellent analytical and problem-solving skills. Strong communication and collaboration skills. Experience with big data technologies (Hadoop, Spark, Hive).
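To make the testing-and-validation responsibility concrete, a small hedged sketch of rule-based data quality checks in PySpark: a null check on a key, a duplicate-key check, and a source-versus-target row count comparison. The table and column names (staging.customers, curated.customers, customer_id) are illustrative assumptions only.

# Minimal PySpark data-validation sketch: null, duplicate, and row-count checks.
# Table and column names are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("dq-checks").enableHiveSupport().getOrCreate()

source = spark.table("staging.customers")
target = spark.table("curated.customers")

# Rule 1: primary key must never be null
null_keys = target.filter(F.col("customer_id").isNull()).count()
assert null_keys == 0, f"{null_keys} rows have a null customer_id"

# Rule 2: primary key must be unique
dup_keys = (
    target.groupBy("customer_id").count().filter(F.col("count") > 1).count()
)
assert dup_keys == 0, f"{dup_keys} duplicate customer_id values found"

# Rule 3: no rows lost between source and target
assert source.count() == target.count(), "row counts differ between source and target"

print("All data quality checks passed")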
Posted 2 months ago
5 - 7 years
4 - 8 Lacs
Bengaluru
Work from Office
Job Title: Spark Python Scala Developer
Responsibilities: A day in the life of an Infoscion - As part of the Infosys consulting team, your primary role would be to get to the heart of customer issues, diagnose problem areas, design innovative solutions, and facilitate deployment resulting in client delight. You will develop a proposal by owning parts of the proposal document and by giving inputs on solution design based on your areas of expertise. You will plan the activities of configuration, configure the product as per the design, conduct conference room pilots, and assist in resolving any queries related to requirements and solution design. You will conduct solution/product demonstrations and POC/Proof of Technology workshops, and prepare effort estimates that suit the customer's budgetary requirements and are in line with the organization's financial guidelines. You will actively lead small projects and contribute to unit-level and organizational initiatives with the objective of providing high-quality, value-adding solutions to customers. If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!
Technical and Professional Requirements: Primary skills: Technology -> Big Data - Data Processing -> Spark.
Preferred Skills: Technology -> Big Data - Data Processing -> Spark.
Additional Responsibilities: Ability to develop value-creating strategies and models that enable clients to innovate, drive growth, and increase their business profitability. Good knowledge of software configuration management systems. Awareness of the latest technologies and industry trends. Logical thinking and problem-solving skills along with an ability to collaborate. Understanding of the financial processes for various types of projects and the various pricing models available. Ability to assess current processes, identify improvement areas, and suggest technology solutions. Knowledge of one or two industry domains. Client interfacing skills. Project and team management.
Educational Requirements: Bachelor of Engineering. Service Line: Data & Analytics Unit. * Location of posting is subject to business requirements.
Posted 2 months ago
5 - 8 years
5 - 9 Lacs
Bengaluru
Work from Office
Job Title: Big Data Analyst
Responsibilities: A day in the life of an Infoscion - As part of the Infosys delivery team, your primary role would be to interface with the client for quality assurance, issue resolution, and ensuring high customer satisfaction. You will understand requirements, create and review designs, validate the architecture, and ensure high levels of service offerings to clients in the technology domain. You will participate in project estimation, provide inputs for solution delivery, conduct technical risk planning, and perform code reviews and unit test plan reviews. You will lead and guide your teams toward developing optimized, high-quality code deliverables, continual knowledge management, and adherence to organizational guidelines and processes. You will be a key contributor to building efficient programs/systems, and if you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!
Technical and Professional Requirements: Primary skills: Technology -> Big Data - NoSQL -> MongoDB.
Preferred Skills: Technology -> Big Data - NoSQL -> MongoDB.
Additional Responsibilities: Knowledge of more than one technology. Basics of architecture and design fundamentals. Knowledge of testing tools. Knowledge of agile methodologies. Understanding of project life cycle activities on development and maintenance projects. Understanding of one or more estimation methodologies; knowledge of quality processes. Basics of the business domain to understand the business requirements. Analytical abilities, strong technical skills, and good communication skills. Good understanding of the technology and domain. Ability to demonstrate a sound understanding of software quality assurance principles, SOLID design principles, and modelling methods. Awareness of the latest technologies and trends. Excellent problem-solving, analytical, and debugging skills.
Educational Requirements: Bachelor of Engineering. Service Line: Cloud & Infrastructure Services. * Location of posting is subject to business requirements.
Posted 2 months ago
3 - 5 years
5 - 9 Lacs
Bengaluru
Work from Office
Job Title: Big Data Analyst
Responsibilities: A day in the life of an Infoscion - As part of the Infosys delivery team, your primary role would be to interface with the client for quality assurance, issue resolution, and ensuring high customer satisfaction. You will understand requirements, create and review designs, validate the architecture, and ensure high levels of service offerings to clients in the technology domain. You will participate in project estimation, provide inputs for solution delivery, conduct technical risk planning, and perform code reviews and unit test plan reviews. You will lead and guide your teams toward developing optimized, high-quality code deliverables, continual knowledge management, and adherence to organizational guidelines and processes. You will be a key contributor to building efficient programs/systems, and if you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!
Preferred Skills: Technology -> Big Data -> Oracle BigData Appliance.
Educational Requirements: Bachelor of Engineering. Service Line: Cloud & Infrastructure Services. * Location of posting is subject to business requirements.
Posted 2 months ago
5 - 7 years
4 - 8 Lacs
Bengaluru
Work from Office
Job Title: Hadoop Admin
Responsibilities: A day in the life of an Infoscion - As part of the Infosys consulting team, your primary role would be to get to the heart of customer issues, diagnose problem areas, design innovative solutions, and facilitate deployment resulting in client delight. You will develop a proposal by owning parts of the proposal document and by giving inputs on solution design based on your areas of expertise. You will plan the activities of configuration, configure the product as per the design, conduct conference room pilots, and assist in resolving any queries related to requirements and solution design. You will conduct solution/product demonstrations and POC/Proof of Technology workshops, and prepare effort estimates that suit the customer's budgetary requirements and are in line with the organization's financial guidelines. You will actively lead small projects and contribute to unit-level and organizational initiatives with the objective of providing high-quality, value-adding solutions to customers. If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!
Technical and Professional Requirements: Primary skills: Technology -> Big Data - Hadoop -> Hadoop Administration.
Preferred Skills: Technology -> Big Data - Hadoop -> Hadoop Administration.
Additional Responsibilities: Ability to develop value-creating strategies and models that enable clients to innovate, drive growth, and increase their business profitability. Good knowledge of software configuration management systems. Awareness of the latest technologies and industry trends. Logical thinking and problem-solving skills along with an ability to collaborate. Understanding of the financial processes for various types of projects and the various pricing models available. Ability to assess current processes, identify improvement areas, and suggest technology solutions. Knowledge of one or two industry domains. Client interfacing skills. Project and team management.
Educational Requirements: Bachelor of Engineering. Service Line: Data & Analytics Unit. * Location of posting is subject to business requirements.
Posted 2 months ago
6 - 10 years
10 - 12 Lacs
Mysore
Work from Office
As an Application Developer, you will lead IBM into the future by translating system requirements into the design and development of customized systems in an agile environment. The success of IBM is in your hands as you transform vital business needs into code and drive innovation. Your work will power IBM and its clients globally, collaborating and integrating code into enterprise systems. You will have access to the latest education, tools, and technology, and a limitless career path with the world's technology leader. Come to IBM and make a global impact.
Responsibilities: Manage end-to-end feature development and resolve challenges faced in implementing it. Learn new technologies and apply them to feature development within the provided time frame. Manage debugging, root cause analysis, and fixing of issues reported on the Content Management back-end software system.
Required education: Bachelor's Degree. Preferred education: Master's Degree.
Required technical and professional expertise: Overall, more than 6 years of experience, with 4+ years of strong hands-on experience in Python and Spark. Strong technical ability to understand, design, write, and debug applications in Python and PySpark. Strong problem-solving skills. Good to have: hands-on experience with cloud technology (AWS/GCP/Azure).
Preferred technical and professional experience: Good to have: hands-on experience with cloud technology (AWS/GCP/Azure).
Posted 2 months ago
8 - 12 years
13 - 17 Lacs
Bengaluru
Work from Office
About the team: Roundel is Target's entry into the media business, with an impact of $1.5B+; an advertising sell-side business built on the principles of first-party (people-based) data, brand-safe content environments, and proof that our marketing programs drive business results for our clients. We are here to drive business growth for our clients and redefine value in the industry by solving core industry challenges rather than copying current industry methods of operation. Roundel is a key growth initiative for Target and leads the industry to a better way of operating within the media marketplace. Target Tech is on a mission to offer the systems, tools, and support that our clients, guests, and team members need and deserve. We drive industry-leading technologies in support of every angle of the business and help ensure that Target operates smoothly, securely, and reliably from the inside out.
As a Sr. Engineering Manager, you will lead an engineering team in an agile environment building solutions in support of the AdTech platform. The keys to success in this position are a strong and innovative approach to problem solving, great technical leadership, excellent communication (written and verbal, formal and informal), flexibility, and a self-motivated working style with attention to detail. You will collaborate with cross-functional teams in delivering AdTech capabilities, including DSP, SSP, and ad servers, for self-service advertising needs. Use your skills, experience, and talents to be a part of groundbreaking thinking and visionary goals.
As a Sr. Engineering Manager, you'll take the lead as you: Manage the overall software development cycle, driving best practices and ensuring development of high-quality code for common assets and framework components. Build and lead a team of high-caliber software development engineers. Architect and develop the best technical design and approach. Manage and execute against project/agile plans and set deadlines. Drive resolution of technology roadblocks including code, infrastructure, build, and deployment. Manage cross-product technical dependencies and drive resolution of conflicts. Advocate for technologies, frameworks, design patterns, processes, and guiding values of the domain architecture. Understand AdTech business fundamentals and how technologies can support business goals. Ensure all code adheres to development and security standards.
Requirements for software development: 8+ years of engineering (software development) experience. Experience with at least one full-cycle implementation from requirements to production. Experience with database design and data structures, including relational databases, big data, and at least one NoSQL store. Experience in architecture and design of scalable data pipelines in big data. 1-3 years of managing software development teams with a strong track record of project delivery for large, cross-functional projects. Experience operating medium- to large-scale systems. Experience with test-driven development and software test automation. Strong sense of ownership. Strong written and verbal communication skills with the ability to present complex technical information clearly and concisely to a variety of audiences.
Desired qualifications: 4-year degree or equivalent experience. Experience with Java, Groovy, Spring, UI, and reporting/visualization frameworks. Experience in Hadoop, Hive, MapReduce, Oozie, Spark, and Scala. Marketing tech or ad tech experience with a track record of innovation. Contributions back to the open source community are desirable. Extensive experience working in an agile environment (i.e., user stories, iterative development, etc.).
Posted 2 months ago
6 - 10 years
9 - 13 Lacs
Bengaluru
Work from Office
A role with the Business team means being a part of the team that enables faster, smarter, and more scalable decision-making to compete and win in the modern retail market. Here, you'll leverage data, statistics, and visualization to create the actionable insights that deliver value across all Target functions. Our savvy reporting and analytics pros use market-leading tools and data automation to make a positive impact that's felt across the business. If you're an analyst who can work on autonomous teams, integrate the latest practices with your approach, write code to handle massive scale, and simplify complex decisions by providing flexible, fast, and sustainable decision-making solutions, then you'll be successful here.
As a Sr. Data Analyst, you will support all business areas of Target with critical data analysis that helps team members make profitable decisions. Become a data expert or business analyst and utilize tools like decision trees, clustering, regression, time series, structural equation modeling, linear programming, SAS, SQL, VBA, and OLAP. Use your skills, experience, and talents to be a part of groundbreaking thinking and visionary goals. Interface with Target business representatives and leaders to validate business requirements/requests for reporting solutions. Determine the best methods to gather data and present information. Build reporting solutions to meet business needs. Communicate with the project/team manager to share knowledge and findings. Document design and requirements for reporting solutions. Core responsibilities of this job are described within this job description; job duties may change at any time due to business need.
About you: 6-10 years of overall industry experience, with 4-6 years in the data ecosystem. Proven hands-on experience with Domo, Tableau, and other visualization tools (including homegrown platforms) and a deep understanding of core DW/BI concepts. Strong SQL or PL/SQL programming and Unix shell scripting skills. Hands-on experience with object-oriented or functional programming such as Scala and/or Python or Java. Solid experience with the Hadoop ecosystem and its components, including writing MapReduce programs; experience developing Hive and Pig scripts; experience designing and developing Oozie workflows. Git source code management. Experience working in an agile environment. Exposure to R, Python, Hive, or other open source languages. Understanding of foundational mathematics and statistics. Conceptual understanding of analytical techniques such as linear regression, logistic regression, time-series models, and classification techniques. Strong written and verbal communication skills to explain complex analytical methodologies to clients regardless of the client's technical expertise. Any experience with retail, merchandising, or marketing will be a strong add-on. B.Tech/B.E. or Master's in Statistics/Econometrics/Mathematics or equivalent.
Posted 2 months ago
2 - 7 years
4 - 9 Lacs
Andhra Pradesh
Work from Office
JD - 7+ years of hands-on experience in Python, especially working with Pandas and NumPy. Good hands-on experience in Spark, PySpark, and Spark SQL. Hands-on experience in Databricks: Unity Catalog, Delta Lake, Lakehouse Platform, Medallion Architecture. Azure Data Factory, ADLS. Experience working with Parquet and JSON file formats. Knowledge of Snowflake.
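A minimal, assumed-name sketch of the Databricks/Delta Lake medallion pattern this JD implies: land raw JSON from ADLS into a bronze Delta table, then clean it into a silver table. The storage paths, container/account names, and columns are placeholders, and the code assumes a Delta-enabled Spark runtime such as Databricks with storage credentials already configured.

# Minimal medallion-style sketch on Delta Lake (bronze -> silver).
# Paths and column names are hypothetical placeholders; assumes a
# Delta-enabled Spark runtime such as Databricks.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("medallion-sketch").getOrCreate()

# Bronze: land raw JSON as-is, adding ingestion metadata
raw = spark.read.json("abfss://landing@examplelake.dfs.core.windows.net/events/")
bronze = raw.withColumn("_ingested_at", F.current_timestamp())
bronze.write.format("delta").mode("append").save("/mnt/lake/bronze/events")

# Silver: deduplicate and keep only well-formed records
silver = (
    spark.read.format("delta").load("/mnt/lake/bronze/events")
    .dropDuplicates(["event_id"])
    .filter(F.col("event_type").isNotNull())
)
silver.write.format("delta").mode("overwrite").save("/mnt/lake/silver/events")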
Posted 2 months ago
7 - 8 years
15 - 25 Lacs
Chennai
Work from Office
Assistant Manager - Data Engineering
Job Summary: We are seeking a Lead GCP Data Engineer with experience in data modeling and building data pipelines. The ideal candidate should have hands-on experience with GCP services such as Composer, GCS, BigQuery, Dataflow, Dataproc, and Pub/Sub. Additionally, the candidate should have a proven track record in designing data solutions, covering everything from data integration to end-to-end storage in BigQuery.
Responsibilities: Collaborate with the client's data architects: work closely with client data architects and technical teams to design and develop customized data solutions that meet business requirements. Design data flows: architect and implement data flows that ensure seamless data movement from source systems to target systems, facilitating real-time or batch data ingestion, processing, and transformation. Data integration and ETL processes: design and manage ETL processes, ensuring the efficient integration of diverse data sources and high-quality data pipelines. Build data products in BigQuery: build data products using Google BigQuery (GBQ), designing data models and ensuring data is structured and optimized for analysis. Stakeholder interaction: regularly engage with business stakeholders to gather data requirements and translate them into technical specifications, building solutions that align with business needs. Ensure data quality and security: implement best practices in data governance, security, and compliance for both storage and processing of sensitive data. Continuous improvement: evaluate and recommend new technologies and tools to improve data architecture, performance, and scalability.
Skills: 6+ years of development experience. 4+ years of experience with SQL and Python. 2+ years with GCP BigQuery, Dataflow, GCS, and Postgres. 3+ years of experience building out data pipelines from scratch in a highly distributed and fault-tolerant manner. Experience with Cloud SQL, Cloud Functions, Pub/Sub, Cloud Composer, etc. Familiarity with big data and machine learning tools and platforms. Comfortable with open source technologies including Apache Spark, Hadoop, and Kafka. Comfortable with a broad array of relational and non-relational databases. Proven track record of building applications in a data-focused role (cloud and traditional data warehouse). Current or previous experience leading a team.
We are an equal opportunity employer and value diversity at our company. We do not discriminate on the basis of race, religion, color, national origin, sex, gender, gender expression, sexual orientation, age, marital status, veteran status, or disability status.
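As a hedged sketch of "building data products in BigQuery," the snippet below uses the google-cloud-bigquery client to rebuild a partitioned summary table from an ingested source table. The project, dataset, table, and column names are hypothetical placeholders, and it assumes GCP credentials are available in the environment.

# Minimal sketch: rebuild a partitioned "data product" table in BigQuery
# from an ingested source table. Names are hypothetical placeholders.
from google.cloud import bigquery

client = bigquery.Client(project="example-project")

sql = """
CREATE OR REPLACE TABLE `example-project.analytics.daily_orders`
PARTITION BY order_date AS
SELECT
  order_date,
  customer_id,
  SUM(order_amount) AS total_amount,
  COUNT(*)          AS order_count
FROM `example-project.raw.orders`
GROUP BY order_date, customer_id
"""

# Run the DDL/transformation query and wait for it to complete
client.query(sql).result()
print("analytics.daily_orders rebuilt")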
Posted 2 months ago
8 - 13 years
30 - 35 Lacs
Mumbai
Work from Office
Paramatrix Technologies Pvt Ltd is looking for Data Engineer to join our dynamic team and embark on a rewarding career journey. Liaising with coworkers and clients to elucidate the requirements for each task. Conceptualizing and generating infrastructure that allows big data to be accessed and analyzed. Reformulating existing frameworks to optimize their functioning. Testing such structures to ensure that they are fit for use. Preparing raw data for manipulation by data scientists. Detecting and correcting errors in your work. Ensuring that your work remains backed up and readily accessible to relevant coworkers. Remaining up-to-date with industry standards and technological advancements that will improve the quality of your outputs.
Posted 2 months ago
2 - 7 years
4 - 9 Lacs
Dadra and Nagar Haveli, Chandigarh
Work from Office
Data Engineer. Skills required: strong proficiency in PySpark, Scala, and Python; experience with AWS Glue. Experience required: minimum 5 years of relevant experience. Location: available across all UST locations. Notice period: immediate joiners (candidates available to join by 31st January 2025). SO - 22978624. Location - Chandigarh, Dadra & Nagar Haveli, Daman, Diu, Goa, Jammu, Lakshadweep, New Delhi, Puducherry, Sikkim
Posted 2 months ago
2 - 7 years
4 - 9 Lacs
Hyderabad
Work from Office
JR REQ - Data Engineer (PySpark, Big Data) - 4 to 8 years - Hyderabad - hemanth.karanam@tcs.com - TCS C2H - 900000
Posted 2 months ago
2 - 5 years
14 - 17 Lacs
Hyderabad
Work from Office
As an Application Developer, you will lead IBM into the future by translating system requirements into the design and development of customized systems in an agile environment. The success of IBM is in your hands as you transform vital business needs into code and drive innovation. Your work will power IBM and its clients globally, collaborating and integrating code into enterprise systems. You will have access to the latest education, tools, and technology, and a limitless career path with the world's technology leader. Come to IBM and make a global impact.
Responsibilities: Manage end-to-end feature development and resolve challenges faced in implementing it. Learn new technologies and apply them to feature development within the provided time frame. Manage debugging, root cause analysis, and fixing of issues reported on the Content Management back-end software system.
Required education: Bachelor's Degree. Preferred education: Master's Degree.
Required technical and professional expertise: Overall, more than 6 years of experience, with 4+ years of strong hands-on experience in Python and Spark. Strong technical ability to understand, design, write, and debug applications in Python and PySpark. Strong problem-solving skills. Good to have: hands-on experience with cloud technology (AWS/GCP/Azure).
Preferred technical and professional experience: Good to have: hands-on experience with cloud technology (AWS/GCP/Azure).
Posted 2 months ago