6 - 10 years
10 - 14 Lacs
Mysore
Work from Office
As an Application Developer, you will lead IBM into the future by translating system requirements into the design and development of customized systems in an agile environment. The success of IBM is in your hands as you transform vital business needs into code and drive innovation. Your work will power IBM and its clients globally, collaborating on and integrating code into enterprise systems. You will have access to the latest education, tools, and technology, and a limitless career path with the world's technology leader. Come to IBM and make a global impact.

Responsibilities:
Manage end-to-end feature development and resolve challenges faced in implementing it.
Learn new technologies and apply them to feature development within the provided time frame.
Manage debugging, root cause analysis, and fixing of issues reported on the Content Management back-end software system.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
More than 6 years of overall experience, including 4+ years of strong hands-on experience in Python and Spark.
Strong technical ability to understand, design, write, and debug applications in Python and PySpark.
Strong problem-solving skills.
Good to have: hands-on experience with cloud technology (AWS/GCP/Azure).

Preferred technical and professional experience:
Good to have: hands-on experience with cloud technology (AWS/GCP/Azure).
Posted 2 months ago
2 - 5 years
14 - 17 Lacs
Mysore
Work from Office
As an Application Developer, you will lead IBM into the future by translating system requirements into the design and development of customized systems in an agile environment. The success of IBM is in your hands as you transform vital business needs into code and drive innovation. Your work will power IBM and its clients globally, collaborating on and integrating code into enterprise systems. You will have access to the latest education, tools, and technology, and a limitless career path with the world's technology leader. Come to IBM and make a global impact.

Responsibilities:
Manage end-to-end feature development and resolve challenges faced in implementing it.
Learn new technologies and apply them to feature development within the provided time frame.
Manage debugging, root cause analysis, and fixing of issues reported on the Content Management back-end software system.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
More than 6 years of overall experience, including 4+ years of strong hands-on experience in Python and Spark.
Strong technical ability to understand, design, write, and debug applications in Python and PySpark.
Strong problem-solving skills.
Good to have: hands-on experience with cloud technology (AWS/GCP/Azure).

Preferred technical and professional experience:
Good to have: hands-on experience with cloud technology (AWS/GCP/Azure).
Posted 2 months ago
8 - 13 years
30 - 35 Lacs
Bengaluru
Work from Office
Qualitest India Private Limited is looking for a Data Engineer to join our dynamic team and embark on a rewarding career journey.
Liaising with coworkers and clients to elucidate the requirements for each task.
Conceptualizing and generating infrastructure that allows big data to be accessed and analyzed.
Reformulating existing frameworks to optimize their functioning.
Testing such structures to ensure that they are fit for use.
Preparing raw data for manipulation by data scientists.
Detecting and correcting errors in your work.
Ensuring that your work remains backed up and readily accessible to relevant coworkers.
Remaining up to date with industry standards and technological advancements that will improve the quality of your outputs.
Posted 2 months ago
6 - 10 years
8 - 12 Lacs
Mysore
Work from Office
As an Application Developer, you will lead IBM into the future by translating system requirements into the design and development of customized systems in an agile environment. The success of IBM is in your hands as you transform vital business needs into code and drive innovation. Your work will power IBM and its clients globally, collaborating on and integrating code into enterprise systems. You will have access to the latest education, tools, and technology, and a limitless career path with the world's technology leader. Come to IBM and make a global impact.

Responsibilities:
Manage end-to-end feature development and resolve challenges faced in implementing it.
Learn new technologies and apply them to feature development within the provided time frame.
Manage debugging, root cause analysis, and fixing of issues reported on the Content Management back-end software system.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
More than 6 years of overall experience, including 4+ years of strong hands-on experience in Python and Spark.
Strong technical ability to understand, design, write, and debug applications in Python and PySpark.
Strong problem-solving skills.
Good to have: hands-on experience with cloud technology (AWS/GCP/Azure).

Preferred technical and professional experience:
Good to have: hands-on experience with cloud technology (AWS/GCP/Azure).
Posted 2 months ago
4 years
0 Lacs
Hyderabad, Telangana, India
Description
Do you want to be a leader in the team that takes Transportation and Retail models to the next generation? Do you have solid analytical thinking and metrics-driven decision making, and do you want to solve problems with solutions that will meet the growing worldwide need? Then Transportation is the team for you. We are looking for top-notch Data Engineers to be part of our world-class Business Intelligence for Transportation team.

4-7 years of experience performing quantitative analysis, preferably for an Internet or Technology company.
Strong experience in Data Warehouse and Business Intelligence application development.
Data analysis: understand business processes, logical data models, and relational database implementations.
Expert knowledge of SQL; able to optimize complex queries.
Basic understanding of statistical analysis; experience in testing design and measurement; able to execute research projects and generate practical results and recommendations.
Proven track record of working on complex modular projects and assuming a leading role in such projects.
Highly motivated, self-driven, and capable of defining your own design and test scenarios.
Experience with scripting languages (e.g. Perl, Python) preferred.
BS/MS degree in Computer Science.
Evaluate and implement various big-data technologies and solutions (Redshift, Hive/EMR, Tez, Spark) to optimize processing of extremely large datasets in an accurate and timely fashion.
Experience with large-scale data processing, data structure optimization, and scalability of algorithms is a plus.

Key job responsibilities
Responsible for designing, building, and maintaining complex data solutions for Amazon's Operations businesses.
Actively participates in the code review process, design discussions, team planning, and operational excellence, and constructively identifies problems and proposes solutions.
Makes appropriate trade-offs, reuses where possible, and is judicious about introducing dependencies.
Makes efficient use of resources (e.g., system hardware, data storage, query optimization, AWS infrastructure).
Knows about recent advances in distributed systems (e.g., MapReduce, MPP architectures, external partitioning).
Asks the right questions when the data model and requirements are not well defined, and comes up with designs that are scalable, maintainable, and efficient.
Makes enhancements that improve the team's data architecture and make it easier to maintain (e.g., data auditing solutions, automating ad-hoc or manual operational steps).
Owns the data quality of important datasets and any new changes/enhancements.

Basic Qualifications
3+ years of data engineering experience.
4+ years of SQL experience.
Experience with data modeling, warehousing, and building ETL pipelines.

Preferred Qualifications
Experience with AWS technologies such as Redshift, S3, AWS Glue, EMR, Kinesis, Firehose, Lambda, and IAM roles and permissions.
Experience with non-relational databases and data stores (object storage, document or key-value stores, graph databases, column-family databases).

Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you're applying in isn't listed, please contact your Recruiting Partner.

Company - ADCI HYD 13 SEZ
Job ID: A2941103
Posted 2 months ago
5 - 10 years
0 - 1 Lacs
Pune
Work from Office
Position Overview:
Cloud Architect with expertise in Hadoop and the Google Cloud Platform (GCP) data stack, along with experience in Big Data architecture and migration. The ideal candidate should have strong proficiency in GCP Big Data tools, including Hadoop, Hive, HDFS, Impala, Spark, MapReduce, MS SQL, Kafka, and Redis. Familiarity with Cloudera, HBase, MongoDB, MariaDB, and Event Hub is a plus.

Key Responsibilities:
Design, implement, and optimize Big Data architecture on GCP and Hadoop ecosystems.
Lead data migration projects from on-premise to cloud platforms (GCP).
Develop and maintain ETL pipelines using tools like Spark, Hive, and Kafka.
Manage Hadoop clusters, HDFS, and related components.
Work with data streaming technologies like Kafka and Event Hub for real-time data processing.
Optimize SQL and NoSQL databases (MS SQL, Redis, MongoDB, MariaDB, HBase) for high availability and scalability.
Collaborate with data scientists, analysts, and DevOps teams to integrate Big Data solutions.
Ensure data security, governance, and compliance in cloud and on-premise environments.

Required Skills & Experience:
5-10 years of experience as a Cloud Architect.
Strong expertise in Hadoop (HDFS, Hive, Impala, Spark, MapReduce).
Hands-on experience with GCP Big Data services.
Proficiency in MS SQL, Kafka, and Redis for data processing and analytics.
Experience with Cloudera, HBase, MongoDB, and MariaDB.
Knowledge of real-time data streaming and event-driven architectures.
Understanding of Big Data security and performance optimization.
Ability to design and execute data migration strategies.

Location: Koregaon Park, Pune, Maharashtra (India)
Shift Timings: USA time zone (06:30 PM IST to 03:30 AM IST)
Posted 2 months ago
1 - 6 years
2 - 5 Lacs
Hyderabad
Work from Office
Sahaj Retail Limited is looking for a Data Engineer to join our dynamic team and embark on a rewarding career journey.
Liaising with coworkers and clients to elucidate the requirements for each task.
Conceptualizing and generating infrastructure that allows big data to be accessed and analyzed.
Reformulating existing frameworks to optimize their functioning.
Testing such structures to ensure that they are fit for use.
Preparing raw data for manipulation by data scientists.
Detecting and correcting errors in your work.
Ensuring that your work remains backed up and readily accessible to relevant coworkers.
Remaining up to date with industry standards and technological advancements that will improve the quality of your outputs.
Posted 2 months ago
1 - 5 years
3 - 7 Lacs
Allahabad, Noida
Work from Office
Feather Thread Corporation is looking for a Big Data Administrator to join our dynamic team and embark on a rewarding career journey.

Office Management: Oversee general office operations, including maintenance of office supplies, equipment, and facilities. Manage incoming and outgoing correspondence, including mail, email, and phone calls. Coordinate meetings, appointments, and travel arrangements for staff members as needed.

Administrative Support: Provide administrative support to management and staff, including scheduling meetings, preparing documents, and organizing files. Assist with the preparation of reports, presentations, and other materials for internal and external stakeholders. Maintain accurate records and databases, ensuring data integrity and confidentiality.

Communication and Coordination: Serve as a point of contact for internal and external stakeholders, including clients, vendors, and partners. Facilitate communication between departments and team members, ensuring timely and effective information flow. Coordinate logistics for company events, meetings, and conferences.

Documentation and Compliance: Assist with the development and implementation of company policies, procedures, and guidelines. Maintain compliance with regulatory requirements and industry standards. Ensure proper documentation and record-keeping practices are followed.

Project Support: Provide support to project teams by assisting with project coordination, documentation, and tracking of tasks and deadlines. Collaborate with team members to ensure project deliverables are met on time and within budget.
Posted 2 months ago
10 - 14 years
35 - 40 Lacs
Bengaluru
Work from Office
Your Job
The Lead Data Engineer will be part of an international team that designs, develops, and delivers new applications for Koch Industries. This role will have the opportunity to join on the ground floor and will play a critical part in helping build out Koch Global Services (KGS) over the next several years. Working closely with global colleagues will provide significant international exposure to employees.

Our Team

What You Will Do
Work with business partners to understand key business drivers and use that knowledge to experiment with and transform Business Intelligence & Advanced Analytics solutions to capture the value of potential business opportunities.
Improve data pipeline reliability, scalability, and security.
Design, build, and maintain optimal data pipeline architecture; assemble large, complex data sets that meet functional and non-functional business requirements.
Translate a business process/problem into a conceptual and logical data model and a proposed technical implementation plan.
Work closely with Product Owners and stakeholders to design the technical architecture for the data platform to meet the requirements of the proposed solution.
Provide technical leadership in the Big Data space (Hadoop stack such as Spark, MapReduce, HDFS, and Hive; NoSQL stores like Cassandra and HBase; databases like Snowflake and RDS).
Assist in developing and implementing consistent processes for data modeling, mining, and production.
Architect software solutions on public cloud.
Help the Data Engineering team produce high-quality code that allows us to put solutions into production.
Create reusable and scalable data pipelines.
Focus on implementing development processes and tools that allow for the collection of and access to metadata, in a way that allows for widespread code reuse (e.g., utilization of ETL frameworks, generic metadata-driven tools, shared data dimensions) and that enables impact analysis as well as source-to-target tracking and reporting.
Create and own the technical product backlogs for products; help the team close the backlogs on time.
Refactor code into reusable libraries, APIs, and tools.

Who You Are (Basic Qualifications)
10+ years of industry professional experience, or a bachelor's degree in MIS, CS, or an industry equivalent, with consultative/complex deployment project, architecture, design, implementation, and/or support of data and analytics solutions.
At least 6-8 years of Data Engineering experience (AWS) delivering Advanced Analytics solutions, Data Warehousing, Big Data, or Cloud.
Strong knowledge of SQL; developing, deploying, and modeling DWH and data pipelines on AWS cloud or similar cloud environments.
5+ years of experience with business and technical requirements analysis, elicitation, data modeling, verification, and methodology development, with a good hold on communicating complex technical ideas to technical and non-technical team members.
Manage data-related requests, analyze issues, and provide efficient resolution.
Design all program specifications and perform required tests.
Experience in authoring or reviewing system design documents for enterprise solutions.
Knowledge of Big Data technologies, such as Spark and Hadoop/MapReduce.
Strong coding skills in Java and Python or Scala.
Demonstrated experience using git-based source control management platforms (GitLab, GitHub, DevOps, etc.).
Experience working in Agile delivery.
Experience in Data Harmonization, Master Data Management, and Critical Data Elements management.

What Will Put You Ahead
8+ years of experience in the Amazon Web Services stack, including S3, Athena, Redshift, Glue, or Lambda.
8+ years of experience with cloud data warehousing solutions, including Snowflake, with development in and implementation of dimensional modeling.
Experience with open-source tools and integration with the AWS platform is preferred.
Certified as a Cloud Architect from a reputed public cloud.
Experience with Git and CI/CD pipelines.
Development experience with Docker and a Kubernetes environment would be a plus.
Understanding of infrastructure (including hosting, container-based deployments, and storage architectures) would be advantageous.
Posted 2 months ago
6 - 8 years
8 - 10 Lacs
Mysore
Work from Office
As an Application Developer, you will lead IBM into the future by translating system requirements into the design and development of customized systems in an agile environment. The success of IBM is in your hands as you transform vital business needs into code and drive innovation. Your work will power IBM and its clients globally, collaborating on and integrating code into enterprise systems. You will have access to the latest education, tools, and technology, and a limitless career path with the world's technology leader. Come to IBM and make a global impact.

Responsibilities:
Manage end-to-end feature development and resolve challenges faced in implementing it.
Learn new technologies and apply them to feature development within the provided time frame.
Manage debugging, root cause analysis, and fixing of issues reported on the Content Management back-end software system.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
More than 6 years of overall experience, including 4+ years of strong hands-on experience in Python and Spark.
Strong technical ability to understand, design, write, and debug applications in Python and PySpark.
Strong problem-solving skills.
Good to have: hands-on experience with cloud technology (AWS/GCP/Azure).

Preferred technical and professional experience:
Good to have: hands-on experience with cloud technology (AWS/GCP/Azure).
Posted 2 months ago
2 - 5 years
14 - 17 Lacs
Hyderabad
Work from Office
As an Application Developer, you will lead IBM into the future by translating system requirements into the design and development of customized systems in an agile environment. The success of IBM is in your hands as you transform vital business needs into code and drive innovation. Your work will power IBM and its clients globally, collaborating on and integrating code into enterprise systems. You will have access to the latest education, tools, and technology, and a limitless career path with the world's technology leader. Come to IBM and make a global impact.

Responsibilities:
Manage end-to-end feature development and resolve challenges faced in implementing it.
Learn new technologies and apply them to feature development within the provided time frame.
Manage debugging, root cause analysis, and fixing of issues reported on the Content Management back-end software system.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
More than 6 years of overall experience, including 4+ years of strong hands-on experience in Python and Spark.
Strong technical ability to understand, design, write, and debug applications in Python and PySpark.
Strong problem-solving skills.
Good to have: hands-on experience with cloud technology (AWS/GCP/Azure).

Preferred technical and professional experience:
Good to have: hands-on experience with cloud technology (AWS/GCP/Azure).
Posted 2 months ago
2 - 5 years
14 - 17 Lacs
Mysore
Work from Office
As an Application Developer, you will lead IBM into the future by translating system requirements into the design and development of customized systems in an agile environment. The success of IBM is in your hands as you transform vital business needs into code and drive innovation. Your work will power IBM and its clients globally, collaborating on and integrating code into enterprise systems. You will have access to the latest education, tools, and technology, and a limitless career path with the world's technology leader. Come to IBM and make a global impact.

Responsibilities:
Manage end-to-end feature development and resolve challenges faced in implementing it.
Learn new technologies and apply them to feature development within the provided time frame.
Manage debugging, root cause analysis, and fixing of issues reported on the Content Management back-end software system.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
More than 6 years of overall experience, including 4+ years of strong hands-on experience in Python and Spark.
Strong technical ability to understand, design, write, and debug applications in Python and PySpark.
Strong problem-solving skills.
Good to have: hands-on experience with cloud technology (AWS/GCP/Azure).

Preferred technical and professional experience:
Good to have: hands-on experience with cloud technology (AWS/GCP/Azure).
Posted 2 months ago
2 - 7 years
4 - 8 Lacs
Chennai
Work from Office
Responsibilities:
Be instrumental in understanding the requirements and design of the product/software.
Develop software solutions by studying information needs, systems flow, data usage, and work processes, and by investigating problem areas, following the software development life cycle.
Facilitate root cause analysis of system issues and problem statements.
Identify ideas to improve system performance and impact availability.
Analyze client requirements and convert requirements into feasible designs.
Collaborate with functional teams or systems analysts who carry out the detailed investigation into software requirements.
Confer with project managers to obtain information on software capabilities.
Posted 2 months ago
5 - 7 years
11 - 13 Lacs
Nasik, Pune, Nagpur
Work from Office
Euclid Innovations Pvt Ltd is looking for Data Engineer Drive to join our dynamic team and embark on a rewarding career journey.
Liaising with coworkers and clients to elucidate the requirements for each task.
Conceptualizing and generating infrastructure that allows big data to be accessed and analyzed.
Reformulating existing frameworks to optimize their functioning.
Testing such structures to ensure that they are fit for use.
Preparing raw data for manipulation by data scientists.
Detecting and correcting errors in your work.
Ensuring that your work remains backed up and readily accessible to relevant coworkers.
Remaining up to date with industry standards and technological advancements that will improve the quality of your outputs.
Posted 2 months ago
8 - 13 years
30 - 35 Lacs
Pune
Work from Office
Data Engineer

Job Description
Common skills: SQL, GCP BQ, ETL pipelines using Python/Airflow, experience with Spark/Hive/HDFS, and data modeling for data conversion.
Resources: 4.
Prior experience working on a conversion/migration HR project is an additional skill needed along with the skills mentioned above.
The Data Engineer should have HR knowledge; all other requirements for the functional area are given by the customer, Uber.

Customer Name: Uber
Posted 2 months ago
3 - 7 years
3 - 7 Lacs
Karnataka
Work from Office
Description
Detailed JD: RTIM, Pega CDH 8.8 Multi App, Infinity 24.1, Java, RESTful API, OAuth
1. Understanding the NBA requirements and the complete CDH architecture.
2. Reviewing the conceptual design, detailed design, and estimations.
3. Reviewing and contributing to the deployment activities and practices.
4. Contributing to the overall technical solution and putting it into practice.
5. Contributing to requirement discussions with subject matter expertise in relation to Pega CDH.
6. Experience in Pega CDH v8.8 multi-app or 24.1 and the retail banking domain is preferred.
7. Conducting peer code reviews.
8. Excellent communication skills.

Additional Details
Global Grade: B
Level: To Be Defined
Named Job Posting? (if Yes - needs to be approved by SCSC): No
Remote work possibility: No
Global Role Family: 60236 (P) Software Engineering
Local Role Name: 6362 Software Developer
Local Skills: 5700 Pega
Languages Required: English
Role Rarity: To Be Defined
Posted 2 months ago
2 - 5 years
3 - 7 Lacs
Karnataka
Work from Office
Experience: 4 to 6 years
Location: Any PSL location
Rate: below $14

JD - DBT/AWS Glue/Python/PySpark
Hands-on experience in data engineering, with expertise in DBT, AWS Glue, Python, and PySpark.
Strong knowledge of data engineering concepts, data pipelines, ETL/ELT processes, and cloud data environments (AWS).
Technology: DBT, AWS Glue, Athena, SQL, Spark, PySpark.
Good understanding of Spark internals and how it works; good skills in PySpark.
Good understanding of DBT, particularly its limitations and when a design will end up in model explosion.
Good hands-on experience in AWS Glue.
AWS expertise: should know the different services, how to configure them, and have infrastructure-as-code experience.
Basic understanding of different open data formats: Delta, Iceberg, Hudi.
Ability to engage in technical conversations and suggest enhancements to the current architecture and design.
Posted 2 months ago
2 - 5 years
3 - 7 Lacs
Maharashtra
Work from Office
Description
Overall 10+ years of experience in Python and Shell.
Knowledge of distributed systems like Hadoop and Spark, as well as cloud computing platforms such as Azure and AWS.

Additional Details
Global Grade: B
Level: To Be Defined
Named Job Posting? (if Yes - needs to be approved by SCSC): No
Remote work possibility: No
Global Role Family: To be defined
Local Role Name: To be defined
Local Skills: Ruby; automation; Python
Languages Required: English
Role Rarity: To Be Defined
Posted 2 months ago
3 - 7 years
1 - 5 Lacs
Telangana
Work from Office
Location: Chennai and Hyderabad preferred, but the customer is willing to take resources from Hyderabad.
Experience: 5 to 8 years (U3). Exp: 5-10 years. Location: Hyderabad / Chennai.

Proven experience as a development data engineer or in a similar role, with an ETL background.
Experience with data integration / ETL best practices and data quality principles.
Play a crucial role in ensuring the quality and reliability of the data by designing, implementing, and executing comprehensive testing.
By going over the user stories, build a comprehensive code base and business rules for testing and validation of the data.
Knowledge of continuous integration and continuous deployment (CI/CD) pipelines.
Familiarity with Agile/Scrum development methodologies.
Excellent analytical and problem-solving skills.
Strong communication and collaboration skills.
Experience with big data technologies (Hadoop, Spark, Hive).
Posted 2 months ago
2 - 6 years
5 - 9 Lacs
Uttar Pradesh
Work from Office
Proven experience as a development data engineer or in a similar role, with an ETL background.
Experience with data integration / ETL best practices and data quality principles.
Play a crucial role in ensuring the quality and reliability of the data by designing, implementing, and executing comprehensive testing.
By going over the user stories, build a comprehensive code base and business rules for testing and validation of the data.
Knowledge of continuous integration and continuous deployment (CI/CD) pipelines.
Familiarity with Agile/Scrum development methodologies.
Excellent analytical and problem-solving skills.
Strong communication and collaboration skills.
Experience with big data technologies (Hadoop, Spark, Hive).
Posted 2 months ago
5 - 10 years
5 - 8 Lacs
Bengaluru
Work from Office
Description
Primary skills: Kafka Cluster Management, Kafka admin, Kubernetes, Helm, DevOps, Jenkins
Secondary skills: Grafana, Prometheus, Dynatrace

Additional Details
Global Grade: C
Level: To Be Defined
Named Job Posting? (if Yes - needs to be approved by SCSC): No
Remote work possibility: No
Global Role Family: To be defined
Local Role Name: To be defined
Local Skills: Kafka Cluster Management; Kubernetes
Languages Required: English
Role Rarity: To Be Defined
Posted 2 months ago
5 - 7 years
4 - 8 Lacs
Bengaluru
Work from Office
Job Title: Spark Python Scala Developer

Responsibilities
A day in the life of an Infoscion: As part of the Infosys consulting team, your primary role would be to get to the heart of customer issues, diagnose problem areas, design innovative solutions, and facilitate deployment resulting in client delight. You will develop a proposal by owning parts of the proposal document and by giving inputs in solution design based on areas of expertise. You will plan the activities of configuration, configure the product as per the design, conduct conference room pilots, and assist in resolving any queries related to requirements and solution design. You will conduct solution/product demonstrations and POC/Proof of Technology workshops, and prepare effort estimates which suit the customer's budgetary requirements and are in line with the organization's financial guidelines. You will actively lead small projects and contribute to unit-level and organizational initiatives with the objective of providing high-quality, value-adding solutions to customers. If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!

Technical and Professional Requirements
Primary skills: Technology->Big Data - Data Processing->Spark
Preferred skills: Technology->Big Data - Data Processing->Spark

Additional Responsibilities
Ability to develop value-creating strategies and models that enable clients to innovate, drive growth, and increase their business profitability. Good knowledge of software configuration management systems. Awareness of the latest technologies and industry trends. Logical thinking and problem-solving skills, along with an ability to collaborate. Understanding of the financial processes for various types of projects and the various pricing models available. Ability to assess current processes, identify improvement areas, and suggest technology solutions. Knowledge of one or two industry domains. Client interfacing skills. Project and team management.

Educational Requirements: Bachelor of Engineering
Service Line: Data & Analytics Unit
* Location of posting is subject to business requirements
Posted 2 months ago
5 - 8 years
5 - 9 Lacs
Bengaluru
Work from Office
Job Title: Big Data Analyst

Responsibilities
A day in the life of an Infoscion: As part of the Infosys delivery team, your primary role would be to interface with the client for quality assurance, issue resolution, and ensuring high customer satisfaction. You will understand requirements, create and review designs, validate the architecture, and ensure high levels of service offerings to clients in the technology domain. You will participate in project estimation, provide inputs for solution delivery, conduct technical risk planning, and perform code reviews and unit test plan reviews. You will lead and guide your teams towards developing optimized, high-quality code deliverables, continual knowledge management, and adherence to organizational guidelines and processes. You would be a key contributor to building efficient programs/systems, and if you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!

Technical and Professional Requirements
Primary skills: Technology->Big Data - NoSQL->MongoDB
Preferred skills: Technology->Big Data - NoSQL->MongoDB

Additional Responsibilities
Knowledge of more than one technology. Basics of architecture and design fundamentals. Knowledge of testing tools. Knowledge of agile methodologies. Understanding of project life cycle activities on development and maintenance projects. Understanding of one or more estimation methodologies and knowledge of quality processes. Basics of the business domain to understand business requirements. Analytical abilities, strong technical skills, and good communication skills. Good understanding of the technology and domain. Ability to demonstrate a sound understanding of software quality assurance principles, SOLID design principles, and modelling methods. Awareness of the latest technologies and trends. Excellent problem-solving, analytical, and debugging skills.

Educational Requirements: Bachelor of Engineering
Service Line: Cloud & Infrastructure Services
* Location of posting is subject to business requirements
Posted 2 months ago
3 - 5 years
5 - 9 Lacs
Bengaluru
Work from Office
Job Title: Big Data Analyst

Responsibilities
A day in the life of an Infoscion: As part of the Infosys delivery team, your primary role would be to interface with the client for quality assurance, issue resolution, and ensuring high customer satisfaction. You will understand requirements, create and review designs, validate the architecture, and ensure high levels of service offerings to clients in the technology domain. You will participate in project estimation, provide inputs for solution delivery, conduct technical risk planning, and perform code reviews and unit test plan reviews. You will lead and guide your teams towards developing optimized, high-quality code deliverables, continual knowledge management, and adherence to organizational guidelines and processes. You would be a key contributor to building efficient programs/systems, and if you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!

Preferred skills: Technology->Big Data->Oracle BigData Appliance

Educational Requirements: Bachelor of Engineering
Service Line: Cloud & Infrastructure Services
* Location of posting is subject to business requirements
Posted 2 months ago
5 - 7 years
4 - 8 Lacs
Bengaluru
Work from Office
Job Title: Hadoop Admin

Responsibilities
A day in the life of an Infoscion: As part of the Infosys consulting team, your primary role would be to get to the heart of customer issues, diagnose problem areas, design innovative solutions, and facilitate deployment resulting in client delight. You will develop a proposal by owning parts of the proposal document and by giving inputs in solution design based on areas of expertise. You will plan the activities of configuration, configure the product as per the design, conduct conference room pilots, and assist in resolving any queries related to requirements and solution design. You will conduct solution/product demonstrations and POC/Proof of Technology workshops, and prepare effort estimates which suit the customer's budgetary requirements and are in line with the organization's financial guidelines. You will actively lead small projects and contribute to unit-level and organizational initiatives with the objective of providing high-quality, value-adding solutions to customers. If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!

Technical and Professional Requirements
Primary skills: Technology->Big Data - Hadoop->Hadoop Administration
Preferred skills: Technology->Big Data - Hadoop->Hadoop Administration

Additional Responsibilities
Ability to develop value-creating strategies and models that enable clients to innovate, drive growth, and increase their business profitability. Good knowledge of software configuration management systems. Awareness of the latest technologies and industry trends. Logical thinking and problem-solving skills, along with an ability to collaborate. Understanding of the financial processes for various types of projects and the various pricing models available. Ability to assess current processes, identify improvement areas, and suggest technology solutions. Knowledge of one or two industry domains. Client interfacing skills. Project and team management.

Educational Requirements: Bachelor of Engineering
Service Line: Data & Analytics Unit
* Location of posting is subject to business requirements
Posted 2 months ago