Get alerts for new jobs matching your selected skills, preferred locations, and experience range.
4.0 - 9.0 years
20 - 35 Lacs
Pune, Mumbai (All Areas), India
Hybrid
Exp - 4 to 8 Yrs Location - Pune (Relocation accepted) MUST HAVE - Hands-on experience of at least 2 years with Akka HTTP and the Akka framework, along with strong Scala programming skills
Posted 1 week ago
8.0 - 13.0 years
35 - 50 Lacs
Mumbai
Work from Office
Hiring Big Data Lead with 8+ years of experience for US shift time: Must Have: - Big Data: Spark, Hadoop, Kafka, Hive, Flink - Backend: Python, Scala - NoSQL: MongoDB, Cassandra - Cloud: AWS/Azure/GCP, Snowflake, Databricks - Docker, Kubernetes, CI/CD Required Candidate profile - Excellent at mentoring/training in Big Data: HDFS, YARN, Airflow, Hive, MapReduce, HBase, Kafka & ETL/ELT, real-time streaming, data modeling - Immediate joiner is a plus - Excellent communication
Posted 1 week ago
8.0 - 12.0 years
4 - 8 Lacs
Pune
Work from Office
Roles & Responsibilities: Total 8-10 years of working experience. 8-10 years of experience with big data tools like Spark, Kafka, Hadoop, etc. Design and deliver consumer-centric, high-performance systems. You will be dealing with huge volumes of data arriving through batch and streaming platforms. You will be responsible for building and delivering data pipelines that process, transform, integrate, and enrich data to meet various demands from the business. Mentor the team on infrastructure, networking, data migration, monitoring, and troubleshooting aspects. Focus on automation using Infrastructure as Code (IaC), Jenkins, DevOps, etc. Design, build, test, and deploy streaming pipelines for data processing in real time and at scale. Experience with stream-processing systems like Storm, Spark Streaming, Flink, etc. Experience with object-oriented/object-function scripting languages: Scala, Java, etc. Develop software systems using test-driven development, employing CI/CD practices. Partner with other engineers and team members to develop software that meets business needs. Follow Agile methodology for software development and technical documentation. Banking/finance domain knowledge is good to have. Strong written and oral communication, presentation, and interpersonal skills. Exceptional analytical, conceptual, and problem-solving abilities. Able to prioritize and execute tasks in a high-pressure environment. Experience working in a team-oriented, collaborative environment. 8-10 years of hands-on coding experience. Proficient in Java, with a good knowledge of its ecosystem. Experience writing Spark code in Scala. Experience with big data tools like Sqoop, Hive, Pig, Hue. Solid understanding of object-oriented programming and HDFS concepts. Familiar with various design and architectural patterns. Experience with big data tools: Hadoop, Spark, Kafka, Flink, Hive, Sqoop, etc.
Experience with relational SQL and NoSQL databases like MySQL, PostgreSQL, MongoDB, and Cassandra. Experience with data pipeline tools like Airflow, etc. Experience with AWS cloud services: EC2, S3, EMR, RDS, Redshift, BigQuery. Experience with stream-processing systems: Storm, Spark Streaming, Flink, etc. Experience with object-oriented/object-function scripting languages: Python, Java, Scala, etc. Expertise in designing/developing platform components like caching, messaging, event processing, automation, transformation, and tooling frameworks. Location: Pune/Mumbai/Bangalore/Chennai
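The streaming-pipeline experience this posting asks for centers on windowed aggregation over unbounded event streams. As a rough, self-contained sketch (plain Python standing in for engines like Spark Streaming or Flink; the function name and event shape are illustrative, not from any real API), a tumbling-window count looks like:

```python
from collections import defaultdict

def tumbling_window_counts(events, window_size):
    """Group (timestamp, key) events into fixed, non-overlapping
    windows and count occurrences per key: the core idea behind
    windowed aggregations in stream-processing engines."""
    windows = defaultdict(lambda: defaultdict(int))
    for ts, key in events:
        window_start = (ts // window_size) * window_size  # bucket the event
        windows[window_start][key] += 1
    return {start: dict(counts) for start, counts in sorted(windows.items())}

events = [(1, "click"), (3, "view"), (7, "click"), (12, "click")]
print(tumbling_window_counts(events, 10))
# {0: {'click': 2, 'view': 1}, 10: {'click': 1}}
```

Real engines layer event-time watermarks and late-data handling on top of this basic bucketing.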
Posted 1 week ago
8.0 - 12.0 years
4 - 8 Lacs
Pune
Work from Office
Job Information Job Opening ID ZR_1581_JOB Date Opened 25/11/2022 Industry Technology Job Type Work Experience 8-12 years Job Title Senior Specialist - Data Engineer City Pune Province Maharashtra Country India Postal Code 411001 Number of Positions 4 Location: Pune/Mumbai/Bangalore/Chennai Roles & Responsibilities: Total 8-10 years of working experience. 8-10 years of experience with big data tools like Spark, Kafka, Hadoop, etc. Design and deliver consumer-centric, high-performance systems. You will be dealing with huge volumes of data arriving through batch and streaming platforms. You will be responsible for building and delivering data pipelines that process, transform, integrate, and enrich data to meet various demands from the business. Mentor the team on infrastructure, networking, data migration, monitoring, and troubleshooting aspects. Focus on automation using Infrastructure as Code (IaC), Jenkins, DevOps, etc. Design, build, test, and deploy streaming pipelines for data processing in real time and at scale. Experience with stream-processing systems like Storm, Spark Streaming, Flink, etc. Experience with object-oriented/object-function scripting languages: Scala, Java, etc. Develop software systems using test-driven development, employing CI/CD practices. Partner with other engineers and team members to develop software that meets business needs. Follow Agile methodology for software development and technical documentation. Banking/finance domain knowledge is good to have. Strong written and oral communication, presentation, and interpersonal skills.
Exceptional analytical, conceptual, and problem-solving abilities. Able to prioritize and execute tasks in a high-pressure environment. Experience working in a team-oriented, collaborative environment. 8-10 years of hands-on coding experience. Proficient in Java, with a good knowledge of its ecosystem. Experience writing Spark code in Scala. Experience with big data tools like Sqoop, Hive, Pig, Hue. Solid understanding of object-oriented programming and HDFS concepts. Familiar with various design and architectural patterns. Experience with big data tools: Hadoop, Spark, Kafka, Flink, Hive, Sqoop, etc. Experience with relational SQL and NoSQL databases like MySQL, PostgreSQL, MongoDB, and Cassandra. Experience with data pipeline tools like Airflow, etc. Experience with AWS cloud services: EC2, S3, EMR, RDS, Redshift, BigQuery. Experience with stream-processing systems: Storm, Spark Streaming, Flink, etc. Experience with object-oriented/object-function scripting languages: Python, Java, Scala, etc. Expertise in designing/developing platform components like caching, messaging, event processing, automation, transformation, and tooling frameworks.
Posted 1 week ago
5.0 - 10.0 years
12 - 16 Lacs
Chennai, Bengaluru
Work from Office
Who We Are Applied Materials is the global leader in materials engineering solutions used to produce virtually every new chip and advanced display in the world. We design, build and service cutting-edge equipment that helps our customers manufacture display and semiconductor chips - the brains of devices we use every day. As the foundation of the global electronics industry, Applied enables the exciting technologies that literally connect our world - like AI and IoT. If you want to work beyond the cutting-edge, continuously pushing the boundaries of science and engineering to make possible the next generations of technology, join us to Make Possible® a Better Future. What We Offer Location: Bangalore, IND / Chennai, IND At Applied, we prioritize the well-being of you and your family and encourage you to bring your best self to work. Your happiness, health, and resiliency are at the core of our benefits and wellness programs. Our robust total rewards package makes it easier to take care of your whole self and your whole family. We're committed to providing programs and support that encourage personal and professional growth and care for you at work, at home, or wherever you may go. Learn more about our benefits. You'll also benefit from a supportive work culture that encourages you to learn, develop and grow your career as you take on challenges and drive innovative solutions for our customers. We empower our team to push the boundaries of what is possible while learning every day in a supportive leading global company. Visit our Careers website to learn more about careers at Applied. Technical Lead - Software About Applied Applied Materials is the leader in materials engineering solutions used to produce virtually every new chip and advanced display in the world. Our expertise in modifying materials at atomic levels and on an industrial scale enables customers to transform possibilities into reality.
At Applied Materials, our innovations make possible the technology shaping the future. Our Team Our team is developing a high-performance computing solution for low-latency and high-throughput image processing and deep-learning workloads that will enable our chip manufacturing process control equipment to offer differentiated value to our customers. Your Opportunity As a technical lead, you will get the opportunity to grow in the field of high-performance computing, complex system design, and low-level optimizations for a better cost of ownership. Roles and Responsibility As a technical lead, you will be responsible for designing and implementing high-performance computing software solutions for our organization. You will work closely with cross-functional teams, including software engineers, product managers, and business stakeholders, to understand requirements and translate them into architectural/software designs that meet business needs. You will be a subject-matter expert who unblocks software engineers in the HPC domain. You will be expected to profile systems to understand bottlenecks, and to optimize workflows, code, and processes to improve cost of ownership. Identify and mitigate technical risks and issues throughout the software development lifecycle. Lead the design and implementation of complex software components and systems. Ensure that software systems are scalable, reliable, and maintainable. Mentor and coach junior software engineers. Your primary focus will be on implementing features of high quality with maintainable and extendable code, following software development best practices. Our Ideal Candidate Someone who has the drive and passion to learn quickly, and the ability to multi-task and switch contexts based on business needs. Qualifications 5 to 10 years of experience in design and coding in C/C++, preferably in a Linux environment. Very good knowledge of data structures, algorithms, and complexity analysis.
In-depth experience in multi-threading, thread synchronization, inter-process communication, and distributed computing fundamentals. Very good knowledge of operating system internals (Linux preferred), networking, and storage systems. Experience in performance profiling at the application and system level (e.g., VTune, OProfile, perf, Nvidia Nsight, etc.). Experience in low-level code optimization techniques using vectorization and intrinsics, cache-aware programming, lock-free data structures, etc. Excellent problem-solving and analytical skills. Strong communication and collaboration abilities. Ability to mentor and coach junior team members. Experience in Agile development methodologies. Additional Qualifications: Experience in GPU programming using CUDA, OpenMP, OpenACC, OpenCL, etc. Good knowledge of workflow orchestration software like Apache Airflow, Apache Spark, Apache Storm, or Intel TBB flow graph, etc. Experience in developing distributed high-performance computing software using parallel programming frameworks like MPI, UCX, etc. Experience in HPC job-scheduling and cluster management software (SLURM, Torque, LSF, etc.). Good knowledge of low-latency and high-throughput data transfer technologies (RDMA, RoCE, InfiniBand). Familiarity with microservices architecture, containerization technologies (Docker/Singularity), and low-latency message queues. Education: Bachelor's degree or higher in Computer Science or related disciplines. Applied Materials is committed to diversity in its workforce including Equal Employment Opportunity for Minorities, Females, Protected Veterans and Individuals with Disabilities. Additional Information Time Type: Full time Employee Type: Assignee / Regular Travel: Yes, 10% of the Time Relocation Eligible: Yes Applied Materials is an Equal Opportunity Employer.
Qualified applicants will receive consideration for employment without regard to race, color, national origin, citizenship, ancestry, religion, creed, sex, sexual orientation, gender identity, age, disability, veteran or military status, or any other basis prohibited by law.
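The multi-threading and thread-synchronization fundamentals this role describes can be illustrated with a toy fork-join pattern. This is a sketch only, shown in Python for brevity even though the role itself calls for C/C++; all names here are hypothetical:

```python
import queue
import threading

def parallel_square(numbers, workers=4):
    """Toy fork-join: worker threads drain a shared task queue.
    queue.Queue provides the producer/consumer synchronization;
    an explicit lock protects the shared results dict."""
    tasks = queue.Queue()
    results = {}
    lock = threading.Lock()

    for i, n in enumerate(numbers):
        tasks.put((i, n))            # all work enqueued before workers start

    def worker():
        while True:
            try:
                i, n = tasks.get_nowait()
            except queue.Empty:
                return               # queue drained: this worker is done
            with lock:               # guard the shared results dict
                results[i] = n * n

    threads = [threading.Thread(target=worker) for _ in range(workers)]
    for t in threads:
        t.start()
    for t in threads:                # join = the "fork-join" barrier
        t.join()
    return [results[i] for i in range(len(numbers))]

print(parallel_square([1, 2, 3, 4]))  # [1, 4, 9, 16]
```

A C/C++ equivalent would use pthreads or std::thread with a mutex-guarded work queue; the synchronization structure is the same.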
Posted 2 weeks ago
5.0 - 7.0 years
2 - 5 Lacs
Pune
Work from Office
Job Title: Data Engineer Experience: 5-7 Years Location: Pune Roles & Responsibilities: Create and maintain optimal data pipeline architecture. Build data pipelines that transform raw, unstructured data into formats that data analysts can use for analysis. Assemble large, complex data sets that meet functional/non-functional business requirements. Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc. Build the infrastructure required for optimal extraction, transformation, and delivery of data from a wide variety of data sources using SQL and AWS 'Big Data' technologies. Work with stakeholders including the Executive, Product, and Program teams to assist with data-related technical issues and support their data infrastructure needs. Work with data and analytics experts to strive for greater functionality in our data systems. Develops and maintains scalable data pipelines and builds out new integrations and processes required for optimal extraction, transformation, and loading of data from a wide variety of data sources using HQL and 'Big Data' technologies. Implements processes and systems to validate data and monitor data quality, ensuring production data is always accurate and available for key stakeholders and business processes that depend on it. Write unit/integration tests, contribute to the engineering wiki, and document work. Performs root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement. Who You Are: You're passionate about data and building efficient data pipelines. You have excellent listening skills and are empathetic to others. You believe in simple and elegant solutions and give paramount importance to quality. You have a track record of building fast, reliable, and high-quality data pipelines. Passionate, with a good understanding of data and a focus on having fun while delivering incredible business results.
Must-have skills: A Data Engineer with 5+ years of relevant experience who is excited to apply their current skills and to grow their knowledge base. A Data Engineer who has attained a degree in Computer Science, Statistics, Informatics, Information Systems, or another quantitative field. Has experience using the following software/tools: Experience with big data tools: Hadoop, Spark, Kafka, Hive, etc. Experience with relational SQL and NoSQL databases, including Postgres and Cassandra. Experience with data pipeline and workflow management tools. Experience with AWS cloud services: EC2, EMR, RDS, Redshift. Experience with object-oriented/object-function scripting languages: Python, Java, Scala, etc. Experience with Airflow/Oozie. Experience in AWS/Spark/Python development. Experience in Git, JIRA, Jenkins, shell scripting. Familiar with Agile methodology, test-driven development, source control management, and automated testing. Build processes supporting data transformation, data structures, metadata, dependencies, and workload management. Experience supporting and working with cross-functional teams in a dynamic environment. Nice-to-have skills: Experience with stream-processing systems: Storm, Spark Streaming, etc. a plus. Experience with Snowflake
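The responsibility of transforming raw, unstructured data into analyst-ready formats is, at its core, a validate-and-normalize step. A minimal sketch, assuming hypothetical row and field names (not from any specific pipeline):

```python
def transform(raw_rows):
    """Toy ETL transform: drop malformed rows and normalize fields,
    the kind of cleansing a pipeline applies before loading."""
    clean = []
    for row in raw_rows:
        if not row.get("user_id") or row.get("amount") is None:
            continue                      # validation: skip incomplete rows
        clean.append({
            "user_id": str(row["user_id"]).strip(),   # normalize whitespace
            "amount": round(float(row["amount"]), 2), # coerce and round
        })
    return clean

raw = [
    {"user_id": " u1 ", "amount": "19.991"},
    {"user_id": None, "amount": "5"},     # rejected: missing key
    {"user_id": "u2", "amount": 7},
]
print(transform(raw))
# [{'user_id': 'u1', 'amount': 19.99}, {'user_id': 'u2', 'amount': 7.0}]
```

In a production pipeline the same logic would typically run as a Spark or HQL transformation rather than a Python loop, with rejected rows routed to a quarantine table for the data-quality monitoring the posting describes.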
Posted 3 weeks ago
12.0 - 16.0 years
18 - 25 Lacs
Hyderabad
Remote
JD for Fullstack Developer. Exp: 10+ yrs Front End Development Design and implement intuitive and responsive user interfaces using React.js or similar front-end technologies. Collaborate with stakeholders to create a seamless user experience. Create mockups and UI prototypes for quick turnaround using Figma, Canva, or similar tools. Strong proficiency in HTML, CSS, JavaScript, and React.js. Experience with styling and graph libraries such as Highcharts, Material UI, and Tailwind CSS. Solid understanding of React fundamentals, including Routing, Virtual DOM, and Higher-Order Components (HOC). Knowledge of REST API integration. Understanding of Node.js is a big advantage. Middleware Development Experience with REST API development, preferably using FastAPI. Proficiency in programming languages like Python. Integrate APIs and services between front-end and back-end systems. Experience with Docker and containerized applications. Back End Development Experience with orchestration tools such as Apache Airflow or similar. Design, develop, and manage simple data pipelines using Databricks, PySpark, and Google BigQuery. Medium-level expertise in SQL. Basic understanding of authentication methods such as JWT and OAuth. Bonus Skills Experience working with cloud platforms such as AWS, GCP, or Azure. Familiarity with Google BigQuery and Google APIs. Hands-on experience with Kubernetes for container orchestration. Contact: Sandeep Nunna Ph No: 9493883212 Email: sandeep.nunna@clonutsolutions.com
Posted 3 weeks ago
4.0 - 9.0 years
4 - 8 Lacs
Pune
Work from Office
EDUCATION & EXPERIENCE : - A professional degree in Computer Science from a reputable institution, backed by a consistent academic record. - A knack for problem-solving, data structures, and algorithms. - Proficiency in ElasticSearch. - 4+ years of hands-on development experience, primarily in building products for large enterprises. - Exceptional communication skills. - Mastery in Java programming; familiarity with Python is a plus. - Experience with Spring Boot. - Practical knowledge of one or more cloud-based technologies (e.g., ElasticSearch, Storm, Hazelcast, MongoDB, Ceph, Kafka) is highly desirable. - Expertise in building concurrent and/or parallelized, highly performant scalable applications. - A track record of identifying and addressing complex issues in scalable deployments. - Exposure to Service-Oriented Architecture (SOA) and Test-Driven Development (TDD) is an added advantage. ROLES & RESPONSIBILITIES : - Dive deep into technical aspects (Analysis, Design & Implementation) as required. - Take complete ownership of features within the product. - Engage in debates and detailed discussions about functional and non-functional requirements with our Product Management team. - Collaborate with the team to design solutions, seeking stakeholder input before implementation. - Create essential artifacts such as functional specifications and detailed designs for your assigned features. - Implement intricate features with an unwavering commitment to quality, following the Test-Driven Development (TDD) process. - Maintain open lines of communication, promptly reporting risks and progress to your supervising manager. - Share your expertise and mentor team members. - Provide support by troubleshooting and creating Root Cause Analysis (RCA) for production issues, subsequently working on short-term and long-term solutions.
Posted 3 weeks ago
9 - 11 years
37 - 40 Lacs
Ahmedabad, Bengaluru, Mumbai (All Areas)
Work from Office
Dear Candidate, We are hiring a Scala Developer to work on high-performance distributed systems, leveraging the power of functional and object-oriented paradigms. This role is perfect for engineers passionate about clean code, concurrency, and big data pipelines. Key Responsibilities: Build scalable backend services using Scala and the Play or Akka frameworks. Write concurrent and reactive code for high-throughput applications. Integrate with Kafka, Spark, or Hadoop for data processing. Ensure code quality through unit tests and property-based testing. Work with microservices, APIs, and cloud-native deployments. Required Skills & Qualifications: Proficient in Scala, with a strong grasp of functional programming. Experience with Akka, Play, or Cats. Familiarity with Big Data tools and RESTful API development. Bonus: Experience with ZIO, Monix, or Slick. Soft Skills: Strong troubleshooting and problem-solving skills. Ability to work independently and in a team. Excellent communication and documentation skills. Note: If interested, please share your updated resume and preferred time for a discussion. If shortlisted, our HR team will contact you. Kandi Srinivasa Reddy Delivery Manager Integra Technologies
Posted 1 month ago
6 - 9 years
32 - 35 Lacs
Chennai, Noida, Kolkata
Work from Office
Dear Candidate, We are hiring a Ruby on Rails Developer to create clean, scalable web applications and APIs. You will work on backend systems and contribute to full-stack development using Ruby and modern web technologies. Key Responsibilities: Design and develop web applications using Ruby on Rails. Build RESTful APIs and integrate third-party services. Optimize applications for performance, scalability, and security. Collaborate with frontend and DevOps teams for seamless deployments. Write unit tests using RSpec and maintain code quality. Required Skills & Qualifications: Strong experience in Ruby, Ruby on Rails (RoR). Familiar with PostgreSQL, Redis, Sidekiq. Experience with JavaScript/Stimulus.js, Turbo, Tailwind CSS. Comfortable with Git, CI/CD pipelines, and containerized environments. Soft Skills: Strong troubleshooting and problem-solving skills. Ability to work independently and in a team. Excellent communication and documentation skills. Note: If interested, please share your updated resume and preferred time for a discussion. If shortlisted, our HR team will contact you. Srinivasa Reddy Kandi Delivery Manager Integra Technologies
Posted 2 months ago
8 - 11 years
45 - 50 Lacs
Chennai, Noida, Kolkata
Work from Office
Dear Candidate, We are hiring a Scala Developer to work on scalable data pipelines, distributed systems, and backend services. This role is perfect for candidates passionate about functional programming and big data. Key Responsibilities: Develop data-intensive applications using Scala. Work with frameworks like Akka, Play, or Spark. Design and maintain scalable microservices and ETL jobs. Collaborate with data engineers and platform teams. Write clean, testable, and well-documented code. Required Skills & Qualifications: Strong in Scala, Functional Programming, and JVM internals. Experience with Apache Spark, Kafka, or Cassandra. Familiar with SBT, Cats, or Scalaz. Knowledge of CI/CD, Docker, and cloud deployment tools. Soft Skills: Strong troubleshooting and problem-solving skills. Ability to work independently and in a team. Excellent communication and documentation skills. Note: If interested, please share your updated resume and preferred time for a discussion. If shortlisted, our HR team will contact you. Kandi Srinivasa Delivery Manager Integra Technologies
Posted 2 months ago
4 - 6 years
6 - 10 Lacs
Gurgaon
Work from Office
Skills: software development, predominantly using Microsoft .NET (C#), C#, ASP.NET MVC, ASP.NET Core, Web API, OOP, Design Patterns and SOLID principles, MS SQL or other relational databases, good communication Required Candidate profile Notice Period: 0-30 days Education: B.Tech or BCA/MCA equivalent Full-time Employment
Posted 2 months ago
7 - 10 years
20 - 22 Lacs
Chennai, Pune, Noida
Work from Office
Experience in Java, Apache Kafka, Streams, Clusters, Application Development, Topic Management, Data Pipeline Development, Producer & Consumer Implementation, Integration & Connectivity, Cluster Administration, Security & Compliance, Apache ZooKeeper Required Candidate profile 7-10 years' experience in Kafka expertise, programming skills, big data & streaming technologies, database knowledge, cloud & DevOps, event-driven architecture, security & scalability, problem solving & teamwork
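The producer and consumer implementation this role mentions rests on Kafka's model of an append-only log that each consumer group reads at its own offset. A toy in-memory stand-in (deliberately not the real Kafka client API; the class and method names are invented for illustration) can make that model concrete:

```python
class ToyTopic:
    """In-memory stand-in for a Kafka topic: an append-only log that
    each consumer group reads at an independent offset."""
    def __init__(self):
        self.log = []        # the append-only message log
        self.offsets = {}    # group name -> next offset to read

    def produce(self, message):
        self.log.append(message)

    def consume(self, group):
        offset = self.offsets.get(group, 0)
        if offset >= len(self.log):
            return None                  # nothing new for this group
        self.offsets[group] = offset + 1  # commit the offset
        return self.log[offset]

topic = ToyTopic()
topic.produce("order-1")
topic.produce("order-2")
print(topic.consume("billing"))   # order-1
print(topic.consume("billing"))   # order-2
print(topic.consume("shipping"))  # order-1  (independent offset)
```

Real Kafka adds partitions, replication, and durable offset commits, but the decoupling shown here, where producers never wait for consumers and each group tracks its own position, is the core of the event-driven architectures the posting lists.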
Posted 2 months ago
7 - 9 years
4 - 8 Lacs
Ahmedabad
Work from Office
Project Role : Software Development Engineer Project Role Description : Analyze, design, code and test multiple components of application code across one or more clients. Perform maintenance, enhancements and/or development work. Must have skills : Apache Kafka Good to have skills : NA Minimum 7.5 year(s) of experience is required Educational Qualification : Minimum 15 years of full-time education Summary: As a Software Development Engineer, you will be responsible for analyzing, designing, coding, and testing multiple components of application code using Apache Kafka. Your typical day will involve performing maintenance, enhancements, and/or development work across one or more clients in Ahmedabad. Roles & Responsibilities: Design, develop, and maintain high-performance, scalable, and fault-tolerant distributed systems using Apache Kafka. Collaborate with cross-functional teams to identify and resolve complex technical issues, ensuring the highest levels of availability, performance, and security. Develop and maintain automated tests to ensure the quality of the software, including unit tests, integration tests, and end-to-end tests. Participate in code reviews, ensuring adherence to coding standards, best practices, and design patterns. Continuously learn and stay up-to-date with the latest technologies and industry trends, applying innovative approaches for sustained competitive advantage. Professional & Technical Skills: Must Have Skills: Expertise in Apache Kafka. Good To Have Skills: Experience with other distributed systems such as Apache Spark, Apache Flink, or Apache Storm. Strong understanding of distributed systems, including fault tolerance, scalability, and performance. Experience with Java or Scala programming languages. Experience with containerization technologies such as Docker and Kubernetes. Solid grasp of software development best practices, including agile methodologies, continuous integration and delivery, and test-driven development.
Additional Information: The candidate should have a minimum of 7.5 years of experience in Apache Kafka. The ideal candidate will possess a strong educational background in computer science or a related field, along with a proven track record of delivering high-quality software solutions. This position is based at our Ahmedabad office. Qualification Minimum 15 years of fulltime education
Posted 2 months ago
5 - 10 years
7 - 14 Lacs
Bengaluru
Work from Office
About The Role: Role Purpose The purpose of the role is to liaise and bridge the gap between the customer and the Wipro delivery team: to comprehend and analyze customer requirements and articulate them aptly to delivery teams, thereby ensuring the right solution for the customer. Job Role & Responsibilities: As a Sr. Analytics Consultant, the individual contributor will be responsible for the performance of overall operations quality and speech analytics category design, leveraging AI/ML solutions to identify conversational AI use cases, optimize contact center interactions across channels, and improve end-user experience & advocacy for enterprise clients. 60% - QA automation: creating syntax, fine-tuning syntaxes, building categories, building scorecards, category validation, etc. 20% - Analyzing trends for process improvement, documentation, coordinating with business stakeholders to understand functional objectives and identify areas of opportunity, etc. 20% - Creating WBR/MBR slides, training, collaborating with cross-functional teams & attending client calls, etc. Ability to understand the organization's business objectives and goals, analyze recorded audio and audio-based data sets for critical insights, provide ongoing speech category design, determine opportunities through analysis of current or future trends, and collaborate on and synthesize analyses for the execution of scorecards. Structures an analytical approach to finding and solving core business problems, seeking to advance current efforts or processes. Clearly articulates thoughts and ideas in oral and written presentations. Contributes creative ideas, conducts thorough analysis to estimate risk/reward, assists other users with data analysis as needed, and works cross-functionally with various groups internally and externally. Owns the end-to-end process, from recognizing the problem to implementing the solution. Skills and Qualifications: 10+ years of relevant analytics experience. Graduates in any stream.
Analytical or quantitative field preferred. Skills with Verint, CallMiner, Nexidia, NICE, Uniphore, Calabrio, or other speech analytics packages preferred. Skills in extrapolating data, developing queries, ensuring standardization of external files, leading projects that relate directly to the speech analytics platform, and making recommendations based on an understanding of speech analytics, file structures, IT tools, and statistics to assess opportunities effectively. Ability to multi-task, respond well to pressure and deadlines, and work well individually and in a team environment; experience with text mining, parsing, and classification using state-of-the-art techniques. Experience with information retrieval, Natural Language Processing, Natural Language Understanding, and neural language modeling (BERT, LSTM, Transformers, LLMs, etc.). Hands-on with the Python data science ecosystem: Pandas, NumPy, SciPy, Scikit-learn, NLTK, Gensim, etc. Ability to evaluate the quality of ML models and to define the right performance metrics for models in accordance with the requirements of the core platform. Good experience with and deep understanding of both traditional and modern data architecture and processing concepts, including relational databases (e.g., SQL Server, MySQL, Oracle), data warehousing, big data (Hadoop, Spark, Storm), NoSQL, data ingestion (batch and real-time processing), and creating data pipelines. Hands-on with BI tools such as Tableau, Power BI, and PPT for client demos and presentations. Competencies: Client Centricity, Passion for Results, Collaborative Working, Problem Solving & Decision Making, Effective Communication
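The posting asks for the ability to define the right performance metrics for ML models. As a small self-contained illustration (plain Python, no ML library assumed; a real evaluation would use scikit-learn's metrics module), precision and recall for a binary classifier:

```python
def precision_recall(y_true, y_pred):
    """Compute precision (how many predicted positives are right)
    and recall (how many actual positives were found)."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    return precision, recall

# Toy labels for, say, a speech category classifier
print(precision_recall([1, 1, 0, 0, 1], [1, 0, 0, 1, 1]))
# (0.6666666666666666, 0.6666666666666666)
```

Which metric matters more depends on the use case: for flagging compliance violations in call audio, recall (missing few violations) usually outweighs precision.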
Posted 2 months ago
3 - 5 years
0 - 1 Lacs
Gurgaon
Remote
Role & responsibilities: Understand the values and vision of the organization. Protect the intellectual property. Adhere to all policies and procedures. Design, develop, and maintain scalable data pipelines for data ingestion, processing, and storage. Build and optimize data architectures and data models for efficient data storage and retrieval. Develop ETL processes to transform and load data from various sources into data warehouses and data lakes. Ensure data integrity, quality, and security across all data systems. Collaborate with data scientists, analysts, and other stakeholders to understand data requirements and deliver solutions that meet business needs. Monitor and troubleshoot data pipelines and workflows to ensure high availability and performance. Document data processes, architectures, and data flow diagrams. Implement and maintain data integration solutions using industry-standard tools and technologies (e.g., Apache Spark, Kafka, Airflow). Preferred candidate profile: Expertise in data integration, processing & storage. Expertise in data optimization architecture, data processes, and data flow. Knowledge of data integration tools like Apache Spark, Kafka & Airflow. Proficiency in SQL and at least one programming language (e.g., Python, Scala). Experience with cloud data platforms (e.g., AWS, Azure, GCP) and their data services. Experience with data visualization tools (e.g., Tableau, Power BI). Other Relevant Information: Bachelor's degree in Computer Science, Information Technology, or a related field. Minimum 3 years of experience in data engineering & architecture
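Pipelines in tools like Apache Airflow are expressed as DAGs of dependent tasks, and the scheduler runs them in topological order. A minimal sketch of that dependency resolution using only the Python standard library (the task names are hypothetical, and this is the dependency idea only, not Airflow's actual DAG API):

```python
from graphlib import TopologicalSorter

# Hypothetical pipeline: one extract feeds two transforms,
# which both feed a final load step.
dag = {
    "load":    {"clean", "enrich"},  # load runs after both transforms
    "clean":   {"extract"},
    "enrich":  {"extract"},
    "extract": set(),                # no upstream dependencies
}

order = list(TopologicalSorter(dag).static_order())
print(order)  # e.g. ['extract', 'clean', 'enrich', 'load']
```

Airflow does the same resolution per scheduling interval, additionally running independent tasks (here, `clean` and `enrich`) in parallel and retrying failures.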
Posted 3 months ago
8 - 13 years
18 - 27 Lacs
Bengaluru
Work from Office
About Persistent We are a trusted Digital Engineering and Enterprise Modernization partner, combining deep technical expertise and industry experience to help our clients anticipate what’s next. Our offerings and proven solutions create a unique competitive advantage for our clients by giving them the power to see beyond and rise above. We work with many industry-leading organizations across the world, including 12 of the 30 most innovative US companies, 80% of the largest banks in the US and India, and numerous innovators across the healthcare ecosystem. Our growth trajectory continues, as we reported $1,231M annual revenue (16% Y-o-Y). Along with our growth, we’ve onboarded over 4,900 new employees in the past year, bringing our total employee count to over 23,500 people located in 19 countries across the globe. Persistent Ltd. is dedicated to fostering diversity and inclusion in the workplace. We invite applications from all qualified individuals, including those with disabilities, and regardless of gender or gender preference. We welcome diverse candidates from all backgrounds. For more details, please visit www.persistent.com About The Position We are looking for a Data Architect with creativity and results-oriented critical thinking to meet complex challenges and develop new strategies for acquiring, analyzing, modeling, and storing data. In this role you will guide the company into the future and utilize the latest technology and information management methodologies to meet our requirements for effective logical data modeling, metadata management, and data warehouse domains. You will be working with experts in a variety of industries, including computer science and software development, as well as department heads and senior executives, to integrate new technologies and refine system performance. We reward dedicated performance with exceptional pay and benefits, as well as tuition reimbursement and career growth opportunities.
What You'll Do
- Define data retention policies
- Monitor performance and advise on any necessary infrastructure changes
- Mentor junior engineers and work with other architects to deliver best-in-class solutions
- Implement ETL/ELT processes and orchestration of data flows
- Recommend and drive adoption of newer tools and techniques from the big data ecosystem
Expertise You'll Bring
- 10+ years in industry, building and managing big data systems
- Building, monitoring, and optimizing reliable and cost-efficient pipelines for SaaS is a must
- Building stream-processing systems, using solutions such as Storm or Spark Streaming
- Dealing and integrating with data storage systems like SQL and NoSQL databases, file systems, and object storage like S3
- Reporting solutions like Pentaho, Power BI, and Looker, including customizations
- Developing high-concurrency, high-performance applications that are database-intensive and have interactive, browser-based clients
- Working with SaaS-based data management products will be an added advantage
- Proficiency and expertise in Cloudera / Hortonworks, Spark, HDF, and NiFi
- RDBMS and NoSQL stores like Vertica and Redshift; data modelling with physical design and SQL performance optimization
- Messaging systems: JMS, ActiveMQ, RabbitMQ, Kafka
- Big data technology like Hadoop, Spark, and NoSQL-based data-warehousing solutions
- Data warehousing and reporting, including customization; Hadoop, Spark, Kafka, core Java, Spring/IoC, design patterns
- Big data querying tools, such as Pig, Hive, and Impala
- Open-source technologies and databases (SQL & NoSQL)
- Proficient understanding of distributed computing principles
- Ability to solve any ongoing issues with operating the cluster
- Scale data pipelines using open-source components and AWS services
- Cloud (AWS): provisioning, capacity planning, and performance analysis at various levels
- Web-based SOA architecture implementation with design pattern experience will be an added advantage
Benefits
- Competitive salary and benefits package
- Culture focused on talent development with quarterly promotion cycles and company-sponsored higher education and certifications
- Opportunity to work with cutting-edge technologies
- Employee engagement initiatives such as project parties, flexible work hours, and Long Service awards
- Annual health check-ups
- Insurance coverage: group term life, personal accident, and Mediclaim hospitalization for self, spouse, two children, and parents
Inclusive Environment
- We offer hybrid work options and flexible working hours to accommodate various needs and preferences.
- Our office is equipped with accessible facilities, including adjustable workstations, ergonomic chairs, and assistive technologies to support employees with physical disabilities.
Let's unleash your full potential. See Beyond, Rise Above.
Posted 3 months ago
8 - 13 years
18 - 30 Lacs
Pune
Work from Office
About Persistent We are a trusted Digital Engineering and Enterprise Modernization partner, combining deep technical expertise and industry experience to help our clients anticipate what’s next. Our offerings and proven solutions create a unique competitive advantage for our clients by giving them the power to see beyond and rise above. We work with many industry-leading organizations across the world, including 12 of the 30 most innovative US companies, 80% of the largest banks in the US and India, and numerous innovators across the healthcare ecosystem. Our growth trajectory continues, as we reported $1,231M annual revenue (16% Y-o-Y). Along with our growth, we’ve onboarded over 4,900 new employees in the past year, bringing our total employee count to over 23,500 people located in 19 countries across the globe. Persistent Ltd. is dedicated to fostering diversity and inclusion in the workplace. We invite applications from all qualified individuals, including those with disabilities, and regardless of gender or gender preference. We welcome diverse candidates from all backgrounds. For more details, please visit www.persistent.com About The Position We are looking for a Data Architect with creativity and results-oriented critical thinking to meet complex challenges and develop new strategies for acquiring, analyzing, modeling, and storing data. In this role you will guide the company into the future and utilize the latest technology and information management methodologies to meet our requirements for effective logical data modeling, metadata management, and data warehouse domains. You will be working with experts in a variety of industries, including computer science and software development, as well as department heads and senior executives, to integrate new technologies and refine system performance. We reward dedicated performance with exceptional pay and benefits, as well as tuition reimbursement and career growth opportunities.
What You'll Do
- Define data retention policies
- Monitor performance and advise on any necessary infrastructure changes
- Mentor junior engineers and work with other architects to deliver best-in-class solutions
- Implement ETL/ELT processes and orchestration of data flows
- Recommend and drive adoption of newer tools and techniques from the big data ecosystem
Expertise You'll Bring
- 10+ years in industry, building and managing big data systems
- Building, monitoring, and optimizing reliable and cost-efficient pipelines for SaaS is a must
- Building stream-processing systems, using solutions such as Storm or Spark Streaming
- Dealing and integrating with data storage systems like SQL and NoSQL databases, file systems, and object storage like S3
- Reporting solutions like Pentaho, Power BI, and Looker, including customizations
- Developing high-concurrency, high-performance applications that are database-intensive and have interactive, browser-based clients
- Working with SaaS-based data management products will be an added advantage
- Proficiency and expertise in Cloudera / Hortonworks, Spark, HDF, and NiFi
- RDBMS and NoSQL stores like Vertica and Redshift; data modelling with physical design and SQL performance optimization
- Messaging systems: JMS, ActiveMQ, RabbitMQ, Kafka
- Big data technology like Hadoop, Spark, and NoSQL-based data-warehousing solutions
- Data warehousing and reporting, including customization; Hadoop, Spark, Kafka, core Java, Spring/IoC, design patterns
- Big data querying tools, such as Pig, Hive, and Impala
- Open-source technologies and databases (SQL & NoSQL)
- Proficient understanding of distributed computing principles
- Ability to solve any ongoing issues with operating the cluster
- Scale data pipelines using open-source components and AWS services
- Cloud (AWS): provisioning, capacity planning, and performance analysis at various levels
- Web-based SOA architecture implementation with design pattern experience will be an added advantage
Benefits
- Competitive salary and benefits package
- Culture focused on talent development with quarterly promotion cycles and company-sponsored higher education and certifications
- Opportunity to work with cutting-edge technologies
- Employee engagement initiatives such as project parties, flexible work hours, and Long Service awards
- Annual health check-ups
- Insurance coverage: group term life, personal accident, and Mediclaim hospitalization for self, spouse, two children, and parents
Inclusive Environment
- We offer hybrid work options and flexible working hours to accommodate various needs and preferences.
- Our office is equipped with accessible facilities, including adjustable workstations, ergonomic chairs, and assistive technologies to support employees with physical disabilities.
Let's unleash your full potential. See Beyond, Rise Above.
Posted 3 months ago
8 - 13 years
18 - 25 Lacs
Hyderabad
Work from Office
About Persistent We are a trusted Digital Engineering and Enterprise Modernization partner, combining deep technical expertise and industry experience to help our clients anticipate what’s next. Our offerings and proven solutions create a unique competitive advantage for our clients by giving them the power to see beyond and rise above. We work with many industry-leading organizations across the world, including 12 of the 30 most innovative US companies, 80% of the largest banks in the US and India, and numerous innovators across the healthcare ecosystem. Our growth trajectory continues, as we reported $1,231M annual revenue (16% Y-o-Y). Along with our growth, we’ve onboarded over 4,900 new employees in the past year, bringing our total employee count to over 23,500 people located in 19 countries across the globe. Persistent Ltd. is dedicated to fostering diversity and inclusion in the workplace. We invite applications from all qualified individuals, including those with disabilities, and regardless of gender or gender preference. We welcome diverse candidates from all backgrounds. For more details, please visit www.persistent.com About The Position We are looking for a Data Architect with creativity and results-oriented critical thinking to meet complex challenges and develop new strategies for acquiring, analyzing, modeling, and storing data. In this role you will guide the company into the future and utilize the latest technology and information management methodologies to meet our requirements for effective logical data modeling, metadata management, and data warehouse domains. You will be working with experts in a variety of industries, including computer science and software development, as well as department heads and senior executives, to integrate new technologies and refine system performance. We reward dedicated performance with exceptional pay and benefits, as well as tuition reimbursement and career growth opportunities.
What You'll Do
- Define data retention policies
- Monitor performance and advise on any necessary infrastructure changes
- Mentor junior engineers and work with other architects to deliver best-in-class solutions
- Implement ETL/ELT processes and orchestration of data flows
- Recommend and drive adoption of newer tools and techniques from the big data ecosystem
Expertise You'll Bring
- 10+ years in industry, building and managing big data systems
- Building, monitoring, and optimizing reliable and cost-efficient pipelines for SaaS is a must
- Building stream-processing systems, using solutions such as Storm or Spark Streaming
- Dealing and integrating with data storage systems like SQL and NoSQL databases, file systems, and object storage like S3
- Reporting solutions like Pentaho, Power BI, and Looker, including customizations
- Developing high-concurrency, high-performance applications that are database-intensive and have interactive, browser-based clients
- Working with SaaS-based data management products will be an added advantage
- Proficiency and expertise in Cloudera / Hortonworks, Spark, HDF, and NiFi
- RDBMS and NoSQL stores like Vertica and Redshift; data modelling with physical design and SQL performance optimization
- Messaging systems: JMS, ActiveMQ, RabbitMQ, Kafka
- Big data technology like Hadoop, Spark, and NoSQL-based data-warehousing solutions
- Data warehousing and reporting, including customization; Hadoop, Spark, Kafka, core Java, Spring/IoC, design patterns
- Big data querying tools, such as Pig, Hive, and Impala
- Open-source technologies and databases (SQL & NoSQL)
- Proficient understanding of distributed computing principles
- Ability to solve any ongoing issues with operating the cluster
- Scale data pipelines using open-source components and AWS services
- Cloud (AWS): provisioning, capacity planning, and performance analysis at various levels
- Web-based SOA architecture implementation with design pattern experience will be an added advantage
Benefits
- Competitive salary and benefits package
- Culture focused on talent development with quarterly promotion cycles and company-sponsored higher education and certifications
- Opportunity to work with cutting-edge technologies
- Employee engagement initiatives such as project parties, flexible work hours, and Long Service awards
- Annual health check-ups
- Insurance coverage: group term life, personal accident, and Mediclaim hospitalization for self, spouse, two children, and parents
Inclusive Environment
- We offer hybrid work options and flexible working hours to accommodate various needs and preferences.
- Our office is equipped with accessible facilities, including adjustable workstations, ergonomic chairs, and assistive technologies to support employees with physical disabilities.
Let's unleash your full potential. See Beyond, Rise Above.
Posted 3 months ago
10 - 20 years
35 - 40 Lacs
Pune
Work from Office
The Senior Engineer is responsible for designing and developing entire engineering solutions to accomplish business goals. Key responsibilities of this role include ensuring that solutions are well architected, with maintainability and ease of testing built in from the outset, and that they can be integrated successfully into the end-to-end business process flow. They will have gained significant experience through multiple implementations and have begun to develop both depth and breadth in a number of engineering competencies. They have extensive knowledge of design and architectural patterns. They will provide engineering thought leadership within their teams, and will play a role in mentoring and coaching less experienced engineers.
Your key responsibilities
- Hands-on software development
- Solution design and architecture ownership
- Agile and Scrum delivery
- Contribute towards good software design
- Participate in daily stand-up meetings
- Strong communication with stakeholders
- Articulate issues and risks to management in a timely manner
- Train other team members to bring them up to speed
Your skills and experience
- Extensive experience with Java and related technologies such as Spring Core/Spring Boot/Hibernate/MyBatis
- Experience in developing applications using data processing frameworks such as Spring Batch, Apache Beam, Apache Storm
- Experience with a wide variety of open-source tools and frameworks: JMS/JPA/JAX-WS/JAX-RS/JAXB/JTA standards
- XML binding, parsers, and XML schemas/XPath/XSLT
- Experience with SSL/X.509 certificates/keystores
- Core Java concepts such as lambdas and functional programming, streams, generics, concurrency
- Memory management, tuning, and troubleshooting; experience with profiling and monitoring tools
- Knowledge of solution design and architecture, including UML, design patterns, refactoring, architecture decisions, quality attributes, and documentation
- Experience in Agile
- Experience with messaging and integration, patterns, REST, SOA
- Experience with build and deployment: Maven/Artifactory/TeamCity or Jenkins
- Unix scripting and hands-on experience
- Performance engineering: different types of tests, measurement, monitoring, tools; performance tuning and troubleshooting
- Knowledge of emerging trends and technologies
- Experience with end-to-end design and delivery of solutions
- RDBMS (Oracle) design, development, tuning
Nice to have
- Experience with cloud technologies such as Docker, Kubernetes, OpenShift, Azure
- Experience with big data streaming technologies
- Experience with UI frameworks like Angular or React
- Additional languages such as Python, Scala
- Sun/Oracle or architecture-specific certifications
Educational Qualifications: Bachelor's or Master's in Computer Science or a relevant field.
Posted 3 months ago
10 - 12 years
30 - 35 Lacs
Mumbai
Work from Office
Key Responsibilities: Develop and implement IT strategies aligned with business goals and operational efficiency. Lead, mentor, and guide IT teams across both hardware and software divisions. Implement and manage cyber security measures for software platforms, applications, and IT infrastructure. Identify opportunities for digital innovation, automation, and system enhancements. Stay updated with emerging technologies (cloud computing, AI, IoT, etc.) and recommend practical implementations. Collaborate with external vendors and ensure smooth IT operations. Manage the IT budget, optimize resource allocation, and ensure cost-effective technology investments. Spearhead system integration and digital transformation projects. Manage project management tools (Asana, Jira, Zoho, ClickUp, Hive). Knowledge of compliance with data protection regulations and industry standards (DPDP, GDPR, etc.). Knowledge of COBIT, ISO 27001, ITIL. Preferred Qualifications: Education: B.E./M.E. in Computer Engineering or equivalent. Experience: 10-12 years in an IT leadership role, with hands-on experience in: e-commerce applications, trending technologies, cyber security, UI/UX design, AI/ML; Microsoft .NET (C#, ASP.NET, MVC, Core, etc.), Python, Android, Flutter; overseeing the development of mobile applications using Flutter for cross-platform deployment (Android and iOS) and native Android development; ensuring the integration of mobile apps with back-end web services (APIs), databases, and e-commerce platforms; IT infrastructure management, cloud platforms (AWS, Aurora), and data centre operations; architecting and designing large-scale applications using diverse technologies; software development, digital transformation, and system integration. Skills: Strong understanding of IT infrastructure, networking, cloud services, and cybersecurity. Expertise in architecting scalable applications and digital platforms.
Proven ability to lead and mentor a technical team. Passionate about transforming how fresh produce is bought and consumed in India. Commitment to sustainable agriculture and delivering fresh, safe food to customers.
Posted 3 months ago
5 - 10 years
7 - 12 Lacs
Hyderabad
Work from Office
Project Role : Application Developer Project Role Description : Design, build and configure applications to meet business process and application requirements. Must have skills : Apache Storm Good to have skills : NA Minimum 5 year(s) of experience is required Educational Qualification : 15 years full time education Summary : As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. You will be responsible for ensuring that the applications are developed according to specifications and delivered on time. Your typical day will involve collaborating with the team to understand the requirements, designing and coding the applications, and testing and debugging them to ensure they function properly. Additionally, you will be responsible for documenting the application design and providing support and maintenance as needed. Roles & Responsibilities: - Expected to be an SME, collaborate and manage the team to perform. - Responsible for team decisions. - Engage with multiple teams and contribute on key decisions. - Provide solutions to problems for their immediate team and across multiple teams. - Design and build applications according to business process and application requirements. - Collaborate with the team to understand the requirements and translate them into technical specifications. - Code and test applications to ensure they meet the specified requirements and function properly. - Debug and fix any issues or bugs in the applications. - Document the application design, including technical specifications and user manuals. - Provide support and maintenance for the applications, including troubleshooting and resolving any issues that arise. - Stay up-to-date with the latest industry trends and technologies to continuously improve the applications. - Train and mentor junior developers to enhance their skills and knowledge. 
Professional & Technical Skills: - Must-Have Skills: Proficiency in Apache Storm. - Strong understanding of distributed computing principles and concepts. - Experience with real-time data processing and stream processing frameworks. - Knowledge of data integration and ETL processes. - Familiarity with programming languages such as Java or Scala. - Good-To-Have Skills: Experience with Apache Kafka. - Experience with big data technologies such as Hadoop or Spark. - Knowledge of cloud platforms and services, such as AWS or Azure. Additional Information: - The candidate should have a minimum of 5 years of experience in Apache Storm. - This position is based at our Hyderabad office. - A 15 years full-time education is required. Qualifications: 15 years full-time education
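The real-time stream processing this posting centers on (a Storm topology aggregating events in windows) can be illustrated with a minimal pure-Python sketch of a tumbling-window count, the kind of logic a windowed bolt would run. Event data and function names here are invented for the example.

```python
from collections import defaultdict

def tumbling_window_counts(events, window_size):
    """Group (timestamp, key) events into fixed non-overlapping windows
    and count occurrences per key, as a Storm windowed bolt might."""
    windows = defaultdict(lambda: defaultdict(int))
    for ts, key in events:
        window_start = (ts // window_size) * window_size  # window the event falls in
        windows[window_start][key] += 1
    return {w: dict(counts) for w, counts in sorted(windows.items())}

events = [(0, "login"), (3, "click"), (4, "login"),
          (10, "click"), (12, "click")]
counts = tumbling_window_counts(events, window_size=10)
# → {0: {'login': 2, 'click': 1}, 10: {'click': 2}}
```

In an actual Storm topology the events would arrive from a spout and the counts would be emitted downstream per window; frameworks like Spark Streaming and Flink offer the same tumbling-window primitive.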
Posted 3 months ago
12 - 17 years
14 - 19 Lacs
Pune, Bengaluru
Work from Office
Project Role : Application Architect Project Role Description : Provide functional and/or technical expertise to plan, analyze, define and support the delivery of future functional and technical capabilities for an application or group of applications. Assist in facilitating impact assessment efforts and in producing and reviewing estimates for client work requests. Must have skills : Manufacturing Operations Good to have skills : NA Minimum 12 year(s) of experience is required Educational Qualification : BTech BE Job Title: Industrial Data Architect Summary: We are seeking a highly skilled and experienced Industrial Data Architect with a proven track record of providing functional and/or technical expertise to plan, analyze, define, and support the delivery of future functional and technical capabilities for an application or group of applications; well versed in OT data quality, data modelling, data governance, data contextualization, database design, and data warehousing. Must-have skills: Domain knowledge in areas of Manufacturing IT/OT in one or more of the following verticals: Automotive, Discrete Manufacturing, Consumer Packaged Goods, Life Science. Key Responsibilities: The Industrial Data Architect will be responsible for developing and overseeing the industrial data architecture strategies to support advanced data analytics, business intelligence, and machine learning initiatives. This role involves collaborating with various teams to design and implement efficient, scalable, and secure data solutions for industrial operations, focused on designing, building, and managing the data architecture of industrial systems. Assist in facilitating impact assessment efforts and in producing and reviewing estimates for client work requests. Own the offerings and assets on key components of the data supply chain: data governance, curation, data quality and master data management, data integration, data replication, and data virtualization.
Create scalable and secure data structures, integrating with existing systems and ensuring efficient data flow.
Qualifications:
Data Modeling and Architecture:
- Proficiency in data modeling techniques (conceptual, logical, and physical models).
- Knowledge of database design principles and normalization.
- Experience with data architecture frameworks and methodologies (e.g., TOGAF).
Database Technologies:
- Relational Databases: Expertise in SQL databases such as MySQL, PostgreSQL, Oracle, and Microsoft SQL Server.
- NoSQL Databases: Experience with at least one NoSQL database like MongoDB, Cassandra, or Couchbase for handling unstructured data.
- Graph Databases: Proficiency with at least one graph database such as Neo4j, Amazon Neptune, or ArangoDB. Understanding of graph data models, including property graphs and RDF (Resource Description Framework).
- Query Languages: Experience with at least one query language like Cypher (Neo4j), SPARQL (RDF), or Gremlin (Apache TinkerPop). Familiarity with ontologies, RDF Schema, and OWL (Web Ontology Language). Exposure to semantic web technologies and standards.
Data Integration and ETL (Extract, Transform, Load):
- Proficiency in ETL tools and processes (e.g., Talend, Informatica, Apache NiFi).
- Experience with data integration tools and techniques to consolidate data from various sources.
IoT and Industrial Data Systems:
- Familiarity with Industrial Internet of Things (IIoT) platforms and protocols (e.g., MQTT, OPC UA).
- Experience with IoT data platforms like AWS IoT, Azure IoT Hub, or Google Cloud IoT Core.
- Experience working with one or more streaming data platforms like Apache Kafka, Amazon Kinesis, or Apache Flink.
- Ability to design and implement real-time data pipelines. Familiarity with processing frameworks such as Apache Storm, Spark Streaming, or Google Cloud Dataflow.
- Understanding of event-driven design patterns and practices. Experience with message brokers like RabbitMQ or ActiveMQ.
- Exposure to edge computing platforms like AWS IoT Greengrass or Azure IoT Edge.
AI/ML, GenAI:
- Experience working on data readiness for feeding into AI/ML/GenAI applications.
- Exposure to machine learning frameworks such as TensorFlow, PyTorch, or Keras.
Cloud Platforms:
- Experience with cloud data services from at least one of the providers: AWS (Amazon Redshift, AWS Glue), Microsoft Azure (Azure SQL Database, Azure Data Factory), or Google Cloud Platform (BigQuery, Dataflow).
Data Warehousing and BI Tools:
- Expertise in data warehousing solutions (e.g., Snowflake, Amazon Redshift, Google BigQuery).
- Proficiency with Business Intelligence (BI) tools such as Tableau, Power BI, and QlikView.
Data Governance and Security:
- Understanding of data governance principles, data quality management, and metadata management.
- Knowledge of data security best practices, compliance standards (e.g., GDPR, HIPAA), and data masking techniques.
Big Data Technologies:
- Experience with big data platforms and tools such as Hadoop, Spark, and Apache Kafka.
- Understanding of distributed computing and data processing frameworks.
Excellent Communication: Superior written and verbal communication skills, with the ability to effectively articulate complex technical concepts to diverse audiences.
Problem-Solving Acumen: A passion for tackling intricate challenges and devising elegant solutions.
Collaborative Spirit: A track record of successful collaboration with cross-functional teams and stakeholders.
Certifications: AWS Certified Data Engineer Associate / Microsoft Certified: Azure Data Engineer Associate / Google Cloud Certified Professional Data Engineer certification is mandatory.
Minimum of 14-18 years of progressive information technology experience.
Qualifications: BTech BE
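The event-driven design patterns and message brokers this posting asks for can be illustrated with a minimal in-process publish/subscribe bus. This is a pure-Python sketch with invented topic and sensor names; a production system would put RabbitMQ, Kafka, or an MQTT broker in place of the in-memory dispatch.

```python
from collections import defaultdict

class EventBus:
    """Minimal in-process publish/subscribe bus illustrating the
    event-driven pattern: producers and consumers are decoupled by topic."""
    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, topic, handler):
        """Register a callable to receive every event published on a topic."""
        self._subscribers[topic].append(handler)

    def publish(self, topic, event):
        """Deliver an event to all handlers subscribed to the topic."""
        for handler in self._subscribers[topic]:
            handler(event)

bus = EventBus()
readings = []  # one consumer archives every reading
alerts = []    # another consumer reacts only to out-of-range values
bus.subscribe("sensor.temperature", readings.append)
bus.subscribe("sensor.temperature",
              lambda e: alerts.append(e) if e["value"] > 80 else None)
bus.publish("sensor.temperature", {"machine": "press-1", "value": 72})
bus.publish("sensor.temperature", {"machine": "press-1", "value": 91})
```

The key property is that the publisher never knows who is listening, which is what lets new consumers (analytics, alerting, archival) be added to an industrial data platform without touching the producers.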
Posted 3 months ago
3 - 8 years
5 - 10 Lacs
Bengaluru
Work from Office
Project Role : Application Developer Project Role Description : Design, build and configure applications to meet business process and application requirements. Must have skills : Apache Hadoop Good to have skills : NA Minimum 3 year(s) of experience is required Educational Qualification : Minimum 15 years full-time education; Engineering Graduate Summary: As an Application Developer, you will be responsible for designing, building, and configuring applications to meet business process and application requirements using Apache Hadoop. Your typical day will involve working with the Hadoop ecosystem, developing and testing applications, and troubleshooting issues. Roles & Responsibilities: Design, develop, and test Hadoop applications using Apache Hadoop ecosystem tools such as HDFS, MapReduce, Hive, Pig, and Spark. Collaborate with cross-functional teams to identify and prioritize business requirements, and translate them into technical specifications. Troubleshoot and debug Hadoop applications, and provide timely resolution to issues. Ensure the performance, scalability, and reliability of Hadoop applications by optimizing code and configurations, and implementing best practices. Stay updated with the latest advancements in Hadoop and big data technologies, and integrate innovative approaches for sustained competitive advantage. Professional & Technical Skills: Must-Have Skills: Strong experience in Apache Hadoop ecosystem tools such as HDFS, MapReduce, Hive, Pig, and Spark. Good-To-Have Skills: Experience with other big data technologies such as Apache Kafka, Apache Storm, and Apache Cassandra. Solid understanding of distributed computing principles and Hadoop architecture. Experience with programming languages such as Java, Scala, or Python. Experience with Linux/Unix operating systems and shell scripting. Experience with version control systems such as Git or SVN.
Additional Information: The candidate should have a minimum of 5+ years of experience in Apache Hadoop. The ideal candidate will possess a strong educational background in computer science or a related field, along with a proven track record of delivering impactful data-driven solutions. This position is based at our Hyderabad office. Qualifications: Minimum 15 years of full-time education; Engineering Graduate
Posted 3 months ago