
278 Yarn Jobs - Page 6

Set up a job alert
JobPe aggregates job listings for easy access, but you apply directly on the original job portal.

2.0 - 10.0 years

18 - 20 Lacs

Chennai

Work from Office

Lead Engineer is a hands-on development and technical oversight role responsible for application feature design, understanding the architecture, and ensuring delivery by leading the development team. The ideal candidate will have a solid understanding of modern software engineering practices, API design, and architecture, and be adept in recent design standards and trends. They should also have experience working with microservices-based backend systems and be familiar with the Next.js, React JS, and/or AEM stack. Prior experience working on the Google Cloud Platform is highly desirable.

Lead the development and implementation of responsive front-end UI architecture for micro frontends, ensuring high performance, scalability, and maintainability. Model the use of modern software engineering practices (BDD, CI/CD, shift-left, 12-factor applications, etc.), API design, and architecture to support integration with existing Ford software products as well as external cloud-based services. Collaborate with Frontend and Backend Architects to rationalize designs and translate them into implementation requirements. Work closely with product managers, designers, and backend engineers to ensure that the UI meets the needs of our users and is aligned with the overall product vision. Develop and maintain a robust and scalable UI architecture using Next.js / React JS as the primary technology. Work as an expert on UI design principles to develop, structure, and design digital apps across all screen sizes and devices. Lead and mentor a team of UI developers, providing technical guidance and ensuring best practices are followed. Create wireframes, mockups, prototypes, and documentation to communicate design ideas and concepts. Work with cross-functional teams to integrate UI components with backend services and APIs. Good experience with highly scalable applications and extensive knowledge of JavaScript design and architectural patterns.

Test and debug UI issues across different browsers and devices. Stay updated with the latest trends and technologies in UI development. Excellent communication and interpersonal skills; ability to work effectively in a remote/virtual setting with other global team members. Should collaborate with product managers, architects, and developers from other teams to develop and deliver the product within timelines. Effectively work with cross-functional teams across the organization, inside and outside of the technology and software organization. Strong JavaScript skills, including DOM manipulation and object-model know-how. Knowledge of popular React state-management libraries, such as Zustand or Redux, for efficient state management. Knowledge of performance optimization, Core Web Vitals, and SEO principles. Hands-on experience with modern web development tools such as ES6/Babel, React, Node.js, NPM, Yarn, and Webpack. Strong understanding of HTML5, CSS3, JavaScript, and TypeScript, along with knowledge of pre-processors and methodologies such as SMACSS and BEM. Knowledge of working within a monorepo environment (Turborepo, Nx), ensuring efficient code sharing and artifact management (JFrog, Nexus) across multiple micro frontend projects. Responsible for overall development and delivery of one or more modules (micro frontends) in one of the eCommerce products. Research the existing application footprint and recommend solutions to run application workloads in a future-ready architecture landscape. Develop modules within the eCommerce products, ensuring the development of high-quality front-end components and interfaces. Ensure delivery of high-quality code with more than 80% code coverage. Bring commerce platform engineering expertise and experience to significantly improve Ford's current capabilities and ensure these platforms can grow to meet increasing demands.

Should be willing to build POCs on the latest cutting-edge technologies and contribute to constructing and deploying highly scalable and robust cloud-based intelligent solutions. Contribute to Ford's Product Driven Organization (PDO) model by identifying improvements and areas that help reduce dependencies and increase team autonomy for delivery. Knowledge of MACH (Microservices, API-first, Cloud-native, Headless) architecture to design and implement scalable and flexible front-end solutions.

Posted 1 month ago

Apply

8.0 - 11.0 years

35 - 37 Lacs

Kolkata, Ahmedabad, Bengaluru

Work from Office

Dear Candidate,

We are hiring a React Developer to build responsive, high-performance web applications using modern front-end technologies.

Key Responsibilities: Develop and maintain scalable React-based web applications. Build reusable components and front-end libraries. Optimize application performance and ensure cross-browser compatibility. Collaborate with designers and backend developers to deliver seamless user experiences.

Required Skills & Qualifications: 8+ years of experience in front-end development using React. Proficiency in JavaScript (ES6+), HTML, CSS, and modern React features (hooks, context). Experience with state management libraries (Redux, Zustand, etc.). Familiarity with testing frameworks like Jest or React Testing Library.

Soft Skills: Strong troubleshooting and problem-solving skills. Ability to work independently and in a team. Excellent communication and documentation skills.

Note: If interested, please share your updated resume and preferred time for a discussion. If shortlisted, our HR team will contact you.

Kandi Srinivasa, Delivery Manager, Integra Technologies

Posted 1 month ago

Apply

2.0 - 4.0 years

4 - 6 Lacs

Chennai

Work from Office

Job Type: Full Time

Key Responsibilities: Develop reusable, typed frontend components using hooks and modern state management patterns. Ensure responsive UI/UX and cross-browser compatibility. Design RESTful or GraphQL APIs using Express and TypeScript. Model relational schemas and write optimized SQL queries and stored procedures. Optimize database performance using indexes, partitions, and EXPLAIN plans. Write unit and integration tests using Jest and React Testing Library. Participate actively in code reviews and maintain coding standards.

Required Skills: React.js with TypeScript (React 16+ with functional components and hooks); Node.js with TypeScript and Express; MySQL (schema design, normalization, indexing, query optimization, stored procedures); HTML5, CSS3/Sass, ECMAScript 6+; Git, npm/yarn, Webpack/Vite, ESLint/Prettier, Swagger/OpenAPI; Jest, React Testing Library.
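The "indexes and EXPLAIN plans" duty above can be sketched in a few lines. This is an illustrative example only, using SQLite's `EXPLAIN QUERY PLAN` from Python's standard library in place of MySQL's `EXPLAIN` (the table and index names are invented for the demo); the workflow of comparing the plan before and after adding an index is the same idea.

```python
import sqlite3

# Build a throwaway table and check whether a lookup uses an index.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, email TEXT, name TEXT)")
conn.executemany("INSERT INTO users (email, name) VALUES (?, ?)",
                 [(f"u{i}@example.com", f"user{i}") for i in range(1000)])

def query_plan(sql, params=()):
    """Return the human-readable plan detail strings for a statement."""
    rows = conn.execute("EXPLAIN QUERY PLAN " + sql, params).fetchall()
    return [row[-1] for row in rows]  # last column is the detail text

lookup = "SELECT name FROM users WHERE email = ?"
before = query_plan(lookup, ("u42@example.com",))   # full table scan
conn.execute("CREATE INDEX idx_users_email ON users (email)")
after = query_plan(lookup, ("u42@example.com",))    # index search

print(before)
print(after)
```

In MySQL the equivalent check would read the `type`/`key` columns of `EXPLAIN` output, but the before/after comparison is identical in spirit.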

Posted 1 month ago

Apply

6.0 - 9.0 years

32 - 35 Lacs

Noida, Kolkata, Chennai

Work from Office

Dear Candidate,

We are hiring a Phoenix Developer to build real-time web applications using Elixir and the Phoenix framework. Phoenix is ideal for scalable and fault-tolerant systems.

Key Responsibilities: Build backend systems using Phoenix and Elixir. Create APIs, LiveView components, and real-time features. Work with Ecto for database operations and migrations. Optimize application performance and fault tolerance. Collaborate with frontend teams and deploy on cloud platforms.

Required Skills & Qualifications: Experience with Elixir, Phoenix, and OTP principles. Familiarity with Ecto, PostgreSQL, and LiveView. Understanding of functional programming and concurrency. Bonus: knowledge of Nerves, GraphQL, or Absinthe.

Soft Skills: Strong troubleshooting and problem-solving skills. Ability to work independently and in a team. Excellent communication and documentation skills.

Note: If interested, please share your updated resume and preferred time for a discussion. If shortlisted, our HR team will contact you.

Srinivasa Reddy Kandi, Delivery Manager, Integra Technologies

Posted 1 month ago

Apply

10.0 - 20.0 years

10 - 19 Lacs

Chennai

Work from Office

Note: Candidates with less than 9 years of experience are not eligible.

Job Summary: We are seeking a talented and motivated Full Stack Developer with strong experience in PHP Laravel for backend development and React.js for frontend interfaces. You will be responsible for developing robust web applications, designing RESTful APIs, and building dynamic user interfaces with modern JavaScript technologies. Share your CV to: kishore@datanetiix.com

Key Responsibilities: Develop, test, and maintain web applications using Laravel and React.js. Create clean, scalable, and secure RESTful APIs using Laravel. Design and implement user-facing features in React (functional/class components, hooks, etc.). Work closely with UI/UX designers to turn designs into responsive interfaces. Optimize application performance for speed and scalability. Troubleshoot, debug, and upgrade existing systems. Write clean, maintainable, and well-documented code. Collaborate with cross-functional teams (designers, QA, DevOps). Manage code using version control tools like Git.

Required Skills and Qualifications: 6+ years of hands-on experience with Laravel (PHP). 2+ years of experience working with React.js. Strong written and verbal communication skills. Proficient in HTML5, CSS3, JavaScript (ES6+), and modern frontend tooling. Experience with MySQL or other relational databases. Good understanding of RESTful APIs and integration. Familiarity with package managers like Composer and npm/yarn. Experience with version control systems (Git, Bitbucket). Ability to write clean, testable, and scalable code.

Nice to Have: Experience with Redux, React Query, or other React libraries. Familiarity with Bootstrap or Material UI. Knowledge of authentication (JWT, OAuth) and Laravel Passport. Familiarity with cloud hosting platforms such as AWS.

Posted 1 month ago

Apply

2.0 - 6.0 years

6 - 10 Lacs

Nagpur

Work from Office

Primine Software Private Limited is looking for a Big Data Engineer to join our dynamic team and embark on a rewarding career journey. Develop and maintain big data solutions. Collaborate with data teams and stakeholders. Conduct data analysis and processing. Ensure compliance with big data standards and best practices. Prepare and maintain big data documentation. Stay updated with big data trends and technologies.

Posted 1 month ago

Apply

5.0 - 7.0 years

4 - 8 Lacs

Hyderabad

Work from Office

We are looking for a skilled Hadoop Administrator with 5 to 7 years of experience in Hadoop engineering, working with Python, Ansible, and DevOps methodologies. The ideal candidate will have extensive experience in CDP/HDP cluster and server builds, including control nodes, worker nodes, edge nodes, and cluster-to-cluster data copies.

Roles and Responsibilities: Design and implement scalable and efficient data processing systems using Hadoop technologies. Develop and maintain automation scripts using Python, Ansible, and other DevOps tools. Collaborate with cross-functional teams to identify and prioritize project requirements. Troubleshoot and resolve complex technical issues related to Hadoop clusters. Ensure high-quality standards for data processing and security. Participate in code reviews and contribute to the improvement of the overall codebase.

Job Requirements: Strong understanding of the Hadoop ecosystem, including HDFS, MapReduce, and YARN. Experience with the Linux operating system and scripting languages such as Bash or Python. Proficient in shell scripting and YAML configuration files. Good technical design, problem-solving, and debugging skills. Understanding of CI/CD concepts and familiarity with GitHub, Jenkins, and Ansible. Hands-on experience developing solutions using industry-leading cloud technologies. Working knowledge of GitOps and DevSecOps. Proficient in Agile and knowledgeable in other agile methodologies, ideally certified. Strong communication and networking skills. Ability to work autonomously and take accountability to execute and deliver on goals. Strong commitment to high-quality standards. Good communication skills and a sense of ownership to work as an individual contributor.
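The Python automation work described here often amounts to parsing cluster tool output and acting on it. Below is a hypothetical sketch (not from the listing): a parser for the summary section of `hdfs dfsadmin -report`, whose exact output format is an assumption here, plus a capacity-alert check of the kind an Ansible playbook or cron job might call.

```python
import re

# Assumed sample of the dfsadmin summary; real output varies by Hadoop version.
SAMPLE_REPORT = """\
Configured Capacity: 1000000 (976.56 KB)
Present Capacity: 900000 (878.91 KB)
DFS Remaining: 150000 (146.48 KB)
DFS Used: 750000 (732.42 KB)
Live datanodes (3):
"""

def parse_dfs_report(text):
    """Extract the raw byte counters from the report summary."""
    stats = {}
    for key in ("Configured Capacity", "DFS Remaining", "DFS Used"):
        m = re.search(rf"^{key}: (\d+)", text, re.MULTILINE)
        if m:
            stats[key] = int(m.group(1))
    return stats

def needs_capacity_alert(stats, threshold=0.70):
    """True when used space exceeds the threshold fraction of configured capacity."""
    return stats["DFS Used"] / stats["Configured Capacity"] > threshold

stats = parse_dfs_report(SAMPLE_REPORT)
print(stats, needs_capacity_alert(stats))
```

In practice the report text would come from `subprocess.run(["hdfs", "dfsadmin", "-report"], ...)` rather than a literal string.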

Posted 1 month ago

Apply

4.0 - 8.0 years

12 - 16 Lacs

Bengaluru

Work from Office

Reference 2500085G

Responsibilities: Python/Flask & DevOps Developer with Big Data knowledge.

Role: We are seeking a skilled Python developer with basic Big Data knowledge and expertise in REST API development using Flask. The candidate will join our team to take over the development, continuous improvement, and support of our self-service tool based on the on-premise Big Data data lake.

Responsibilities: Develop and maintain our application using Python and Flask. Implement and improve CI/CD pipelines using Jenkins, Sonar, Git, Docker, and Kubernetes. Collaborate with the team to ensure the proper functioning and optimization of the self-service tool. Support and enhance the tool to meet project needs and ensure stability under heavy load. Utilize Big Data technologies (HDFS, Spark, Oozie) as needed.

Profile required: Experience: 3-5 years. Skills: Proven experience in REST API development using Python and Flask. Basic/intermediate knowledge of Ansible and AWX. Basic knowledge of Big Data technologies (Cloudera stack: HDFS, Hive, YARN, Kafka, HBase). Experience with CI/CD tools and practices (Jenkins, Sonar, Git, Docker, Kubernetes). Strong problem-solving skills and ability to work collaboratively in a team environment. Excellent communication skills.

Why join us: We are committed to creating a diverse environment and are proud to be an equal opportunity employer. All qualified applicants receive consideration for employment without regard to race, color, religion, gender, gender identity or expression, sexual orientation, national origin, genetics, disability, age, or veteran status.

Business insight: At Société Générale, we are convinced that people are drivers of change, and that the world of tomorrow will be shaped by all their initiatives, from the smallest to the most ambitious. Whether you're joining us for a period of months, years, or your entire career, together we can have a positive impact on the future. Creating, daring, innovating, and taking action are part of our DNA. If you too want to be directly involved, grow in a stimulating and caring environment, feel useful on a daily basis, and develop or strengthen your expertise, you will feel right at home with us!

Still hesitating? You should know that our employees can dedicate several days per year to solidarity actions during their working hours, including sponsoring people struggling with their orientation or professional integration, participating in the financial education of young apprentices, and sharing their skills with charities. There are many ways to get involved.

We are committed to supporting the acceleration of our Group's ESG strategy by implementing ESG principles in all our activities and policies. They are translated into our business activity (ESG assessment, reporting, project management or IT activities), our work environment, and our responsible practices for environment protection.

Diversity and Inclusion: We are an equal opportunities employer and we are proud to make diversity a strength for our company. Société Générale is committed to recognizing and promoting all talents, regardless of their beliefs, age, disability, parental status, ethnic origin, nationality, gender identity, sexual orientation, membership of a political, religious, trade union or minority organisation, or any other characteristic that could be subject to discrimination.

Posted 1 month ago

Apply

7.0 - 12.0 years

6 - 7 Lacs

Banswara

Work from Office

Focusing on polyester and viscose dyed yarns, responsible for effective production planning, ensuring smooth workflow, and maintaining top-quality standards in our yarn operations. Proficiency in production management software (SAP) and MS Office required.

Posted 1 month ago

Apply

10.0 - 12.0 years

8 - 12 Lacs

Hyderabad

Work from Office

We are seeking a highly skilled Senior UI Developer with extensive experience in React to lead the design and implementation of advanced user interfaces for our web applications. The ideal candidate will have a strong background in JavaScript, modern frontend frameworks, and a passion for creating seamless, high-performance user experiences.

Role Responsibilities: Design and implement user interface components for web applications using React. Collaborate with product managers, designers, and backend developers to deliver cohesive, user-friendly solutions. Optimize applications for maximum speed, scalability, and responsiveness. Maintain and improve code quality, organization, and automation. Conduct code reviews and mentor junior developers. Diagnose and fix bugs in existing React applications. Integrate frontend components with backend services and APIs. Stay up-to-date with the latest industry trends and technologies. Establish and enforce UI standards and best practices. Ensure the technical feasibility of UI/UX designs.

Skills and requirements: 10-12 years of professional experience in frontend development. Expertise in React.js, including core principles, state management (Redux), routing, and component-based architecture. Strong proficiency in JavaScript (ES6+), HTML5, and CSS3. Experience with modern JavaScript build tools such as Webpack, Babel, and npm/yarn. Familiarity with RESTful APIs and integration with backend services. Experience with automated testing frameworks (Jest, Mocha, etc.) and writing unit/integration tests. Proficiency with version control systems (Git). Understanding of server-side rendering and its benefits (Next.js). Strong problem-solving skills and a keen eye for detail. Excellent communication and collaboration skills. Ability to work independently and as part of a cross-functional team.

Preferred Qualifications: Experience with TypeScript. Knowledge of additional frontend frameworks is a plus. Experience in mentoring and leading frontend teams. Familiarity with UI/UX design principles and tools. Experience in optimizing applications for accessibility and internationalization.

Posted 1 month ago

Apply

3.0 - 6.0 years

16 - 18 Lacs

Gurugram

Work from Office

Most companies try to meet expectations; dunnhumby exists to defy them. Using big data, deep expertise and AI-driven platforms to decode the 21st-century human experience, then redefine it in meaningful and surprising ways that put customers first. Across digital, mobile and retail. For brands like Tesco, Coca-Cola, Procter & Gamble and PepsiCo.

We're looking for a Big Data Engineer to join our strategic Loyalty and Personalisation team, which builds products the retailer can use to find the optimal customer segments and send personalised offers and digital recommendations to the consumer. These products are strategic assets that help retailers improve consumer loyalty, which makes them very important to retailers and therefore to dunnhumby.

What we expect from you: 3 to 6 years of experience in software development using Python. Hands-on experience with Python OOP, design patterns, dependency injection, data libraries (pandas), and data structures. Exposure to Spark: PySpark, the architecture of Spark, and best practices to optimize jobs. Experience with the Hadoop ecosystem: HDFS, Hive, or YARN. Experience with orchestration tools: Airflow, Argo Workflows, Kubernetes. Experience with cloud-native services (GCP/Azure/AWS), preferably GCP. Database knowledge: SQL, NoSQL. Hands-on exposure to CI/CD pipelines for data engineering workflows. Testing: pytest for unit testing, pytest-spark to create a test SparkSession, Spark UI for performance tuning and monitoring. Good to have: Scala.

What you can expect from us: We won't just meet your expectations. We'll defy them. So you'll enjoy the comprehensive rewards package you'd expect from a leading technology company. But also, a degree of personal flexibility you might not expect. Plus, thoughtful perks, like flexible working hours and your birthday off. You'll also benefit from an investment in cutting-edge technology that reflects our global ambition. But with a nimble, small-business feel that gives you the freedom to play, experiment and learn. And we don't just talk about diversity and inclusion. We live it every day, with thriving networks including dh Gender Equality Network, dh Proud, dh Family, dh One, dh Enabled and dh Thrive as the living proof. Everyone's invited. We want everyone to have the opportunity to shine and perform at their best throughout our recruitment process. Please let us know how we can make this process work best for you.

Our approach to Flexible Working: At dunnhumby, we value and respect difference and are committed to building an inclusive culture by creating an environment where you can balance a successful career with your commitments and interests outside of work. We believe that you will do your best at work if you have a work/life balance. Some roles lend themselves to flexible options more than others, so if this is important to you please raise this with your recruiter, as we are open to discussing agile working opportunities during the hiring process. For further information about how we collect and use your personal information please see our Privacy Notice which can be found (here).
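The pytest requirement in this listing targets exactly the kind of pure, framework-free transformation sketched below. This is an illustrative example, not dunnhumby's actual code: rank customer segments by purchase count so the top ones can receive personalised offers, with a plain assert-based test of the style pytest collects.

```python
from collections import Counter

def top_segments(purchases, n=2):
    """purchases: iterable of (customer_segment, amount) pairs.

    Return the n segments with the most purchases, busiest first.
    """
    counts = Counter(segment for segment, _amount in purchases)
    return [segment for segment, _ in counts.most_common(n)]

# pytest discovers functions named test_* and runs their bare asserts.
def test_top_segments():
    purchases = [("families", 10.0), ("students", 5.0),
                 ("families", 7.5), ("seniors", 3.0), ("families", 1.0)]
    assert top_segments(purchases, n=1) == ["families"]
    assert top_segments(purchases) == ["families", "students"]

test_top_segments()
```

Keeping the business rule in a pure function like this is what makes it testable without a SparkSession; the same logic can then be applied per-partition or rewritten as a groupBy/count in PySpark.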

Posted 1 month ago

Apply

4.0 - 9.0 years

9 - 13 Lacs

Kolkata, Mumbai, New Delhi

Work from Office

Krazy Mantra Group of Companies is looking for a Big Data Engineer to join our dynamic team and embark on a rewarding career journey. Designing and implementing scalable data storage solutions, such as Hadoop and NoSQL databases. Developing and maintaining big data processing pipelines using tools such as Apache Spark and Apache Storm. Writing and testing data processing scripts using languages such as Python and Scala. Integrating big data solutions with other IT systems and data sources. Collaborating with data scientists and business stakeholders to understand data requirements and identify opportunities for data-driven decision making. Ensuring the security and privacy of sensitive data. Monitoring performance and optimizing big data systems to ensure they meet performance and availability requirements. Staying up-to-date with emerging technologies and trends in big data and data engineering. Mentoring junior team members and providing technical guidance as needed. Documenting and communicating technical designs, solutions, and best practices. Strong problem-solving and debugging skills. Excellent written and verbal communication skills.

Posted 1 month ago

Apply

3.0 - 6.0 years

5 - 9 Lacs

Chennai

Work from Office

We are looking for a skilled Hadoop Developer with 3 to 6 years of experience to join our team at IDESLABS PRIVATE LIMITED. The ideal candidate will have expertise in developing and implementing big data solutions using Hadoop technologies.

Roles and Responsibilities: Design, develop, and deploy scalable big data applications using Hadoop. Collaborate with cross-functional teams to identify business requirements and develop solutions. Develop and maintain large-scale data processing systems using Hadoop MapReduce. Troubleshoot and optimize performance issues in existing Hadoop applications. Participate in code reviews to ensure high-quality code standards. Stay updated with the latest trends and technologies in big data development.

Job Requirements: Strong understanding of the Hadoop ecosystem, including HDFS, YARN, and Oozie. Experience with programming languages such as Java or Python. Knowledge of database management systems such as MySQL or NoSQL. Familiarity with agile development methodologies and version control systems like Git. Excellent problem-solving skills and attention to detail. Ability to work collaboratively in a team environment and communicate effectively with stakeholders.
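The MapReduce programming model mentioned above can be shown in miniature without a cluster. The sketch below expresses the classic word-count job in plain Python; on a real Hadoop cluster the same mapper/reducer pair would run via a framework such as Hadoop Streaming (an assumption here, not something the listing specifies), with the shuffle done by the framework.

```python
from collections import defaultdict
from itertools import chain

def map_phase(record):
    """Mapper: emit (word, 1) for every word in an input line."""
    return [(word.lower(), 1) for word in record.split()]

def shuffle(pairs):
    """Group intermediate pairs by key, as the framework does between phases."""
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(key, values):
    """Reducer: sum the counts for one word."""
    return key, sum(values)

lines = ["Hadoop YARN schedules jobs", "YARN manages cluster resources"]
intermediate = chain.from_iterable(map_phase(line) for line in lines)
counts = dict(reduce_phase(k, vs) for k, vs in shuffle(intermediate).items())
print(counts["yarn"])
```

The key property, and the reason the model scales, is that mappers and reducers are independent per record and per key, so the framework can run them on many nodes in parallel.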

Posted 1 month ago

Apply

3.0 - 8.0 years

11 - 16 Lacs

Bengaluru

Work from Office

As a Data Engineer, you are required to design, build, and maintain data pipelines that efficiently process and transport data from various sources to storage systems or processing environments while ensuring data integrity, consistency, and accuracy across the entire data pipeline. Integrate data from different systems, often involving data cleaning, transformation (ETL), and validation. Design the structure of databases and data storage systems, including the design of schemas, tables, and relationships between datasets to enable efficient querying. Work closely with data scientists, analysts, and other stakeholders to understand their data needs and ensure that the data is structured in a way that makes it accessible and usable. Stay up-to-date with the latest trends and technologies in the data engineering space, such as new data storage solutions, processing frameworks, and cloud technologies. Evaluate and implement new tools to improve data engineering processes.

Qualification: Bachelor's or Master's in Computer Science & Engineering, or equivalent. A professional degree in Data Science or Engineering is desirable.

Experience level: At least 3-5 years of hands-on experience in data engineering.

Desired Knowledge & Experience: Spark: Spark 3.x, RDD/DataFrames/SQL, batch/Structured Streaming; knowing Spark internals (Catalyst/Tungsten/Photon). Databricks: Workflows, SQL Warehouses/Endpoints, DLT, Pipelines, Unity, Autoloader. IDE: IntelliJ/PyCharm, Git, Azure DevOps, GitHub Copilot. Test: pytest, Great Expectations. CI/CD: YAML Azure Pipelines, continuous delivery, acceptance testing. Big data design: Lakehouse/Medallion architecture, Parquet/Delta, partitioning, distribution, data skew, compaction. Languages: Python/functional programming (FP); SQL: T-SQL/Spark SQL/HiveQL. Storage: data lake and big data storage design.

Additionally, it is helpful to know the basics of: Data pipelines: ADF/Synapse Pipelines/Oozie/Airflow. Languages: Scala, Java. NoSQL: Cosmos, Mongo, Cassandra. Cubes: SSAS (ROLAP, HOLAP, MOLAP), AAS, Tabular Model. SQL Server: T-SQL, stored procedures. Hadoop: HDInsight/MapReduce/HDFS/YARN/Oozie/Hive/HBase/Ambari/Ranger/Atlas/Kafka. Data catalog: Azure Purview, Apache Atlas, Informatica.

Required Soft Skills & Other Capabilities: Great attention to detail and good analytical abilities. Good planning and organizational skills. A collaborative approach to sharing ideas and finding solutions. Ability to work independently and also in a global team environment.
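The Medallion architecture named in this listing layers raw "bronze" data into cleaned "silver" tables. Below is a toy illustration of one bronze-to-silver step using plain Python dicts in place of Delta tables; the schema and cleaning rules are invented for the example, and in a real Lakehouse this would be a Spark/Databricks job writing Delta.

```python
bronze = [  # raw ingested records: duplicates and bad rows included
    {"order_id": "1", "amount": "10.5", "country": "DE"},
    {"order_id": "1", "amount": "10.5", "country": "DE"},  # duplicate
    {"order_id": "2", "amount": "oops", "country": "FR"},  # unparseable amount
    {"order_id": "3", "amount": "7.0", "country": "fr"},
]

def to_silver(records):
    """Deduplicate on order_id, cast types, normalise country codes."""
    seen, silver = set(), []
    for rec in records:
        if rec["order_id"] in seen:
            continue  # keep first occurrence only
        try:
            amount = float(rec["amount"])
        except ValueError:
            continue  # a real pipeline would quarantine these rows instead
        seen.add(rec["order_id"])
        silver.append({"order_id": rec["order_id"],
                       "amount": amount,
                       "country": rec["country"].upper()})
    return silver

silver = to_silver(bronze)
print(silver)
```

The point of the layering is that bronze stays an immutable audit trail while silver enforces schema and quality rules, so downstream "gold" aggregates never re-implement the cleaning.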

Posted 1 month ago

Apply

4.0 - 5.0 years

10 - 15 Lacs

Pune

Work from Office

Hello Visionary! We empower our people to stay resilient and relevant in a constantly changing world. We're looking for people who are always searching for creative ways to grow and learn, people who want to make a real impact, now and in the future. Does that sound like you? Then it seems like you'd make an outstanding addition to our vibrant team.

Siemens Mobility is an independently run company of Siemens AG. Its core business includes rail vehicles, rail automation and electrification solutions, turnkey systems, intelligent road traffic technology and related services. In Mobility, we help our customers meet the need for hard-working mobility solutions. We're making the lives of people who travel easier and more enjoyable while constantly developing new, intelligent mobility solutions!

We are looking for: Embedded Linux Engineer - Train IT

You'll make a difference by: You will be part of the engineering team for new and exciting software applications in our trains. Your mission will be to customize the Linux image of our Train IT platform for a specific train and integrate applications such as the train server, train-to-ground communication, passenger information, passenger counting, or CCTV. This role requires a wide range of technical skills and a desire to find out how things work and why. Be a member of the international engineering team. Configure and customize the Debian Linux image for deployment to the train. Customize applications and configure devices such as network switches and special devices according to the system architecture of the train. Integrate these applications and devices with other systems in the train. Cooperate with the software test team. Provide technical support in your area of expertise.

Desired Skills: Minimum 4-5 years of experience in software development. Experience with Linux as a power user or administrator. Experience with configuration of managed switches. Good knowledge of TCP/IP. Understanding of network protocols like DHCP, RADIUS, DNS, multicast, SSL/TLS. Experience with issue tracking tools such as JIRA or Redmine. Highly organized and self-motivated. A hands-on, problem-solving mentality. Experience in the railway industry. Long-term interest in the IT domain and a passion for IT. German language. Python programming. Fluent English.

Join us and be yourself! Make your mark in our exciting world at Siemens. This role is based in Pune. You might be required to visit other locations within India and outside. In return, you'll get the chance to work with teams impacting the shape of things to come. Find out more about mobility at https://new.siemens.com/global/en/products/mobility.html and about Siemens careers at

Posted 1 month ago

Apply

4.0 - 9.0 years

8 - 13 Lacs

Hyderabad

Work from Office

We are looking for a PySpark solutions developer and data engineer who can design and build solutions for one of our Fortune 500 Client programs, which aims towards building a data standardized and curation needs on Hadoop cluster. This is high visibility, fast-paced key initiative will integrate data across internal and external sources, provide analytical insights, and integrate with the customers critical systems. Key Responsibilities Ability to design, build and unit test applications on Spark framework on Python. Build PySpark based applications for both batch and streaming requirements, which will require in-depth knowledge on majority of Hadoop and NoSQL databases as well. Develop and execute data pipeline testing processes and validate business rules and policies. Optimize performance of the built Spark applications in Hadoop using configurations around Spark Context, Spark-SQL, Data Frame, and Pair RDD's. Optimize performance for data access requirements by choosing the appropriate native Hadoop file formats (Avro, Parquet, ORC etc) and compression codec respectively. Build integrated solutions leveraging Unix shell scripting, RDBMS, Hive, HDFS File System, HDFS File Types, HDFS compression codec. Build data tokenization libraries and integrate with Hive & Spark for column-level obfuscation. Experience in processing large amounts of structured and unstructured data, including integrating data from multiple sources. Create and maintain integration and regression testing framework on Jenkins integrated with Bit Bucket and/or GIT repositories. Participate in the agile development process, and document and communicate issues and bugs relative to data standards in scrum meetings. Work collaboratively with onsite and offshore team. Develop & review technical documentation for artifacts delivered. Ability to solve complex data-driven scenarios and triage towards defects and production issues. 
Learn, unlearn, and relearn concepts with an open and analytical mindset. Participate in code releases and production deployments. Challenge and inspire team members to achieve business results in a fast-paced and quickly changing environment.

Preferred Qualifications: BE/B.Tech/B.Sc. in Computer Science or Statistics from an accredited college or university. Minimum 3 years of extensive experience in the design, build, and deployment of PySpark-based applications. Expertise in handling complex, large-scale Big Data environments (preferably 20 TB+). Minimum 3 years of experience with the following: Hive, YARN, HDFS. Hands-on experience writing complex SQL queries and exporting and importing large amounts of data using utilities. Ability to build abstracted, modularized, reusable code components. Prior experience with ETL tools, preferably Informatica PowerCenter, is advantageous. Able to quickly adapt and learn. Able to jump into an ambiguous situation and take the lead on resolution. Able to communicate and coordinate across various teams. Comfortable tackling new challenges and new ways of working. Ready to move from traditional methods and adapt to agile ones. Comfortable challenging your peers and leadership team. Can prove yourself quickly and decisively. Excellent communication skills and strong customer centricity. Strong target and solution orientation.
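The column-level obfuscation responsibility above can be sketched as a small, deterministic tokenizer in plain Python; the function name and key handling here are hypothetical, and in a real pipeline the function would be wrapped in a Spark UDF and applied to sensitive Hive columns.

```python
import hmac
import hashlib

# Hypothetical key for illustration; in practice, load from a secret manager.
SECRET_KEY = b"replace-with-vaulted-key"

def tokenize(value: str, key: bytes = SECRET_KEY) -> str:
    """Deterministically obfuscate a column value with keyed HMAC-SHA256.

    The same input always yields the same token, so joins and group-bys on
    the obfuscated column still work, but the original value cannot be
    recovered without the key.
    """
    return hmac.new(key, value.encode("utf-8"), hashlib.sha256).hexdigest()

# With a live SparkSession, a UDF wrapper might look like:
#   from pyspark.sql.functions import udf
#   tokenize_udf = udf(tokenize)
#   df = df.withColumn("account_no", tokenize_udf("account_no"))
```

Deterministic keyed hashing is one common choice for this; format-preserving encryption is another when downstream systems require the original column shape.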

Posted 1 month ago

Apply

8.0 - 13.0 years

10 - 14 Lacs

Bengaluru

Work from Office

Educational Qualification: Bachelor of Engineering. Service Line: Strategic Technology Group.

Responsibilities: Power Programmer is an important initiative within Global Delivery to develop a team of Full Stack Developers who will work on complex engineering projects, platforms, and marketplaces for our clients using emerging technologies. They will be ahead of the technology curve and will be constantly enabled and trained to be polyglots. They are go-getters with a drive to solve end-customer challenges, and will spend most of their time designing and coding. End-to-end contribution to technology-oriented development projects. Providing solutions with minimum system requirements, in Agile mode. Collaborate with other Power Programmers, the open source community, and tech user groups. Custom development of new platforms and solutions. Work on large-scale digital platforms and marketplaces. Work on complex engineering projects using cloud-native architecture. Work with innovative Fortune 500 companies on cutting-edge technologies. Co-create and develop new products and platforms for our clients. Contribute to open source and continuously upskill in the latest technology areas. Incubate tech user groups.

Technical and Professional Requirements: Big Data (Spark, Scala, Hive, Kafka). Preferred Skills: Technology-Big Data-HBase; Technology-Big Data-Sqoop; Technology-Functional Programming-Scala; Technology-Big Data-Data Processing-Spark-SparkSQL

Posted 1 month ago

Apply

8.0 - 13.0 years

6 - 10 Lacs

Hyderabad

Work from Office

Experience in SQL and understanding of ETL best practices. Good hands-on experience in ETL/Big Data development. Extensive hands-on experience in Scala. Experience with Spark on YARN and troubleshooting Spark, Linux, and Python. Setting up a Hadoop cluster; backup, recovery, and maintenance.
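For the Spark-on-YARN experience described above, a typical job submission might look like the sketch below; the entry class, assembly jar, and sizing values are placeholders that would be tuned per cluster.

```shell
# Hypothetical entry class and assembly jar; executor sizing is tuned per cluster.
spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --class com.example.etl.MainJob \
  --num-executors 10 \
  --executor-cores 4 \
  --executor-memory 8g \
  --conf spark.sql.shuffle.partitions=200 \
  etl-job-assembly.jar
```

Running in `cluster` deploy mode puts the driver on the cluster, which is the usual choice for scheduled ETL; `client` mode is handier for interactive troubleshooting.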

Posted 1 month ago

Apply

6.0 - 10.0 years

5 - 10 Lacs

Itanagar

Work from Office

Employment Type: Contract.

Role Overview: Seeking a highly skilled Senior Frontend Developer with strong expertise in ReactJS and Micro Frontend architecture. This role is ideal for developers who are passionate about building scalable, modular, and high-performing web applications with modern front-end technologies.

Key Responsibilities: Design, develop, and maintain user-focused web applications using React.js. Implement and maintain Micro Frontend architecture for modular and scalable applications. Create reusable UI components and libraries for consistent and efficient development. Optimize web applications for performance, scalability, and responsiveness. Collaborate with UI/UX designers to convert designs into pixel-perfect interfaces. Ensure cross-browser compatibility and seamless performance on different devices. Integrate RESTful APIs and backend services in coordination with backend developers. Perform code reviews, debug complex issues, and contribute to performance tuning. Keep up to date with the latest industry trends, tools, and technologies in front-end development.

Required Qualifications & Skills: 5+ years of frontend development experience with at least 3 years in ReactJS. Proven experience with Micro Frontend implementation and architecture. Strong knowledge of JavaScript (ES6+), HTML5, CSS3, and modern styling techniques (e.g., SASS, Styled Components). Experience with state management tools like Redux, Zustand, or Context API. Familiarity with build tools (Webpack, Babel) and package managers (npm, Yarn). Understanding of responsive web design principles and techniques. Solid experience with version control systems (e.g., Git). Familiarity with testing libraries (Jest, React Testing Library) and CI/CD pipelines. Strong debugging and performance optimization skills. Excellent communication and team collaboration abilities.

Nice to Have: Experience with TypeScript. Exposure to containerization and deployment tools (e.g., Docker, Kubernetes). Experience with cloud services (AWS, Azure, GCP). Familiarity with Agile methodologies and tools like Jira or Trello.

Posted 1 month ago

Apply

5.0 - 10.0 years

7 - 17 Lacs

Hyderabad

Work from Office

Immediate openings for Big Data Engineer/Developer, Pan India, Contract. Experience: 5+ years. Skills: Spark-Scala; HQL, Hive; Control-M; Jenkins; Git; technical analysis and, to some extent, business analysis (knowledge of banking products, credit cards, and their transactions). Location: Pan India. Notice Period: Immediate. Employment Type: Contract. Working Mode: Hybrid.

Posted 1 month ago

Apply

4.0 - 6.0 years

6 - 8 Lacs

Chennai

Work from Office

Design and implement Big Data solutions using the Hadoop and MapR ecosystems. You will work with data processing frameworks like Hive, Pig, and MapReduce to manage and analyze large data sets. Expertise in Hadoop and MapR is required.
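The MapReduce model named above can be illustrated with a toy in-memory word count; a real Hadoop job distributes the same map, shuffle, and reduce phases across the cluster, but the per-record logic is the same.

```python
from collections import defaultdict

def map_phase(lines):
    """Emit (word, 1) pairs, like a Hadoop mapper."""
    for line in lines:
        for word in line.lower().split():
            yield word, 1

def shuffle(pairs):
    """Group values by key, like the shuffle/sort between map and reduce."""
    grouped = defaultdict(list)
    for key, value in pairs:
        grouped[key].append(value)
    return grouped

def reduce_phase(grouped):
    """Sum the counts for each word, like a Hadoop reducer."""
    return {word: sum(counts) for word, counts in grouped.items()}

counts = reduce_phase(shuffle(map_phase(["big data big cluster", "data lake"])))
# counts == {"big": 2, "data": 2, "cluster": 1, "lake": 1}
```

Hive and Pig ultimately compile queries and scripts down to jobs with this same shape, which is why understanding the shuffle step matters for performance tuning.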

Posted 1 month ago

Apply

4.0 - 6.0 years

6 - 8 Lacs

Mumbai

Work from Office

Develops data processing solutions using Scala and PySpark.

Posted 1 month ago

Apply

4.0 - 5.0 years

6 - 7 Lacs

Hyderabad

Work from Office

Specializes in Public Key Infrastructure (PKI) implementation and certificate management. Responsibilities include configuring digital certificates, managing encryption protocols, and ensuring secure communication channels. Expertise in SSL/TLS, HSMs, and identity management is required.
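As a small illustration of the SSL/TLS side of this role, Python's standard `ssl` module can enforce a modern client configuration; this is a generic sketch, not specific to any particular PKI deployment or certificate authority.

```python
import ssl

def strict_client_context() -> ssl.SSLContext:
    """Build a client-side TLS context with certificate validation enforced."""
    context = ssl.create_default_context()            # loads the system CA bundle
    context.minimum_version = ssl.TLSVersion.TLSv1_2  # refuse legacy protocol versions
    context.check_hostname = True                     # verify the server's identity
    context.verify_mode = ssl.CERT_REQUIRED           # reject unverifiable certificates
    return context

ctx = strict_client_context()
```

A context like this would then be passed to an HTTPS or socket client; the same validation requirements apply whether the chain terminates at a public CA or an internal PKI root.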

Posted 1 month ago

Apply

6.0 - 11.0 years

10 - 14 Lacs

Hyderabad, Pune, Chennai

Work from Office

Job type: Contract to hire. 10+ years of software development experience building large-scale distributed data processing systems/applications, data engineering, or large-scale internet systems. At least 4 years of experience developing/leading Big Data solutions at enterprise scale, with at least one end-to-end implementation. Strong experience in programming languages: Java/J2EE/Scala. Good experience in Spark/Hadoop/HDFS architecture, YARN, Confluent Kafka, HBase, Hive, Impala, and NoSQL databases. Experience with batch processing and AutoSys job scheduling and monitoring. Performance analysis, troubleshooting, and resolution (this includes familiarity with and investigation of Cloudera/Hadoop logs). Work with Cloudera on open issues that would result in cluster configuration changes, and implement them as needed. Strong experience with databases such as SQL, Hive, Elasticsearch, HBase, etc. Knowledge of Hadoop security, data management, and governance. Primary Skills: Java/Scala, ETL, Spark, Hadoop, Hive, Impala, Sqoop, HBase, Confluent Kafka, Oracle, Linux, Git, Jenkins CI/CD.
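Log-driven troubleshooting of the kind described above often starts by scanning aggregated YARN/Spark logs for known failure signatures; a minimal sketch follows, where the signature list is illustrative rather than exhaustive and the sample log lines are invented for demonstration.

```python
import re

# Illustrative failure signatures; a real triage list grows with operational experience.
SIGNATURES = {
    "executor OOM": re.compile(r"java\.lang\.OutOfMemoryError"),
    "container killed": re.compile(r"Container killed by YARN for exceeding memory limits"),
    "shuffle fetch failure": re.compile(r"FetchFailedException"),
}

def triage(log_lines):
    """Return the sorted list of known failure signatures found in the log lines."""
    hits = set()
    for line in log_lines:
        for label, pattern in SIGNATURES.items():
            if pattern.search(line):
                hits.add(label)
    return sorted(hits)

sample = [
    "24/01/15 10:02:11 ERROR Executor: java.lang.OutOfMemoryError: Java heap space",
    "24/01/15 10:02:12 WARN TaskSetManager: Lost task 3.0 in stage 7.0",
]
# triage(sample) -> ["executor OOM"]
```

In practice the input would come from `yarn logs -applicationId <id>` or a log aggregation store, and each matched signature maps to a known remediation (resizing executors, adjusting memory overhead, and so on).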

Posted 1 month ago

Apply

6.0 - 8.0 years

25 - 30 Lacs

Bengaluru

Work from Office

6+ years of experience in information technology, with a minimum of 3-5 years of experience managing and administering Hadoop/Cloudera environments. Cloudera CDP (Cloudera Data Platform), Cloudera Manager, and related tools. Hadoop ecosystem components (HDFS, YARN, Hive, HBase, Spark, Impala, etc.). Linux system administration, with experience in scripting languages (Python, Bash, etc.) and configuration management tools (Ansible, Puppet, etc.). Security tools (Kerberos, Ranger, Sentry), Docker, Kubernetes, Jenkins. Cloudera Certified Administrator for Apache Hadoop (CCAH) or similar certification. Cluster management, optimization, best-practice implementation, collaboration, and support.

Posted 1 month ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot


Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies