Jobs
Interviews

552 HBase Jobs - Page 22

Set up a job alert
JobPe aggregates listings for easy access, but you apply directly on the employer's job portal.

1 - 4 years

5 - 9 Lacs

Pune

Work from Office

About PhonePe Group: PhonePe is India's leading digital payments company with 50 crore (500 million) registered users and 3.7 crore (37 million) merchants, covering over 99% of the postal codes across India. On the back of its leadership in digital payments, PhonePe has expanded into financial services (insurance, mutual funds, stock broking, and lending) as well as adjacent tech-enabled businesses such as Pincode for hyperlocal shopping and Indus App Store, India's first localized app store. The PhonePe Group is a portfolio of businesses aligned with the company's vision to offer every Indian an equal opportunity to accelerate their progress by unlocking the flow of money and access to services.

Culture: At PhonePe, we take extra care to make sure you give your best at work, every day! Creating the right environment for you is just one of the things we do. We empower people and trust them to do the right thing. Here, you own your work from start to finish, right from day one. Being enthusiastic about tech is a big part of being at PhonePe. If you like building technology that impacts millions, ideating with some of the best minds in the country, and executing on your dreams with purpose and speed, join us!

Challenges: Building for scale, rapid iterative development, and customer-centric product thinking at each step define every day for a developer at PhonePe. Though we engineer for a 50 million+ strong user base, we code with every individual user in mind. While we are quick to adopt the latest in engineering, we care utmost about security, stability, and automation. Apply if you want to experience the best combination of passionate application development and product-driven thinking.

Role & Responsibilities:
- Build robust and scalable web-based applications. You will need to think of platforms and reuse.
- Build abstractions and contracts with separation of concerns for a larger scope.
- Drive problem-solving for high-level business and technical problems.
- Do high-level design with guidance; functional modelling and break-down of a module.
- Make incremental changes to architecture and carry out impact analysis of the same.
- Do performance tuning and improvements in large-scale distributed systems.
- Mentor young minds and foster team spirit; break down execution into phases to bring predictability to overall execution.
- Work closely with the Product Manager to derive capability views from features/solutions; lead execution of medium-sized projects.
- Work with broader stakeholders to track the impact of projects/features and proactively iterate to improve them.

Requirements:
- 5 years of experience in the art of writing code and solving problems on a large scale (FinTech experience preferred).
- B.Tech, M.Tech, or Ph.D. in Computer Science or a related technical discipline (or equivalent).
- Excellent coding skills; should be able to convert a design into code fluently.
- Experience in at least one general-purpose programming language (e.g. Java, C, C++) and tech stack to write maintainable, scalable, unit-tested code.
- Experience with multi-threading and concurrent programming, object-oriented design skills, knowledge of design patterns, a huge passion for and ability to design intuitive modules and class-level interfaces, and knowledge of test-driven development.
- Good understanding of databases (e.g. MySQL) and NoSQL stores (e.g. HBase, Elasticsearch, Aerospike, etc.).
- Experience in full life-cycle development in any programming language on a Linux platform, building highly scalable business applications that implement large, complex business flows and deal with a huge amount of data.
- Strong desire to solve complex and interesting real-world problems.
- Go-getter attitude that reflects in the energy and intent behind assigned tasks.
- An open communicator who shares thoughts and opinions frequently, listens intently, and takes constructive feedback.
- Ability to drive the design and architecture of multiple subsystems.
- Ability to break down larger/fuzzier problems into smaller ones within the scope of the product.
- Understanding of the industry's coding standards and an ability to create appropriate technical documentation.

PhonePe Full-Time Employee Benefits (not applicable for intern or contract roles):
- Insurance Benefits - Medical Insurance, Critical Illness Insurance, Accidental Insurance, Life Insurance
- Wellness Program - Employee Assistance Program, Onsite Medical Center, Emergency Support System
- Parental Support - Maternity Benefit, Paternity Benefit Program, Adoption Assistance Program, Day-care Support Program
- Mobility Benefits - Relocation Benefits, Transfer Support Policy, Travel Policy
- Retirement Benefits - Employee PF Contribution, Flexible PF Contribution, Gratuity, NPS, Leave Encashment
- Other Benefits - Higher Education Assistance, Car Lease, Salary Advance Policy

Working at PhonePe is a rewarding experience! Great people, a work environment that thrives on creativity, and the opportunity to take on roles beyond a defined job description are just some of the reasons you should work with us. Read more about PhonePe.
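The multi-threading and concurrency experience asked for above centres on protecting shared state from concurrent read-modify-write updates. A minimal sketch in Python (the role lists Java, C, and C++, but the same idea applies; class and variable names here are illustrative, not from the posting):

```python
import threading

class SafeCounter:
    # A lock serializes the read-modify-write of `value`,
    # so concurrent increments are never lost.
    def __init__(self):
        self.value = 0
        self._lock = threading.Lock()

    def increment(self, times: int) -> None:
        for _ in range(times):
            with self._lock:
                self.value += 1

counter = SafeCounter()
threads = [threading.Thread(target=counter.increment, args=(10_000,))
           for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(counter.value)  # 40000: no increments lost under contention
```

Without the lock, the `self.value += 1` step can interleave between threads and silently drop updates, which is exactly the class of bug the concurrency requirement is about.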

Posted 2 months ago

Apply

4 - 7 years

5 - 9 Lacs

Bengaluru

Work from Office

About PhonePe Group: PhonePe is India's leading digital payments company with 50 crore (500 million) registered users and 3.7 crore (37 million) merchants, covering over 99% of the postal codes across India. The full company overview, culture, and challenges are described in the PhonePe listing above.

As a Software Engineer, you will:
- Build robust and scalable web-based applications; think in terms of platforms and reuse.
- Build abstractions and contracts with separation of concerns for a larger scope.
- Drive problem-solving for high-level business and technical problems.
- Do high-level design with guidance; functional modeling and break-down of a module.
- Make incremental changes to architecture and carry out impact analysis of the same.
- Do performance tuning and improvements in large-scale distributed systems.
- Mentor young minds and foster team spirit; break down execution into phases to bring predictability to overall execution.
- Work closely with the Product Manager to derive capability views from features/solutions; lead execution of medium-sized projects.
- Work with broader stakeholders to track the impact of projects/features and proactively iterate to improve them.

As a senior software engineer, you must have:
- Extensive, expert programming experience in at least one general-purpose programming language (e.g. Java, C, C++) and tech stack to write maintainable, scalable, unit-tested code.
- Experience with multi-threading and concurrent programming.
- Extensive experience in object-oriented design, knowledge of design patterns, and a huge passion for and ability to design intuitive module- and class-level interfaces.
- Excellent coding skills; should be able to convert a design into code fluently.
- Knowledge of test-driven development.
- Good understanding of databases (e.g. MySQL) and NoSQL stores (e.g. HBase, Elasticsearch, Aerospike, etc.).
- Strong desire to solve complex and interesting real-world problems.
- Experience with full life-cycle development in any programming language on a Linux platform.
- Go-getter attitude that reflects in the energy and intent behind assigned tasks.
- Experience working in a startup environment with high levels of ownership and commitment.
- B.Tech, M.Tech, or Ph.D. in Computer Science or a related technical discipline (or equivalent).
- Experience in building highly scalable business applications that implement large, complex business flows and deal with a huge amount of data.
- 4-7 years of experience in the art of writing code and solving problems on a large scale.
- An open communicator who shares thoughts and opinions frequently, listens intently, and takes constructive feedback.

As a Software Engineer, good to have:
- The ability to drive the design and architecture of multiple subsystems.
- Ability to break down larger/fuzzier problems into smaller ones within the scope of the product.
- Understanding of the industry's coding standards and an ability to create appropriate technical documentation.

PhonePe full-time employee benefits (not applicable for intern or contract roles) are as listed in the PhonePe posting above. Working at PhonePe is a rewarding experience! Great people, a work environment that thrives on creativity, and the opportunity to take on roles beyond a defined job description are just some of the reasons you should work with us. Read more about PhonePe.

Posted 2 months ago

Apply

7 - 11 years

30 - 37 Lacs

Pune

Remote

Build scalable Python, Django, and Golang apps. Develop reusable APIs and components. Integrate REST/SOAP/streaming services. Optimize performance and reliability. Review code, mentor juniors, and work in Agile teams.

Required candidate profile: 7-10 years in software development, with 4+ years in Python, Django, and Golang. Strong in MySQL, REST APIs, Git, and Kafka. Experience with RabbitMQ and Redis. Solid in Agile, testing, and backend design. Must be an individual contributor.

Posted 2 months ago

Apply

4 - 8 years

10 - 14 Lacs

Bengaluru

Work from Office

About the role: We are seeking a highly skilled Domain Expert in Condition Monitoring to join our team and play a pivotal role in advancing predictive maintenance strategies for electrical equipment. This position focuses on leveraging cutting-edge machine learning and data analytics techniques to design and implement scalable solutions that optimize maintenance processes, enhance equipment reliability, and support operational efficiency. As part of this role, you will apply your expertise in predictive modeling, supervised and unsupervised learning, and advanced data analysis to uncover actionable insights from high-dimensional datasets. You will collaborate with cross-functional teams to translate business requirements into data-driven solutions that surpass customer expectations. If you have a passion for innovation and sustainability in the industrial domain, this is an opportunity to make a meaningful impact.

Key Responsibilities:
- Develop and implement predictive maintenance models using a variety of supervised and unsupervised learning techniques.
- Analyze high-dimensional datasets to identify patterns and correlations that can inform maintenance strategies.
- Utilize linear methods for regression and classification, as well as advanced techniques such as splines, wavelets, and kernel methods.
- Conduct model assessment and selection, focusing on bias, variance, overfitting, and cross-validation.
- Apply ensemble learning techniques, including Random Forest and Boosting, to improve model accuracy and robustness.
- Implement structured methods for supervised learning, including additive models, trees, neural networks, and support vector machines.
- Explore unsupervised learning methods such as cluster analysis, principal component analysis, and self-organizing maps to uncover insights from data.
- Engage in directed and undirected graph modeling to represent and analyze complex relationships within the data.
- Collaborate with cross-functional teams to translate business requirements into data-driven solutions.
- Communicate findings and insights to stakeholders, providing actionable recommendations for maintenance optimization.

Mandatory Requirements:
- Master's degree or Ph.D. in Data Science, Statistics, Computer Science, Engineering, or a related field.
- Proven experience in predictive modeling and machine learning, particularly in the context of predictive maintenance.
- Strong programming skills in languages such as Python, R, or similar, with experience in relevant libraries (e.g., scikit-learn, TensorFlow, Keras).
- Familiarity with data visualization tools and techniques to effectively communicate complex data insights.
- Experience with big data technologies and frameworks (e.g., Hadoop, Spark) is a plus.
- Excellent problem-solving skills and the ability to work independently as well as part of a team.
- Strong communication skills, with the ability to convey technical concepts to non-technical stakeholders.

Good to Have: Experience in industrial software and enterprise solutions.

Preferred Skills & Attributes:
- Strong understanding of modern software architectures and DevOps principles.
- Ability to analyze complex problems and develop effective solutions.
- Excellent communication and teamwork skills, with experience in cross-functional collaboration.
- Self-motivated and capable of working independently on complex projects.

About the Team: Become a part of our mission for sustainability: clean energy for generations to come. We are a global team of diverse colleagues who share a passion for renewable energy and have a culture of trust and empowerment to make our own ideas a reality. We focus on personal and professional development to grow internally within our organization.

Who is Siemens Energy? At Siemens Energy, we are more than just an energy technology company. We meet the growing energy demand across 90+ countries while ensuring our climate is protected. With more than 96,000 dedicated employees, we not only generate electricity for over 16% of the global community, but we're also using our technology to help protect people and the environment. Our global team is committed to making sustainable, reliable, and affordable energy a reality by pushing the boundaries of what is possible. We uphold a 150-year legacy of innovation that encourages our search for people who will support our focus on decarbonization, new technologies, and energy transformation.
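The model assessment duties above (bias, variance, overfitting, cross-validation) rest on one mechanical step: splitting the data into k folds so each fold is held out for validation exactly once. A minimal sketch of the k-fold index split in plain Python (in practice a library such as scikit-learn provides this; the fold count and sample size here are illustrative):

```python
def k_fold_indices(n_samples: int, k: int) -> list[tuple[list[int], list[int]]]:
    # Assign indices 0..n_samples-1 to k interleaved folds; each fold
    # serves once as the validation set while the rest form the training set.
    folds = [list(range(i, n_samples, k)) for i in range(k)]
    splits = []
    for i, val in enumerate(folds):
        train = [idx for j, fold in enumerate(folds) if j != i for idx in fold]
        splits.append((sorted(train), sorted(val)))
    return splits

# With 10 samples and 5 folds, every split trains on 8 samples
# and validates on the 2 held-out samples.
splits = k_fold_indices(10, 5)
for train, val in splits:
    assert len(train) == 8 and len(val) == 2
```

Averaging a model's validation score over the k splits gives the bias/variance picture the posting refers to: a large gap between training and validation performance signals overfitting.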

Posted 2 months ago

Apply

7 - 12 years

35 - 40 Lacs

Pune

Work from Office

About The Role:
Job Title: Business Functional Analyst (Analytics)
Corporate Title: Vice President
Location: Pune, India

Role Description: The ERM (Enterprise Risk Management) & MVRM (Market & Valuation Risk Management) IT groups are part of Technology, Data and Innovation and own and deliver the RiskFinder platform to multiple stakeholders and sponsors. RiskFinder is the Bank's risk and capital management platform. It provides the capability to calculate capital metrics and performs risk scenario analysis, portfolio risk analytics, and related control functions across the Bank's business lines. The system calculates over 600 billion scenarios per day on a high-performance compute grid, stores the results in a big data store, and gives our end users the capability to aggregate, report on, and analyse the results. RiskFinder integrates distributed high-performance grid compute and big data technologies to deliver execution and analytics at the very large scale required to process these volumes of scenarios within the required timeframes. The platform leverages in-house quantitative analytics and inputs to our front-office pricing models to deliver full revaluation-based capital metrics across a complex derivatives portfolio. Our technology stack includes Java, C, C++, Postgres, Oracle DB, Lua, Python, Scala, and Spark, plus off-the-shelf products such as caching solutions integrated into one platform, which offers great opportunity for technical development and personal growth in a domain focused on engineering and Agile delivery practices.

What we'll offer you: As part of our flexible scheme, here are just some of the benefits that you'll enjoy:
- Best-in-class leave policy
- Gender-neutral parental leave
- 100% reimbursement under the childcare assistance benefit (gender neutral)
- Sponsorship for industry-relevant certifications and education
- Employee Assistance Program for you and your family members
- Comprehensive hospitalization insurance for you and your dependents
- Accident and term life insurance
- Complimentary health screening for ages 35 and above

Your key responsibilities:
- Develop a sound knowledge of the business requirements around market and credit risk calculations to be implemented in the strategic risk platform.
- Liaise with key stakeholders to understand and document business requirements for the strategic risk platform.
- Collaborate with business representatives and product leads to define optimal system solutions that meet business requirements.
- Continuously improve data visualisation, dashboard, and reporting capabilities.
- Drive the breakdown and prioritization of system deliverables across the applications that make up the strategic risk analytical platform.
- Provide subject-matter expertise to the development teams to convey the business objectives of requirements and help make decisions on implementation specifics.

Your skills and experience:
- Excellent business knowledge, especially of market and counterparty risk processes and methodologies, regulatory RWA calculations and reporting, and derivatives pricing and risk management.
- Strong business analysis and problem-solving skills; effective communication and presentation skills.
- Exposure to software development lifecycle methodologies (waterfall, Agile, etc.).
- Data analysis, use of databases, and data modelling. Working knowledge of SQL, Python, PySpark, or similar tools for data analysis and drill-down is a must.
- Prior experience of leading a team by example would be highly beneficial.
- Experience in product management, building a product backlog, and understanding and executing a roadmap.

How we'll support you:
- Training and development to help you excel in your career.
- Coaching and support from experts in your team.
- A culture of continuous learning to aid progression.
- A range of flexible benefits that you can tailor to suit your needs.

About us and our teams: Please visit our company website for further information: https://www.db.com/company/company.htm We strive for a culture in which we are empowered to excel together every day. This includes acting responsibly, thinking commercially, taking initiative, and working collaboratively. Together we share and celebrate the successes of our people. Together we are Deutsche Bank Group. We welcome applications from all people and promote a positive, fair and inclusive work environment.

Posted 2 months ago

Apply

5 - 8 years

25 - 30 Lacs

Hyderabad

Work from Office

Ecodel Infotel Pvt Ltd is looking for a Data Engineer to join our dynamic team and embark on a rewarding career journey:
- Liaising with coworkers and clients to elucidate the requirements for each task.
- Conceptualizing and generating infrastructure that allows big data to be accessed and analyzed.
- Reformulating existing frameworks to optimize their functioning.
- Testing such structures to ensure that they are fit for use.
- Preparing raw data for manipulation by data scientists.
- Detecting and correcting errors in your work.
- Ensuring that your work remains backed up and readily accessible to relevant coworkers.
- Remaining up to date with industry standards and technological advancements that will improve the quality of your outputs.

Posted 2 months ago

Apply

5 - 7 years

0 - 0 Lacs

Thiruvananthapuram

Work from Office

Key Responsibilities:
- Big Data Architecture: Design, develop, and maintain scalable, distributed data architectures capable of processing large volumes of data.
- Data Storage Solutions: Implement and optimize data storage solutions using technologies such as Hadoop, Spark, and PySpark.
- PySpark Development: Develop and implement efficient ETL processes using PySpark to extract, transform, and load large datasets.
- Performance Optimization: Optimize PySpark applications for better performance, scalability, and resource management.

Qualifications:
- Proven experience as a Big Data Engineer with a strong focus on PySpark.
- Deep understanding of big data processing frameworks and technologies.
- Strong proficiency in PySpark for developing and optimizing ETL processes and data transformations.
- Experience with distributed computing and parallel processing.
- Ability to collaborate in a fast-paced, innovative environment.

Required Skills: PySpark, Big Data, Python
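The PySpark ETL responsibilities above follow the classic extract-transform-load pattern. A minimal sketch of that pattern using only the Python standard library (PySpark would express the same three steps with its DataFrame APIs at cluster scale; the sample data and field names here are illustrative):

```python
import csv
import io

def extract(raw_csv: str) -> list[dict]:
    # Extract: parse raw CSV text into records.
    return list(csv.DictReader(io.StringIO(raw_csv)))

def transform(records: list[dict]) -> list[dict]:
    # Transform: drop malformed rows, normalize casing, cast types.
    out = []
    for r in records:
        if not r.get("amount"):
            continue  # skip rows missing a required field
        out.append({"city": r["city"].strip().title(),
                    "amount": float(r["amount"])})
    return out

def load(records: list[dict]) -> dict:
    # Load: here, aggregate into an in-memory "table" keyed by city.
    table: dict = {}
    for r in records:
        table[r["city"]] = table.get(r["city"], 0.0) + r["amount"]
    return table

raw = "city,amount\npune,100\nPUNE,50\nmumbai,\nMumbai,70\n"
result = load(transform(extract(raw)))
print(result)  # {'Pune': 150.0, 'Mumbai': 70.0}
```

The performance-optimization responsibility then concerns where each step runs: in PySpark the transform is distributed across executors, so partitioning and avoiding shuffles dominate tuning.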

Posted 2 months ago

Apply

6 - 10 years

8 - 12 Lacs

Bengaluru

Work from Office

When you join Verizon, you want more out of a career. A place to share your ideas freely, even if they're daring or different. Where the true you can learn, grow, and thrive. At Verizon, we power and empower how people live, work and play by connecting them to what brings them joy. We do what we love, driving innovation, creativity, and impact in the world. Our V Team is a community of people who anticipate, lead, and believe that listening is where learning begins. In crisis and in celebration, we come together, lifting our communities and building trust in how we show up, everywhere and always. Want in? Join the V Team Life.

What you'll be doing...
- Turn ideas into innovative products with design, development, deployment, and support throughout the product/software development life cycle.
- Architect and develop key software components of high-quality products.
- Architect and expand GNSS network-related offerings for the Verizon Location Services products.
- Participate in requirement gathering, idea validation, and concept prototyping.
- Design end-to-end solutions to bring ideas into innovative products.
- Refine product designs to provide an excellent user experience.
- Develop/code key software components of products.
- Integrate key software components with various systems like Thingspace, Frisco, Gizmo, Smart Home, etc.
- Work with system engineers to create system/network designs and architecture.
- Present technical product information to internal audiences and executives.
- Work with performance engineers to refine software design and code to improve performance and capacity.
- Use agile and iterative methods to demo product features and refine the user experience.

What we're looking for...
You'll need to have:
- Bachelor's degree or six or more years of work experience.
- Experience in developing software products.
- Experience working with GNSS technology (RTK, PPP, DGPS, GPS).
- Experience working with RTKLIB greatly desired.
- Experience with agile software development.
- Advanced knowledge of application, data, and infrastructure architecture disciplines.
- Understanding of architecture and design across all systems.
- Experience with Java/J2EE, Spring Boot/MVC, JMS, and Kafka.
- Designing and developing APIs.
- Knowledge of databases (Oracle), Linux/Unix, and NoSQL DBs (e.g. MongoDB, HBase).
- Knowledge of microservice architecture, cloud computing, Docker containers, RESTful APIs, EKS.
- Familiarity with developing and deploying services in AWS.
- Knowledge of object-oriented design, Agile Scrum, and test-driven development.
- Good communication skills and the ability to present technical information in a clear and concise manner.

Even better if you have:
- Experience working with RINEX and RTCM formatted data.

#TPDNONCDIO

Where you'll be working: In this hybrid role, you'll have a defined work location that includes work from home and assigned office days set by your manager.

Scheduled Weekly Hours: 40

Diversity and Inclusion: We're proud to be an equal opportunity employer. At Verizon, we know that diversity makes us stronger. We are committed to a collaborative, inclusive environment that encourages authenticity and fosters a sense of belonging. We strive for everyone to feel valued, connected, and empowered to reach their potential and contribute their best. Check out our page to learn more.

Posted 2 months ago

Apply

- 3 years

2 - 5 Lacs

Bengaluru

Work from Office

JOB DESCRIPTION: We are looking for dedicated, curious, and energetic Software Engineers for Authorize.net in Bengaluru, India who embrace solving complex challenges on a global scale. As an Authorize.net Software Engineer, you will be an integral part of a multi-functional development team inventing, designing, building, testing, and deploying software products that reach a truly global customer base. While building components of innovative payment technology, you will get to see your efforts shaping the digital future of monetary transactions.

The work itself:
- Collaborate cross-functionally to create design artifacts and develop best-in-class software solutions for multiple Visa technical offerings.
- Contribute to product quality improvements, valuable service technology, and new business flows in diverse agile squads.
- Develop robust and scalable products intended for a myriad of customers, including end-user merchants, B2B, and business-to-government solutions.
- Leverage innovative technologies to build the next generation of Payment Services, Transaction Platforms, Real-Time Payments, and Buy Now Pay Later technology.
- Opportunities to make a difference on a global or local scale through mentorship and continued learning opportunities.
- Modernize our systems and deliver innovative online payment solutions.
- Enable process improvements through robust DevOps practices, incorporating comprehensive release-management strategies and optimized CI/CD pipelines.

Essential functions:
- Supports the relationship with product owners to gather and refine requirements for one product task, taking into account existing tools and solutions within a product.
- Begins to develop and design architecture solutions, considering integrations with other solutions.
- Provides relevant knowledge on the development of user documentation of solutions and follows standard processes in user documentation.
- With support from the team, plays a role in the development and delivery of new features within a product.

The skills you bring:
- Energy and Experience: A growth mindset that is curious and passionate about technologies and enjoys challenging projects on a global scale.
- Challenge the Status Quo: Comfort in pushing the boundaries, hacking beyond traditional solutions.
- Builder: Experience building and deploying modern services and web applications with quality and scalability.
- Learner: Constant drive to learn new technologies such as Angular, React, Kubernetes, Docker, etc.
- Partnership: Experience collaborating with Product, Test, DevOps, and Agile/Scrum teams.
- Ownership: Taking full responsibility for tasks, from planning through execution, ensuring high-quality results.

Basic qualifications:
- Bachelor's degree or an advanced degree in Computer Science, Information Systems, or a related field.
- 6 to 18 months of work experience in frontend, backend, or full-stack development is required.
- Hands-on knowledge of one or more programming languages or technologies including, but not limited to, C#, Java, .NET, JavaScript, CSS, React, and building RESTful APIs.
- Knowledge of data structures: data organization, management, and storage formats that enable efficient access and modification.
- A good data-analytical mindset to understand the data and build great products.
- Experience working within a Scrum model.
- Experience with n-tier web application development.
- Prior exposure to SQL and/or NoSQL data stores (e.g., HBase, Cassandra) is beneficial.
- Hands-on knowledge of microservices, containers, and cloud platforms (e.g., AWS, Azure, or GCP) is a plus.
- Experience using GenAI tools in the SDLC is a plus.
- Experience with merchant data or payment technology is a plus.

Posted 2 months ago

Apply

5 - 7 years

0 - 0 Lacs

Hyderabad

Work from Office

Senior Big Data Engineer
Experience: 7-9 years of experience. Preferred location: Hyderabad.
Must-have skills: Big Data, AWS Cloud, Java/Scala/Python, CI/CD.
Good-to-have skills: Relational databases (any), NoSQL databases (any), microservices, domain services, API gateways or similar, containers (Docker, K8s, etc.).
Required Skills: Big Data, AWS Cloud, CI/CD, Java/Scala/Python

Posted 2 months ago

Apply

2 - 5 years

6 - 10 Lacs

Gurugram

Work from Office

KDataScience (USA & INDIA) is looking for a Data Engineer to join our dynamic team and embark on a rewarding career journey:
- Liaising with coworkers and clients to elucidate the requirements for each task.
- Conceptualizing and generating infrastructure that allows big data to be accessed and analyzed.
- Reformulating existing frameworks to optimize their functioning.
- Testing such structures to ensure that they are fit for use.
- Preparing raw data for manipulation by data scientists.
- Detecting and correcting errors in your work.
- Ensuring that your work remains backed up and readily accessible to relevant coworkers.
- Remaining up to date with industry standards and technological advancements that will improve the quality of your outputs.

Posted 2 months ago

Apply

5 - 10 years

15 - 20 Lacs

Bengaluru

Work from Office

locationsIndia, Bangalore time typeFull time posted onPosted 30+ Days Ago job requisition idJR0034276 Job Title: Big Data Architect About Skyhigh Security Skyhigh Security is a dynamic, fast-paced, cloud company that is a leader in the security industry. Our mission is to protect the worlds data, and because of this, we live and breathe security. We value learning at our core, underpinned by openness and transparency. Since 2011, organizations have trusted us to provide them with a complete, market-leading security platform built on a modern cloud stack. Our industry-leading suite of products radically simplifies data security through easy-to-use, cloud-based, Zero Trust solutions that are managed in a single dashboard, powered by hundreds of employees across the world. With offices in Santa Clara, Aylesbury, Paderborn, Bengaluru, Sydney, Tokyo and more, our employees are the heart and soul of our company. Skyhigh Security Is more than a company; here, when you invest your career with us, we commit to investing in you. We embrace a hybrid work model, creating the flexibility and freedom you need from your work environment to reach your potential. From our employee recognition program, to our Blast Talks' learning series, and team celebrations (we love to have fun!), we strive to be an interactive and engaging place where you can be your authentic self. We are on these too! Follow us on and Twitter . Role Overview: The Big Data Architect will be responsible for the design, implementation, and management of the organizations big data infrastructure. The ideal candidate will have a strong technical background in big data technologies, excellent problem-solving skills, and the ability to work in a fast-paced environment. The role requires a deep understanding of data architecture, data modeling, and data integration techniques. About the Role: Design and implement scalable and efficient big data architecture solutions to meet business requirements. 
Develop and maintain data pipelines, ensuring the availability and quality of data. Collaborate with data scientists, data engineers, and other stakeholders to understand data needs and provide technical solutions. Lead the evaluation and selection of big data tools and technologies. Ensure data security and privacy compliance. Optimize and tune big data systems for performance and cost-efficiency. Document data architecture, data flows, and processes. Stay up to date with the latest industry trends and best practices in big data technologies.

About You: Bachelor's or Master's degree in Computer Science, Information Technology, or a related field. 10+ years of overall experience, with 5+ years in big data architecture and engineering. Proficiency in big data technologies such as Hadoop MapReduce, Spark (batch and streaming), Kafka, HBase, Scala, Elasticsearch and others. Experience with the AWS cloud platform. Strong knowledge of data modeling, ETL processes, and data warehousing. Proficiency in programming languages such as Java, Scala, and Spark. Familiarity with data visualization tools and techniques. Excellent communication and collaboration skills. Strong problem-solving abilities and attention to detail.

Company Benefits and Perks: We work hard to embrace diversity and inclusion and encourage everyone to bring their authentic selves to work every day. We offer a variety of social programs, flexible work hours and family-friendly benefits to all of our employees. Retirement Plans; Medical, Dental and Vision Coverage; Paid Time Off; Paid Parental Leave; Support for Community Involvement. We're serious about our commitment to diversity, which is why we prohibit discrimination based on race, color, religion, gender, national origin, age, disability, veteran status, marital status, pregnancy, gender expression or identity, sexual orientation or any other legally protected status.

Posted 2 months ago

Apply

5 - 9 years

9 - 14 Lacs

Bengaluru

Work from Office

Location: India, Bangalore | Time type: Full time | Posted: 3 Days Ago | Job requisition ID: JR0035427

Job Title: Lead DevOps Engineer

About Skyhigh Security: Skyhigh Security is a dynamic, fast-paced cloud company that is a leader in the security industry. Our mission is to protect the world's data, and because of this, we live and breathe security. We value learning at our core, underpinned by openness and transparency. Since 2011, organizations have trusted us to provide them with a complete, market-leading security platform built on a modern cloud stack. Our industry-leading suite of products radically simplifies data security through easy-to-use, cloud-based, Zero Trust solutions that are managed in a single dashboard, powered by hundreds of employees across the world. With offices in Santa Clara, Aylesbury, Paderborn, Bengaluru, Sydney, Tokyo and more, our employees are the heart and soul of our company. Skyhigh Security is more than a company; when you invest your career with us, we commit to investing in you. We embrace a hybrid work model, creating the flexibility and freedom you need from your work environment to reach your potential. From our employee recognition program to our 'Blast Talks' learning series and team celebrations (we love to have fun!), we strive to be an interactive and engaging place where you can be your authentic self.

Role Overview: We are hiring a DevOps Engineer who will improve and maintain software development, test and live infrastructure and services.

About the Role: You will work closely with the development teams, follow DevOps practices, automate infrastructure activities, and document standards and procedures. Drawing on your creative nature, you will drive the existing automation frameworks forward to benefit automation across multiple products, and may be asked to create new frameworks depending on need.
Provide visibility on application health by defining dashboards and metric/log aggregation mechanisms. Code using Python and AWS CloudFormation (mandatory). Learn new tools quickly and become a subject matter expert. Improve site performance, monitoring and overall stability of our infrastructure. Evaluate new technologies and provide proofs of concept. Additional duties as assigned.

About You: Strong knowledge of Linux systems administration and architecture. Experience configuring, managing and supporting virtualized environments. Experience with continuous integration and deployment automation tools such as Jenkins, Salt, Puppet, Chef and Ansible. Experience with SQL (MySQL) and NoSQL databases (Redis). Experience with Kafka, RabbitMQ and Elasticsearch. Experience with Hadoop, HBase, ZooKeeper and Oozie (not mandatory). Extensive scripting experience with Python. Experience supporting, analyzing and troubleshooting large-scale distributed mission-critical systems. A systematic problem-solving approach and a strong sense of ownership to drive problems to resolution.

Company Benefits and Perks: We believe that the best solutions are developed by teams who embrace each other's unique experiences, skills, and abilities. We work hard to create a dynamic workforce where we encourage everyone to bring their authentic selves to work every day. We offer a variety of social programs, flexible work hours and family-friendly benefits to all of our employees.
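The "dashboards and metric/log aggregation" duty above can be illustrated with a minimal sketch: a pure-Python aggregation of error counts per service, the kind of number a health dashboard would chart. The log format and field positions here are assumptions of the example, not anything from the posting.

```python
from collections import Counter

def aggregate_errors(log_lines):
    """Count ERROR-level entries per service from 'service LEVEL message' lines.

    The 'service LEVEL message' layout is a hypothetical log format chosen
    for illustration; a real pipeline would parse whatever the apps emit.
    """
    counts = Counter()
    for line in log_lines:
        parts = line.split(maxsplit=2)
        if len(parts) >= 2 and parts[1] == "ERROR":
            counts[parts[0]] += 1
    return dict(counts)

logs = [
    "auth ERROR token expired",
    "auth INFO login ok",
    "billing ERROR charge failed",
    "auth ERROR bad password",
]
print(aggregate_errors(logs))  # {'auth': 2, 'billing': 1}
```

In practice this aggregation would run inside a log shipper or metrics backend rather than ad hoc, but the per-key counting shape is the same.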
Retirement Plans; Medical, Dental and Vision Coverage; Paid Time Off; Paid Parental Leave; Support for Community Involvement. We're serious about our commitment to a workplace where everyone can thrive and contribute to our industry-leading products and customer support, which is why we prohibit discrimination and harassment based on race, color, religion, gender, national origin, age, disability, veteran status, marital status, pregnancy, gender expression or identity, sexual orientation or any other legally protected status.

Posted 2 months ago

Apply

9 - 11 years

37 - 40 Lacs

Ahmedabad, Bengaluru, Mumbai (All Areas)

Work from Office

Dear Candidate, We are hiring a Scala Developer to work on high-performance distributed systems, leveraging the power of functional and object-oriented paradigms. This role is perfect for engineers passionate about clean code, concurrency, and big data pipelines.

Key Responsibilities: Build scalable backend services using Scala and the Play or Akka frameworks. Write concurrent and reactive code for high-throughput applications. Integrate with Kafka, Spark, or Hadoop for data processing. Ensure code quality through unit tests and property-based testing. Work with microservices, APIs, and cloud-native deployments.

Required Skills & Qualifications: Proficient in Scala, with a strong grasp of functional programming. Experience with Akka, Play, or Cats. Familiarity with Big Data tools and RESTful API development. Bonus: experience with ZIO, Monix, or Slick.

Soft Skills: Strong troubleshooting and problem-solving skills. Ability to work independently and in a team. Excellent communication and documentation skills.

Note: If interested, please share your updated resume and preferred time for a discussion. If shortlisted, our HR team will contact you. Kandi Srinivasa Reddy, Delivery Manager, Integra Technologies

Posted 2 months ago

Apply

4 - 9 years

14 - 18 Lacs

Kochi

Work from Office

As a Data Engineer, you will develop, maintain, evaluate and test big data solutions. You will be involved in the development of data solutions using the Spark framework with Python or Scala on Hadoop and the Azure Cloud Data Platform. You should be experienced in building data pipelines to ingest, process, and transform data from files, streams and databases; in processing data with Spark, Python, PySpark, Hive, HBase or other NoSQL databases on the Azure Cloud Data Platform or HDFS; in developing efficient software code for multiple use cases leveraging the Spark framework with Python or Scala and Big Data technologies; and in developing streaming pipelines. Experience working with Hadoop/Azure ecosystem components to implement scalable solutions that meet ever-increasing data volumes, using big data and cloud technologies such as Apache Spark and Kafka, is expected.

Required education: Bachelor's Degree. Preferred education: Master's Degree.

Required technical and professional expertise: Total 6-7+ years of experience in Data Management (DW, DL, Data Platform, Lakehouse) and Data Engineering skills. Minimum 4+ years of experience in Big Data technologies with extensive data engineering experience in Spark with Python or Scala. Minimum 3 years of experience on Cloud Data Platforms on Azure. Experience in Databricks, Azure HDInsight, Azure Data Factory, Synapse and SQL Server DB. Good to excellent SQL skills.

Preferred technical and professional experience: Certification in Azure and Databricks, or Cloudera Spark Certified Developer. Experience in Databricks, Azure HDInsight, Azure Data Factory, Synapse and SQL Server DB. Knowledge or experience of Snowflake will be an added advantage.
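The ingest/process/transform pipeline work this role describes follows the classic extract-transform-load shape. Here is a framework-free sketch of that shape using only the standard library; the column names, cleaning rules, and inline data are invented for illustration — an actual pipeline on this stack would express the same steps in Spark/PySpark over real sources.

```python
import csv
import io

# Hypothetical raw input standing in for a file landed in the data lake.
RAW = """id,amount,country
1, 10.5 ,IN
2,,US
3,7.25,in
"""

def extract(text):
    """Ingest: parse the raw file into dict records."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    """Process: drop incomplete records, coerce types, normalize values."""
    out = []
    for r in rows:
        amt = r["amount"].strip()
        if not amt:  # drop records with a missing amount
            continue
        out.append({
            "id": int(r["id"]),
            "amount": float(amt),
            "country": r["country"].strip().upper(),
        })
    return out

def load(rows):
    """Load: write to a sink; a dict keyed by id stands in for a real table."""
    return {r["id"]: r for r in rows}

table = load(transform(extract(RAW)))
print(table[3]["country"])  # IN
```

Record 2 is dropped for its missing amount, and record 3's lowercase country code is normalized, mirroring the kind of cleansing a real transform stage performs at scale.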

Posted 2 months ago

Apply

4 - 9 years

12 - 16 Lacs

Hyderabad

Work from Office

As a Data Engineer, you will develop, maintain, evaluate and test big data solutions. You will be involved in the development of data solutions using the Spark framework with Python or Scala on Hadoop and the AWS Cloud Data Platform. You should be experienced in building data pipelines to ingest, process, and transform data from files, streams and databases; in processing data with Spark, Python, PySpark, Scala, Hive, HBase or other NoSQL databases on Cloud Data Platforms (AWS) or HDFS; in developing efficient software code for multiple use cases leveraging the Spark framework with Python or Scala and Big Data technologies; and in developing streaming pipelines. Experience working with Hadoop/AWS ecosystem components to implement scalable solutions that meet ever-increasing data volumes, using big data and cloud technologies such as Apache Spark and Kafka, is expected.

Required education: Bachelor's Degree. Preferred education: Master's Degree.

Required technical and professional expertise: Total 5-7+ years of experience in Data Management (DW, DL, Data Platform, Lakehouse) and Data Engineering skills. Minimum 4+ years of experience in Big Data technologies with extensive data engineering experience in Spark with Python or Scala. Minimum 3 years of experience on Cloud Data Platforms on AWS. Exposure to streaming solutions and message brokers like Kafka. Experience in AWS EMR, AWS Glue, Databricks, AWS Redshift and DynamoDB. Good to excellent SQL skills.

Preferred technical and professional experience: Certification in AWS and Databricks, or Cloudera Spark Certified Developer. AWS S3, Redshift, and EMR for data storage and distributed processing. AWS Lambda, AWS Step Functions, and AWS Glue to build serverless, event-driven data workflows and orchestrate ETL processes.
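The streaming pipelines this posting asks for boil down to maintaining incremental state over an unbounded flow of events. As a toy stand-in (no Kafka or Spark; the window size and event names are assumptions of the example), the sketch below keeps per-key counts over a sliding window of the last N events — the same shape a Spark Structured Streaming windowed aggregation would compute at scale.

```python
from collections import deque

class SlidingCounter:
    """Per-key counts over the last `window` events (illustrative only)."""

    def __init__(self, window=3):
        # deque with maxlen silently evicts the oldest event when full
        self.window = deque(maxlen=window)

    def add(self, key):
        self.window.append(key)

    def counts(self):
        out = {}
        for k in self.window:
            out[k] = out.get(k, 0) + 1
        return out

c = SlidingCounter(window=3)
for event in ["click", "view", "click", "click"]:
    c.add(event)
print(c.counts())  # {'view': 1, 'click': 2}
```

The first "click" has already been evicted by the time the fourth event arrives, which is the essence of windowed streaming state: old contributions age out as new events flow in.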

Posted 2 months ago

Apply

5 - 10 years

5 - 9 Lacs

Bengaluru

Work from Office

Project Role: Application Developer. Project Role Description: Design, build and configure applications to meet business process and application requirements. Must-have skills: Data Engineering. Good-to-have skills: NA. Minimum 5 year(s) of experience is required. Educational Qualification: 15 years full time education.

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with various teams to understand their needs, developing solutions, and ensuring that applications function seamlessly to support business operations. You will engage in problem-solving activities and contribute to the overall success of projects by leveraging your expertise in application development.

Roles & Responsibilities: Expected to be an SME. Collaborate with and manage the team to perform. Responsible for team decisions. Engage with multiple teams and contribute to key decisions. Provide solutions to problems for the immediate team and across multiple teams. Facilitate knowledge-sharing sessions to enhance team capabilities. Monitor project progress and ensure alignment with business objectives.

Professional & Technical Skills: Must-have skills: proficiency in Data Engineering. Strong understanding of data modeling and ETL processes. Experience with cloud platforms such as AWS or Azure. Familiarity with data warehousing solutions and big data technologies. Ability to work with various programming languages relevant to data engineering.

Additional Information: The candidate should have a minimum of 5 years of experience in Data Engineering. This position is based at our Bengaluru office. A 15 years full time education is required.

Posted 2 months ago

Apply

10 - 20 years

30 - 35 Lacs

Navi Mumbai

Work from Office

Job Title: Big Data Developer Project Support & Mentorship Location: Mumbai Employment Type: Full-Time/Contract Department: Engineering & Delivery Position Overview: We are seeking a skilled Big Data Developer to join our growing delivery team, with a dual focus on hands-on project support and mentoring junior engineers. This role is ideal for a developer who not only thrives in a technical, fast-paced environment but is also passionate about coaching and developing the next generation of talent. You will work on live client projects, provide technical support, contribute to solution delivery, and serve as a go-to technical mentor for less experienced team members. Key Responsibilities: Perform hands-on Big Data development work, including coding, testing, troubleshooting, and deploying solutions. Support ongoing client projects, addressing technical challenges and ensuring smooth delivery. Collaborate with junior engineers to guide them on coding standards, best practices, debugging, and project execution. Review code and provide feedback to junior engineers to maintain high quality and scalable solutions. Assist in designing and implementing solutions using Hadoop, Spark, Hive, HDFS, and Kafka. Lead by example in object-oriented development, particularly using Scala and Java. Translate complex requirements into clear, actionable technical tasks for the team. Contribute to the development of ETL processes for integrating data from various sources. Document technical approaches, best practices, and workflows for knowledge sharing within the team. Required Skills and Qualifications: 8+ years of professional experience in Big Data development and engineering. Strong hands-on expertise with Hadoop, Hive, HDFS, Apache Spark, and Kafka. Solid object-oriented development experience with Scala and Java. Strong SQL skills with experience working with large data sets. Practical experience designing, installing, configuring, and supporting Big Data clusters. 
Deep understanding of ETL processes and data integration strategies. Proven experience mentoring or supporting junior engineers in a team setting. Strong problem-solving, troubleshooting, and analytical skills. Excellent communication and interpersonal skills.

Preferred Qualifications: Professional certifications in Big Data technologies (Cloudera, Databricks, AWS Big Data Specialty, etc.). Experience with cloud Big Data platforms (AWS EMR, Azure HDInsight, or GCP Dataproc). Exposure to Agile or DevOps practices in Big Data project environments.

What We Offer: Opportunity to work on challenging, high-impact Big Data projects. Leadership role in shaping and mentoring the next generation of engineers. Supportive and collaborative team culture. Flexible working environment. Competitive compensation and professional growth opportunities.

Posted 2 months ago

Apply

2 - 5 years

11 - 13 Lacs

Kolkata, Mumbai, New Delhi

Work from Office

The person in this position has gained significant work experience and is able to apply their knowledge effectively and deliver results. They can analyse and interpret complex problems and improve, change or adapt existing methods to solve them. They regularly interact with interfacing groups and customers to clarify and resolve technical issues, participate actively in important project and work-related activities, and contribute towards identifying important issues and risks, reaching out for guidance and advice to ensure high quality of deliverables. They consistently seek opportunities to enhance their existing skills, acquire more complex skills and work towards enhancing their proficiency in their field of specialisation. Works under limited supervision of the Team Lead/Project Manager.

Roles & Responsibilities: Responsible for design, coding, testing, bug fixing, documentation and technical support in the assigned area. Responsible for on-time delivery while adhering to quality and productivity goals. Responsible for adhering to guidelines and checklists for all deliverable reviews, sending status reports to the team lead and following relevant organizational processes. Responsible for customer collaboration and interactions and support for customer queries. Expected to enhance technical capabilities by attending trainings, self-study and periodic technical assessments. Expected to participate in technical initiatives related to the project and organization and deliver training as per plan and quality.
Education and Experience Required: Engineering graduate, MCA, etc. Experience: 2-5 years.

Competencies Description: The Data Engineering TCB is applicable to one who 1) creates databases and storage for relational and non-relational data sources, 2) develops data pipelines (ETL/ELT) to clean, transform and merge data sources into a usable format, 3) creates a reporting layer with pre-packaged scheduled reports, dashboards and charts for self-service BI, 4) has experience on cloud platforms such as AWS, Azure and GCP in implementing data workflows, and 5) has experience with tools like MongoDB, Hive, HBase, Spark, Tableau, PowerBI, Python, Scala, SQL, Elasticsearch, etc.

Platforms: AWS, Azure, GCP. Technology Standard: NA. Tools: MongoDB, Hive, HBase, Tableau, PowerBI, Elasticsearch, QlikView. Languages: Python, R, Spark, Scala, SQL. Specialization: DWH, Big Data Engineering, Edge Analytics.

Must-have Skills
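Point 2 of the competency list — cleaning, transforming and merging data sources into a usable format — can be sketched as a plain-Python inner join on a shared key. The system names, keys, and fields below are hypothetical, chosen only to show the merge step.

```python
# Two hypothetical id-keyed sources, e.g. a CRM extract and a billing extract.
crm = {101: {"name": "Asha"}, 102: {"name": "Ravi"}}
billing = {101: {"plan": "pro"}, 103: {"plan": "free"}}

def merge_sources(left, right):
    """Inner-join two id-keyed dicts into combined records.

    Only ids present in both sources survive, matching SQL INNER JOIN
    semantics; an outer join would instead keep unmatched rows too.
    """
    merged = {}
    for key in left.keys() & right.keys():  # set intersection of the ids
        merged[key] = {**left[key], **right[key]}
    return merged

print(merge_sources(crm, billing))  # {101: {'name': 'Asha', 'plan': 'pro'}}
```

In the tools the posting lists, the same operation would be a Spark `join` or a SQL `JOIN`; the dict version just makes the key-matching logic visible.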

Posted 3 months ago

Apply

4 - 9 years

3 - 7 Lacs

Hyderabad

Work from Office

Data Engineer | Full-Time | 4+ years

Responsibilities: Design, develop, and maintain data pipelines and ETL processes. Build and optimize data architectures for analytics and reporting. Collaborate with data scientists and analysts to support data-driven initiatives. Implement data security and governance best practices. Monitor and troubleshoot data infrastructure and ensure high availability.

Skills: Proficiency in data engineering tools (Hadoop, Spark, Kafka, etc.). Strong SQL and programming skills (Python, Java, etc.). Experience with cloud platforms (AWS, Azure, GCP). Knowledge of data modeling, warehousing, and ETL processes. Strong problem-solving and analytical abilities.

Posted 3 months ago

Apply

7 - 11 years

50 - 60 Lacs

Mumbai, Delhi / NCR, Bengaluru

Work from Office

Role: Resident Solution Architect. Location: Remote.

The Solution Architect at Koantek builds secure, highly scalable big data solutions to achieve tangible, data-driven outcomes, all the while keeping simplicity and operational effectiveness in mind. This role collaborates with teammates, product teams, and cross-functional project teams to lead the adoption and integration of the Databricks Lakehouse Platform into the enterprise ecosystem and AWS/Azure/GCP architecture. The role is responsible for implementing securely architected big data solutions that are operationally reliable, performant, and deliver on strategic initiatives.

Specific requirements for the role include: Expert-level knowledge of data frameworks, data lakes and open-source projects such as Apache Spark, MLflow, and Delta Lake. Expert-level hands-on coding experience in Python, SQL, Spark/Scala or PySpark. In-depth understanding of Spark architecture, including Spark Core, Spark SQL, DataFrames, Spark Streaming, RDD caching and Spark MLlib. IoT/event-driven/microservices in the cloud: experience with private and public cloud architectures, their pros and cons, and migration considerations. Extensive hands-on experience implementing data migration and data processing using AWS/Azure/GCP services. Extensive hands-on experience with the technology stack available in the industry for data management, data ingestion, capture, processing, and curation: Kafka, StreamSets, Attunity, GoldenGate, MapReduce, Hadoop, Hive, HBase, Cassandra, Spark, Flume, Impala, etc. Experience using Azure DevOps and CI/CD as well as Agile tools and processes including Git, Jenkins, Jira, and Confluence. Experience in creating tables, partitioning, bucketing, loading and aggregating data using Spark SQL/Scala. Able to build ingestion to ADLS and enable a BI layer for analytics, with a strong understanding of data modeling and defining conceptual, logical and physical data models. Proficient-level experience with architecture design, build and optimization of big data collection, ingestion, storage, processing, and visualization.

Responsibilities: Work closely with team members to lead and drive enterprise solutions, advising on key decision points, trade-offs, best practices, and risk mitigation. Guide customers in transforming big data projects, including development and deployment of big data and AI applications. Promote, emphasize, and leverage big data solutions to deploy performant systems that appropriately auto-scale, are highly available, fault-tolerant, self-monitoring, and serviceable. Use a defense-in-depth approach in designing data solutions and AWS/Azure/GCP infrastructure. Assist and advise data engineers in the preparation and delivery of raw data for prescriptive and predictive modeling. Aid developers to identify, design, and implement process improvements with automation tools to optimize data delivery. Implement processes and systems to monitor data quality and security, ensuring production data is accurate and available for key stakeholders and the business processes that depend on it. Employ change management best practices to ensure that data remains readily accessible to the business. Implement reusable design templates and solutions to integrate, automate, and orchestrate cloud operational needs, with experience with MDM using data governance solutions.

Qualifications: Overall experience of 12+ years in the IT field. Hands-on experience designing and implementing multi-tenant solutions using Azure Databricks for data governance, data pipelines for near-real-time data warehouses, and machine learning solutions. Design and development experience with scalable and cost-effective Microsoft Azure/AWS/GCP data architecture and related solutions. Experience in a software development, data engineering, or data analytics field using Python, Scala, Spark, Java, or equivalent technologies. Bachelor's or Master's degree in Big Data, Computer Science, Engineering, Mathematics, or a similar area of study, or equivalent work experience.

Good to have: Advanced technical certifications such as Azure Solutions Architect Expert; AWS Certified Data Analytics; DASCA Big Data Engineering and Analytics; AWS Certified Cloud Practitioner, Solutions Architect; Professional Google Cloud Certified.

Location: Mumbai, Delhi / NCR, Bengaluru, Kolkata, Chennai, Hyderabad, Ahmedabad, Pune, Remote
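The responsibility to implement processes that monitor data quality is often realized as a set of declarative rule checks run over each batch before it is published. The sketch below is a hedged standard-library illustration of that pattern — the rule names, record shape, and thresholds are assumptions of the example, not Koantek's actual process (which, per the posting, would sit on Databricks-scale tooling).

```python
def run_quality_checks(rows):
    """Return the names of the quality rules this batch violates."""
    failures = []
    if not rows:
        failures.append("non_empty")
        return failures
    # Rule: every record must carry an id.
    if any(r.get("id") is None for r in rows):
        failures.append("id_not_null")
    # Rule: ids must be unique within the batch.
    ids = [r["id"] for r in rows if r.get("id") is not None]
    if len(ids) != len(set(ids)):
        failures.append("id_unique")
    # Rule: amounts may not be negative.
    if any(r.get("amount", 0) < 0 for r in rows):
        failures.append("amount_non_negative")
    return failures

batch = [{"id": 1, "amount": 5.0}, {"id": 1, "amount": -2.0}]
print(run_quality_checks(batch))  # ['id_unique', 'amount_non_negative']
```

A production pipeline would typically fail or quarantine the batch when the returned list is non-empty and emit the rule names as metrics, so data consumers can trust anything that reaches them.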

Posted 3 months ago

Apply

9 - 13 years

50 - 75 Lacs

Bengaluru

Work from Office

Position Summary... What you'll do... About Team: The Marketplace Engineering team is at the forefront of building core platforms and services to enable Walmart to deliver vast selection at competitive prices and with a best-in-class post-order experience, by enabling third-party sellers to list, sell and manage their products to our customers on walmart.com. We do this by managing the entire seller lifecycle, monitoring customer experience, and delivering high-value insights to our sellers to help them plan their assortment, price and inventory. The team also actively collaborates with partner platform teams to ensure we continue to deliver the best experience to our sellers and our customers. This role will be focused on Marketplace Risk & Fraud Engineering.

What you'll do: Understand business problems and suggest technology solutions. Architect, design, build and deploy technology solutions at scale. Raise the bar on sustainable engineering by improving best practices and producing best-in-class code, documentation, testing and monitoring. Estimate effort, identify risks and plan execution. Mentor/coach other engineers in the team to facilitate their development and provide technical leadership to them. Rise above details as and when needed to spot broader issues/trends and implications for the product/team as a whole.

What you'll bring: 10+ years of experience in design and development of highly scalable applications in product-based companies or R&D divisions. Strong computer systems fundamentals, DS/algorithms and problem-solving skills. 5+ years of experience building microservices using Java. Strong experience with SQL/NoSQL and database technologies (MySQL, MongoDB, HBase, Cassandra, Oracle, PostgreSQL). Experience in systems design and distributed systems. Large-scale distributed services experience, including scalability and fault tolerance.
Excellent organisation, communication and interpersonal skills.

About Walmart Global Tech: Imagine working in an environment where one line of code can make life easier for hundreds of millions of people. That's what we do at Walmart Global Tech. We're a team of software engineers, data scientists, cybersecurity experts and service professionals within the world's leading retailer who make an epic impact and are at the forefront of the next retail disruption. People are why we innovate, and people power our innovations. We are people-led and tech-empowered. We train our team in the skillsets of the future and bring in experts like you to help us grow. We have roles for those chasing their first opportunity as well as those looking for the opportunity that will define their career. Here, you can kickstart a great career in tech, gain new skills and experience for virtually every industry, or leverage your expertise to innovate at scale, impact millions and reimagine the future of retail.

Flexible, hybrid work: We use a hybrid way of working, with primary in-office presence coupled with an optimal mix of virtual presence. We use our campuses to collaborate and be together in person, as business needs require and for development and networking opportunities. This approach helps us make quicker decisions, remove location barriers across our global team, and be more flexible in our personal lives.

Benefits: Beyond our great compensation package, you can receive incentive awards for your performance. Other great perks include a host of best-in-class benefits: maternity and parental leave, PTO, health benefits, and much more.

Belonging: We aim to create a culture where every associate feels valued for who they are, rooted in respect for the individual. Our goal is to foster a sense of belonging, to create opportunities for all our associates, customers and suppliers, and to be a Walmart for everyone. At Walmart, our vision is "everyone included." By fostering a workplace culture where everyone is and feels included, everyone wins. Our associates and customers reflect the makeup of all 19 countries where we operate. By making Walmart a welcoming place where all people feel like they belong, we're able to engage associates, strengthen our business, improve our ability to serve customers, and support the communities where we operate.

Equal opportunity employer: Walmart, Inc., is an equal opportunity employer - by choice. We believe we are best equipped to help our associates, customers and the communities we serve live better when we really know them. That means understanding, respecting and valuing unique styles, experiences, identities, ideas and opinions - while being inclusive of all people.

Minimum Qualifications: Outlined below are the required minimum qualifications for this position. If none are listed, there are no minimum qualifications. Option 1: Bachelor's degree in computer science, computer engineering, computer information systems, software engineering, or a related area and 4 years' experience in software engineering or a related area. Option 2: 6 years' experience in software engineering or a related area.

Preferred Qualifications: Outlined below are the optional preferred qualifications for this position. If none are listed, there are no preferred qualifications. Master's degree in Computer Science, Computer Engineering, Computer Information Systems, Software Engineering, or a related area and 2 years' experience in software engineering or a related area.

Primary Location: 4, 5, 6, 7 Floor, Building 10, SEZ, Cessna Business Park, Kadubeesanahalli Village, Varthur Hobli, India

Posted 3 months ago

Apply

7 - 10 years

30 - 35 Lacs

Pune

Remote

Build scalable Python/Django apps. Develop reusable APIs and components. Integrate REST/SOAP/streaming services. Optimize performance and reliability. Review code, mentor juniors, and work in Agile teams.

Required Candidate profile: 7-10 years in software development, with 4+ years in Python/Django/Golang. Strong in MySQL, REST APIs, Git and Kafka. Experience with RabbitMQ and Redis. Solid in Agile, testing and backend design. Must be an individual contributor.

Posted 3 months ago

Apply

3 - 6 years

14 - 18 Lacs

Bengaluru

Work from Office

As a Data Engineer, you will develop, maintain, evaluate and test big data solutions. You will be involved in the development of data solutions using the Spark framework with Python or Scala on Hadoop and the Azure Cloud Data Platform. You should be experienced in building data pipelines to ingest, process, and transform data from files, streams and databases; in processing data with Spark, Python, PySpark, Hive, HBase or other NoSQL databases on the Azure Cloud Data Platform or HDFS; in developing efficient software code for multiple use cases leveraging the Spark framework with Python or Scala and Big Data technologies; and in developing streaming pipelines. Experience working with Hadoop/Azure ecosystem components to implement scalable solutions that meet ever-increasing data volumes, using big data and cloud technologies such as Apache Spark and Kafka, is expected.

Required education: Bachelor's Degree. Preferred education: Master's Degree.

Required technical and professional expertise: Total 5-8 years of experience in Data Management (DW, DL, Data Platform, Lakehouse) and Data Engineering skills. Minimum 4+ years of experience in Big Data technologies with extensive data engineering experience in Spark with Python or Scala. Minimum 3 years of experience on Cloud Data Platforms on Azure. Experience in Databricks, Azure HDInsight, Azure Data Factory, Synapse and SQL Server DB. Good to excellent SQL skills.

Preferred technical and professional experience: Certification in Azure and Databricks, or Cloudera Spark Certified Developer. Experience in Databricks, Azure HDInsight, Azure Data Factory, Synapse and SQL Server DB. Knowledge or experience of Snowflake will be an added advantage.

Posted 3 months ago

Apply

2 - 5 years

14 - 17 Lacs

Bengaluru

Work from Office

As a Big Data Engineer at IBM, you will harness the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and both batch and real-time processing. Collaborating closely with diverse teams, you'll play an important role in deciding the most suitable data management systems and identifying the crucial data required for insightful analysis. As a Data Engineer, you'll tackle obstacles related to database integration and untangle complex, unstructured data sets.
In this role, your responsibilities may include: as a Big Data Engineer, you will develop, maintain, evaluate, and test big data solutions. You will be involved in data engineering activities like creating pipelines/workflows for source-to-target movement and implementing solutions that tackle the client's needs.
Required education: Bachelor's Degree. Preferred education: Master's Degree.
Required technical and professional expertise: Big Data development with Hadoop, Hive, Spark, PySpark, and strong SQL. Ability to incorporate a variety of statistical and machine learning techniques. Basic understanding of Cloud (AWS, Azure, etc.). Ability to use programming languages like Java, Python, Scala, etc., to build pipelines to extract and transform data from a repository to a data consumer. Ability to use Extract, Transform, and Load (ETL) tools and/or data integration or federation tools to prepare and transform data as needed. Ability to use leading-edge tools such as Linux, SQL, Python, Spark, Hadoop, and Java.
Preferred technical and professional experience: basic understanding of or experience with predictive/prescriptive modeling. You thrive on teamwork and have excellent verbal and written communication skills, with the ability to communicate with internal and external clients to understand and define business needs, providing analytical solutions.
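The Extract, Transform, and Load responsibility above can be sketched end to end. This hedged example uses Python's standard library and an in-memory SQLite table in place of the Hadoop/Hive/Spark stack the listing names; the source data and the `paid_users` table/column names are all hypothetical:

```python
import csv
import io
import sqlite3

# Stand-in for a source repository extract.
SOURCE = """\
user_id,signup_date,plan
u1,2024-01-05,pro
u2,2024-01-06,free
u3,2024-01-06,pro
"""


def extract(raw):
    """Extract: read raw records from the source system."""
    return list(csv.DictReader(io.StringIO(raw)))


def transform(rows):
    """Transform: normalize plan names and keep only paid users."""
    return [(r["user_id"], r["signup_date"], r["plan"].upper())
            for r in rows if r["plan"] != "free"]


def load(rows, conn):
    """Load: write the prepared rows to the target table."""
    conn.execute(
        "CREATE TABLE paid_users (user_id TEXT, signup_date TEXT, plan TEXT)")
    conn.executemany("INSERT INTO paid_users VALUES (?, ?, ?)", rows)
    conn.commit()


conn = sqlite3.connect(":memory:")
load(transform(extract(SOURCE)), conn)
count = conn.execute("SELECT COUNT(*) FROM paid_users").fetchone()[0]
print(count)  # 2
```

In a production Hadoop/Hive setting the extract would read from a repository, the transform would run in Spark, and the load would target Hive or another warehouse table, but the three-stage contract is the same.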

Posted 3 months ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies