3.0 - 6.0 years
13 - 18 Lacs
Bengaluru
Work from Office
We are looking to hire a Data Engineer for the Platform Engineering team. The team is a collection of highly skilled individuals, ranging from development to operations, with a security-first mindset, who strive to push the boundaries of technology. We champion a DevSecOps culture and raise the bar on how and when we deploy applications to production. Our core principles are centered around automation, testing, quality, and immutability, all via code. The role is responsible for building self-service capabilities that improve our security posture and productivity and reduce time to market, with automation at the core of these objectives. The individual collaborates with teams across the organization to ensure applications are designed for Continuous Delivery (CD) and are well-architected for their target platform, which can be on-premises or in the cloud. If you are passionate about developer productivity, cloud-native applications, and container orchestration, this job is for you!
Principal Accountabilities:
- The incumbent is mentored by senior individuals on the team to capture the flow and bottlenecks in the holistic IT delivery process and define future tool sets.
Skills and Software Requirements:
- Experience with a language such as Python, Go, SQL, Java, or Scala
- GCP data services (BigQuery, Dataflow, Dataproc, Cloud Composer, Pub/Sub, Google Cloud Storage, IAM)
- Experience with Jenkins, Maven, Git, Ansible, or Chef
- Experience working with containers, orchestration tools (such as Kubernetes, Mesos, Docker Swarm) and container registries (GCE, Docker Hub, etc.)
- Experience with [SPI]aaS: Software-as-a-Service, Platform-as-a-Service, or Infrastructure-as-a-Service
- Acquire, cleanse, and ingest structured and unstructured data on the cloud (see the sketch below)
- Combine data from disparate sources into a single, unified, authoritative view of data (e.g., a Data Lake)
- Enable and support data movement from one system or service to another
- Experience implementing or supporting automated solutions to technical problems
- Experience working in a team environment, proactively executing on tasks while meeting agreed delivery timelines
- Ability to contribute to effective and timely solutions
- Excellent oral and written communication skills
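To make the cloud data-ingestion requirement above concrete, here is a minimal, hedged sketch (not part of the original posting) that batch-loads CSV files from Google Cloud Storage into BigQuery with the Python client library; the project, bucket, dataset, and table names are invented placeholders.

```python
# Minimal sketch: batch-load CSV files from GCS into a BigQuery table.
# All resource names below are hypothetical placeholders.
from google.cloud import bigquery

client = bigquery.Client()  # relies on Application Default Credentials

table_id = "example-project.analytics.events"           # placeholder project.dataset.table
source_uri = "gs://example-ingest-bucket/events/*.csv"  # placeholder GCS path

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    skip_leading_rows=1,      # skip the header row
    autodetect=True,          # infer the schema from the files
    write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
)

load_job = client.load_table_from_uri(source_uri, table_id, job_config=job_config)
load_job.result()  # block until the load job finishes

print(f"Loaded {client.get_table(table_id).num_rows} rows into {table_id}")
```

In a production setup, schema autodetection would typically be replaced with an explicit schema and the job wrapped in orchestration (for example Cloud Composer), but that is beyond this sketch.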
Posted 2 weeks ago
8.0 - 10.0 years
32 - 35 Lacs
Hyderabad
Work from Office
Position Summary
MetLife established a Global Capability Center (MGCC) in India to scale and mature Data & Analytics and technology capabilities in a cost-effective manner and make MetLife future ready. The center is integral to Global Technology and Operations, with a focus to protect and build MetLife IP, promote reusability, and drive experimentation and innovation. The Data & Analytics team in India mirrors the Global D&A team with an objective to drive business value through trusted data, scaled capabilities, and actionable insights.
Role Value Proposition
MetLife Global Capability Center (MGCC) is looking for a Senior Cloud Data Engineer responsible for building ETL/ELT, data warehousing, and reusable components using Azure, Databricks, and Spark. He/she will collaborate with business systems analysts, technical leads, project managers, and business/operations teams in building data enablement solutions across different LOBs and use cases.
Job Responsibilities
- Collect, store, process, and analyze large datasets to build and implement extract, transform, load (ETL) processes (illustrated in the sketch below)
- Develop metadata- and configuration-based reusable frameworks to reduce development effort
- Develop quality code with integral performance optimizations in place right at the development stage
- Collaborate with the global team in driving the delivery of projects and recommend development and performance improvements
- Extensive experience with various database types and the knowledge to leverage the right one for the need
- Strong understanding of data tools and the ability to leverage them to understand the data and generate insights
- Hands-on experience in building/designing at-scale data lakes, data warehouses, and data stores for analytics consumption, on-premises and in the cloud (real-time as well as batch use cases)
- Ability to interact with business analysts and functional analysts in gathering requirements and implementing ETL solutions
Education, Technical Skills & Other Critical Requirements
Education: Bachelor's degree in computer science, engineering, or a related discipline
Experience (in years): 8 to 10 years of working experience on Azure Cloud using Databricks or Synapse
Technical Skills:
- Experience in transforming data using Python, Spark, or Scala
- Technical depth in Cloud Architecture Framework, Lakehouse Architecture, and OneLake solutions
- Experience in implementing data ingestion and curation processes on Azure with tools such as Azure Data Factory, Databricks Workflows, Azure Synapse, Cosmos DB, Spark (Scala/Python), and Databricks
- Experience with cloud-optimized code on Azure using Databricks, Synapse dedicated SQL pools and serverless pools, and Cosmos DB SQL APIs, including loading and consumption optimizations
- Scripting experience, primarily in shell/bash/PowerShell, would be desirable
- Experience in writing SQL and performing data analysis for data anomaly detection and data quality assurance
Other Preferred Skills:
- Expertise in Python and experience writing Azure Functions using Python/Node.js
- Experience using Event Hub for data integrations
- Required working knowledge of Azure DevOps pipelines
- Self-starter with the ability to adapt to changing business needs
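As a hedged illustration of the ETL responsibilities listed above, the following is a small PySpark sketch of a configuration-driven ingest step on Databricks writing to a Delta table; the storage paths, table name, and the "policy_id" business key are assumptions for illustration, not details from the posting.

```python
# Illustrative Databricks/PySpark ingest step driven by a small config dict.
# Paths, table names, and the "policy_id" key are hypothetical placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()  # supplied by the Databricks runtime

config = {
    "source_path": "abfss://raw@examplelake.dfs.core.windows.net/policies/",
    "target_table": "curated.policies",
}

df = (
    spark.read.format("parquet")
    .load(config["source_path"])
    .withColumn("ingest_ts", F.current_timestamp())  # simple audit column
    .dropDuplicates(["policy_id"])                   # assumed business key
)

df.write.format("delta").mode("append").saveAsTable(config["target_table"])
```

A metadata-driven framework of the kind the posting describes would typically read many such config entries from a control table and loop over them, adding validation and logging around this core step.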
Posted 2 weeks ago
10.0 - 14.0 years
0 Lacs
Thiruvananthapuram, Kerala
On-site
You should have a B.Tech/B.E/MSc/MCA qualification along with a minimum of 10 years of experience. As a Software Architect - Cloud, your responsibilities will include architecting and implementing AI-driven Cloud/SaaS offerings. You will be required to research and design new frameworks and features for various products, ensuring they meet high-quality standards and are designed for scale, resiliency, and efficiency. Additionally, motivating and assisting lead and senior developers in their professional and technical growth, contributing to academic outreach programs, and participating in company branding activities will be part of your role. To qualify for this position, you must have experience in designing and delivering widely used enterprise-class SaaS applications, preferably in Marketing technologies. Essential requirements for this role include knowledge of cloud computing infrastructure, AWS certification, and hands-on experience with scalable distributed systems, AI/ML technologies, big data technologies, in-memory databases, caching systems, ETL tools, containerization solutions like Kubernetes, large-scale RDBMS deployments, SQL optimization, Agile and Scrum development processes, Java, Spring technologies, Git, and DevOps practices.
Posted 2 weeks ago
0.0 - 4.0 years
0 Lacs
Karnataka
On-site
We are looking for someone who is enthusiastic to contribute to the implementation of a metadata-driven platform managing the full lifecycle of batch and streaming Big Data pipelines. This role involves applying ML and AI techniques in data management, such as anomaly detection for identifying and resolving data quality issues, and data discovery. The platform facilitates the delivery of Visa's core data assets to both internal and external customers. You will provide Platform-as-a-Service offerings that are easy to consume, scalable, secure, and reliable, using open-source-based cloud solutions for Big Data technologies. Working at the intersection of infrastructure and software engineering, you will design and deploy data and pipeline management frameworks using open-source components like Hadoop, Hive, Spark, HBase, Kafka streaming, and other Big Data technologies. Collaboration with various teams is essential to build and maintain innovative, reliable, secure, and cost-effective distributed solutions. Facilitating knowledge transfer to the Engineering and Operations teams, you will work on technical challenges and process improvements with geographically distributed teams. Your responsibilities will include designing and implementing agile, innovative data pipeline and workflow management solutions that leverage technology advances for cost reduction, standardization, and commoditization. Driving the adoption of open-standard toolsets to reduce complexity and support operational goals for increasing automation across the enterprise is a key aspect of this role. As a champion for the adoption of open infrastructure solutions that are fit for purpose, you will keep technology relevant. The role involves spending 80% of the time writing code in different languages, frameworks, and technology stacks.
At Visa, your uniqueness is valued. Working here provides an opportunity to make a global impact, invest in your career growth, and be part of an inclusive and diverse workplace. Join our global team of disruptors, trailblazers, innovators, and risk-takers who are driving economic growth worldwide, moving the industry forward creatively, and engaging in meaningful work that brings financial literacy and digital commerce to millions of unbanked and underserved consumers. This position is hybrid, and the expectation of days in the office will be confirmed by your hiring manager.
**Basic Qualifications**:
- Minimum of 6 months of work experience or a bachelor's degree
- Bachelor's degree in Computer Science, Computer Engineering, or a related field
- Good understanding of data structures and algorithms
- Good analytical and problem-solving skills
**Preferred Qualifications**:
- 1 or more years of work experience or an advanced degree (e.g., Master's) in Computer Science
- Excellent programming skills with experience in at least one of the following: Python, Node.js, Java, Scala, GoLang
- MVC (model-view-controller) for end-to-end development
- Knowledge of SQL/NoSQL technology; familiarity with databases like Oracle, DB2, SQL Server, etc.
- Proficiency in Unix-based operating systems and bash scripts
- Strong communication skills, including clear and concise written and spoken communications with professional judgment
- Team player with excellent interpersonal skills
- Demonstrated ability to lead and navigate through ambiguity
Posted 2 weeks ago
5.0 - 9.0 years
0 Lacs
Pune, Maharashtra
On-site
The Applications Development Senior Programmer Analyst position is an intermediate-level role where you will be responsible for participating in the establishment and implementation of new or revised application systems and programs in coordination with the Technology team. Your primary objective will be to contribute to applications systems analysis and programming activities. Your responsibilities will include conducting tasks related to feasibility studies, time and cost estimates, IT planning, risk technology, applications development, model development, and establishing and implementing new or revised applications systems and programs to meet specific business needs or user areas. You will be responsible for monitoring and controlling all phases of the development process, including analysis, design, construction, testing, and implementation. Additionally, you will provide user and operational support on applications to business users. You will utilize in-depth specialty knowledge of applications development to analyze complex problems/issues, evaluate business processes, system processes, and industry standards, and make evaluative judgments. Furthermore, you will recommend and develop security measures in post-implementation analysis of business usage to ensure successful system design and functionality. You will also consult with users/clients and other technology groups on issues, recommend advanced programming solutions, and install and assist customer exposure systems. As the Applications Development Senior Programmer Analyst, you will ensure that essential procedures are followed, help define operating standards and processes, and serve as an advisor or coach to new or lower-level analysts. You will have the ability to operate with a limited level of direct supervision, exercise independence of judgment and autonomy, and act as a subject matter expert to senior stakeholders and/or other team members. In this role, you will appropriately assess risk when business decisions are made, demonstrate particular consideration for the firm's reputation, and safeguard Citigroup, its clients, and assets by driving compliance with applicable laws, rules, and regulations. You will be required to have strong analytical and communication skills and must be results-oriented, willing, and able to take ownership of engagements. Additionally, experience in the banking domain is a must.
Qualifications:
Must Have:
- 8+ years of application/software development/maintenance
- 5+ years of experience with Big Data technologies like Apache Spark, Hive, and Hadoop
- Knowledge of the Python, Java, or Scala programming language
- Experience with Java, web services, XML, JavaScript, microservices, SOA, etc.
- Strong technical knowledge of Apache Spark, Hive, SQL, and the Hadoop ecosystem
- Ability to work independently, multi-task, and take ownership of various analyses or reviews
Good to Have:
- Work experience in Citi or Regulatory Reporting applications
- Hands-on experience with cloud technologies, AI/ML integration, and creation of data pipelines
- Experience with vendor products like Tableau, Arcadia, Paxata, KNIME
- Experience with API development and use of data formats
Education:
- Bachelor's degree/University degree or equivalent experience
This is a high-level overview of the job responsibilities and qualifications. Other job-related duties may be assigned as required.
Posted 2 weeks ago
5.0 - 9.0 years
0 Lacs
Maharashtra
On-site
The role of a Data Engineer is crucial for ensuring the smooth operation of the Data Platform in Azure/AWS Databricks. As a Data Engineer, you will be responsible for the continuous development, enhancement, support, and maintenance of data availability, data quality, performance, and stability of the system. Your primary responsibilities will include designing and implementing data ingestion pipelines from various sources using Azure Databricks, ensuring the efficient and smooth running of data pipelines, and adhering to security, regulatory, and audit control guidelines. You will also be tasked with driving optimization, continuous improvement, and efficiency in data processes. To excel in this role, it is essential to have a minimum of 5 years of experience in the data analytics field, hands-on experience with Azure/AWS Databricks, proficiency in building and optimizing data pipelines, architectures, and data sets, and excellent skills in Scala or Python, PySpark, and SQL. Additionally, you should be capable of troubleshooting and optimizing complex queries on the Spark platform, possess knowledge of structured and unstructured data design/modelling, data access, and data storage techniques, and have expertise in designing and deploying data applications on cloud solutions such as Azure or AWS. Moreover, practical experience in performance tuning and optimizing code running in a Databricks environment, along with demonstrated analytical and problem-solving skills, particularly in a big data environment, is essential for success in this role. In terms of technical/professional skills, proficiency in Azure/AWS Databricks, Python/Scala/Spark/PySpark, Hive/HBase/Impala/Parquet, Sqoop, Kafka, Flume, SQL, RDBMS, Airflow, Jenkins/Bamboo, GitHub/Bitbucket, and Nexus will be advantageous for executing the responsibilities effectively.
Posted 2 weeks ago
4.0 - 7.0 years
6 - 9 Lacs
Noida, India
Work from Office
- Spark/PySpark: hands-on technical experience in data processing
- Table design knowledge using Hive (similar to RDBMS knowledge)
- Database SQL knowledge for retrieval of data and transformation queries such as joins (full, left, right), ranking, and group by (illustrated in the sketch below)
- Good communication skills
- Additional skills: GitHub, Jenkins, and shell scripting would be an added advantage
Mandatory Competencies:
- Big Data - Big Data - Hadoop
- Big Data - Big Data - SPARK
- Big Data - Big Data - Pyspark
- DevOps/Configuration Mgmt - DevOps/Configuration Mgmt - Jenkins
- Beh - Communication and collaboration
- Database - Database Programming - SQL
- DevOps/Configuration Mgmt - DevOps/Configuration Mgmt - GitLab, Github, Bitbucket
- DevOps/Configuration Mgmt - DevOps/Configuration Mgmt - Basic Bash/Shell script writing
- Big Data - Big Data - HIVE
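As a hedged illustration of the SQL transformations named above (joins, ranking, group by) executed through Spark over Hive tables, consider the sketch below; the customers and orders tables and their columns are hypothetical, not part of the posting.

```python
# Sketch: left join, group-by aggregation, and a ranking window in Spark SQL.
# The "customers"/"orders" Hive tables and their columns are placeholders.
from pyspark.sql import SparkSession

spark = SparkSession.builder.enableHiveSupport().getOrCreate()

# Aggregate order amounts per customer with a left join and group by.
totals = spark.sql("""
    SELECT c.customer_id, c.region, SUM(o.amount) AS total_amount
    FROM customers c
    LEFT JOIN orders o ON o.customer_id = c.customer_id
    GROUP BY c.customer_id, c.region
""")
totals.createOrReplaceTempView("customer_totals")

# Rank customers within each region by their aggregated spend.
ranked = spark.sql("""
    SELECT customer_id, region, total_amount,
           RANK() OVER (PARTITION BY region ORDER BY total_amount DESC) AS region_rank
    FROM customer_totals
""")
ranked.show()
```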
Posted 2 weeks ago
9.0 - 13.0 years
50 - 55 Lacs
Bengaluru
Work from Office
Position Summary... What you'll do...
About Team: This role will be focused on Marketplace Risk & Fraud Engineering.
What you'll do:
- Understand business problems and suggest technology solutions.
- Architect, design, build, and deploy technology solutions at scale.
- Raise the bar on sustainable engineering by improving best practices and producing best-in-class code, documentation, testing, and monitoring.
- Estimate effort, identify risks, and plan execution.
- Mentor/coach other engineers in the team to facilitate their development and to provide technical leadership to them.
- Rise above details as and when needed to spot broader issues/trends and implications for the product/team as a whole.
What you'll bring:
- 10+ years of experience in the design and development of highly scalable applications in product-based companies or R&D divisions.
- Strong computer systems fundamentals, DS/Algorithms, and problem-solving skills.
- 5+ years of experience building microservices using Java.
- Strong experience with SQL/NoSQL and database technologies (MySQL, MongoDB, HBase, Cassandra, Oracle, PostgreSQL).
- Experience in systems design and distributed systems.
- Large-scale distributed services experience, including scalability and fault tolerance.
- Excellent organisation, communication, and interpersonal skills.
About Walmart Global Tech
Imagine working in an environment where one line of code can make life easier for hundreds of millions of people. That's what we do at Walmart Global Tech. We're a team of software engineers, data scientists, cybersecurity experts, and service professionals within the world's leading retailer who make an epic impact and are at the forefront of the next retail disruption. People are why we innovate, and people power our innovations. We are people-led and tech-empowered.
Flexible, hybrid work
Benefits
Beyond our great compensation package, you can receive incentive awards for your performance. Other great perks include a host of best-in-class benefits: maternity and parental leave, PTO, health benefits, and much more.
Belonging
We aim to create a culture where every associate feels valued for who they are, rooted in respect for the individual. Our goal is to foster a sense of belonging, to create opportunities for all our associates, customers, and suppliers, and to be a Walmart for everyone. At Walmart, our vision is "everyone included." By fostering a workplace culture where everyone is and feels included, everyone wins. Our associates and customers reflect the makeup of all 19 countries where we operate. By making Walmart a welcoming place where all people feel like they belong, we're able to engage associates, strengthen our business, improve our ability to serve customers, and support the communities where we operate.
Equal opportunity employer
Walmart, Inc., is an equal opportunities employer - by choice. We believe we are best equipped to help our associates, customers, and the communities we serve live better when we really know them. That means understanding, respecting, and valuing unique styles, experiences, identities, ideas, and opinions - while being inclusive of all people.
Minimum Qualifications...
Minimum Qualifications: Option 1: Bachelor's degree in computer science, computer engineering, computer information systems, software engineering, or related area and 4 years' experience in software engineering or related area. Option 2: 6 years' experience in software engineering or related area.
Preferred Qualifications...
Master's degree in Computer Science, Computer Engineering, Computer Information Systems, Software Engineering, or related area and 2 years' experience in software engineering or related area.
Primary Location...
Posted 2 weeks ago
7.0 - 12.0 years
11 - 16 Lacs
Bengaluru
Work from Office
As a Fortune 50 company with more than 400,000 team members worldwide, Target is an iconic brand and one of America's leading retailers. Joining Target means promoting a culture of mutual care and respect and striving to make the most meaningful and positive impact. Becoming a Target team member means joining a community that values different voices and lifts each other up. Here, we believe your unique perspective is important, and you'll build relationships by being authentic and respectful.
Overview about TII:
At Target, we have a timeless purpose and a proven strategy. And that hasn't happened by accident. Some of the best minds from different backgrounds come together at Target to redefine retail in an inclusive learning environment that values people and delivers world-class outcomes. That winning formula is especially apparent in Bengaluru, where Target in India operates as a fully integrated part of Target's global team and has more than 4,000 team members supporting the company's global strategy and operations.
Target as a tech company? Absolutely. We're the behind-the-scenes powerhouse that fuels Target's passion and commitment to cutting-edge innovation. We anchor every facet of one of the world's best-loved retailers with a strong technology framework that relies on the latest tools and technologies, and the brightest people, to deliver incredible value to guests online and in stores. Target Technology Services is on a mission to offer the systems, tools, and support that guests and team members need and deserve. Our high-performing teams balance independence with collaboration, and we pride ourselves on being versatile, agile, and creative. We drive industry-leading technologies in support of every angle of the business, and help ensure that Target operates smoothly, securely, and reliably from the inside out.
Role overview
As a Lead Engineer, you serve as the technical anchor for the engineering team that supports a product. You create, own, and are responsible for the application architecture that best serves the product in its functional and non-functional needs. You identify and drive architectural changes to accelerate feature development or improve the quality of service (or both). You have deep and broad engineering skills and are capable of standing up an architecture in its whole on your own, but you choose to influence a wider team by acting as a force multiplier. Core responsibilities of this job are described within this job description. Job duties may change at any time due to business needs. Use your skills, experience, and talents to be a part of groundbreaking thinking and visionary goals.
As a Lead Engineer, you'll take the lead as you:
- Use your technology acumen to apply and maintain knowledge of current and emerging technologies within specialized area(s) of the technology domain.
- Evaluate new technologies and participate in decision-making, accounting for several factors such as viability within Target's technical environment, maintainability, and cost of ownership.
- Initiate and execute research and proof-of-concept activities for new technologies.
- Lead or set strategy for testing and debugging at the platform or enterprise level.
- In complex and unstructured situations, serve as an expert resource to create and improve standards and best practices to ensure high-performance, scalable, repeatable, and secure deliverables.
- Lead the design, lifecycle management, and total cost of ownership of services.
- Provide the team with thought leadership to promote re-use and develop consistent, scalable patterns.
- Participate in planning services that have enterprise impact.
- Provide suggestions for handling routine and moderately complex technical problems, escalating issues when appropriate.
- Gather information, data, and input from a wide variety of sources; identify additional resources when appropriate, engage with appropriate stakeholders, and conduct in-depth analysis of information.
- Develop plans and schedules, estimate resource requirements, and define milestones and deliverables.
- Monitor workflow and risks; play a leadership role in mitigating risks and removing obstacles.
- Lead and participate in complex construction, automation, and implementation activities, ensuring successful implementation with architectural and operational requirements met.
- Establish new standards and best practices to monitor, test, automate, and maintain IT components or systems.
- Serve as an expert resource in disaster recovery and disaster recovery planning.
- Stay current with Target's technical capabilities, infrastructure, and technical environment.
- Develop fully attributed data models, including logical, physical, and canonical.
- Influence data standards, policies, and procedures.
- Install, configure, and/or tune data management solutions with minimal guidance.
- Monitor data management solution(s) and identify optimization opportunities.
About you:
- Bachelor's degree (or equivalent experience) in Computer Science, Engineering, or a related field.
- 7+ years of hands-on software development experience, including at least one full-cycle project implementation.
- Expertise in Target's technology landscape, with a solid understanding of industry trends, competitors' products, and differentiating features.
- Proficient in Kotlin with advanced knowledge of microservices architecture and event-driven architectures.
- Strong experience with high-priority, large-scale applications capable of processing millions of records.
- Proven ability to design and implement highly scalable and observable systems.
- Working on mission-critical applications with large transaction volumes and high throughput.
- Building systems that are scalable, with a focus on performance and resilience.
- Leveraging cutting-edge tools for data correlation and pattern analysis.
- Experience with Scala, Hadoop, and other Big Data technologies is preferred.
- Strong retail domain knowledge with experience working on multi-channel platforms.
- Hands-on experience with high-performance messaging platforms that are highly scalable.
Useful Links:
- Life at Target: https://india.target.com/
- Benefits: https://india.target.com/life-at-target/workplace/benefits
- Culture: https://india.target.com/life-at-target/belonging
Posted 2 weeks ago
3.0 - 5.0 years
14 - 18 Lacs
Kochi
Work from Office
As a Data Engineer, you will develop, maintain, evaluate, and test big data solutions. You will be involved in the development of data solutions using the Spark framework with Python or Scala on Hadoop and the Azure Cloud Data Platform.
Responsibilities:
- Experienced in building data pipelines to ingest, process, and transform data from files, streams, and databases
- Process the data with Spark, Python, PySpark, and Hive, HBase, or other NoSQL databases on the Azure Cloud Data Platform or HDFS
- Experienced in developing efficient software code for multiple use cases leveraging the Spark framework with Python or Scala and Big Data technologies for various use cases built on the platform
- Experience in developing streaming pipelines
- Experience working with Hadoop/Azure ecosystem components to implement scalable solutions that meet ever-increasing data volumes, using big data/cloud technologies such as Apache Spark, Kafka, and cloud computing
Required education: Bachelor's Degree
Preferred education: Master's Degree
Required technical and professional expertise:
- Total 3-5+ years of experience in Data Management (DW, DL, Data Platform, Lakehouse) and Data Engineering skills
- Minimum 4+ years of experience in Big Data technologies with extensive data engineering experience in Spark with Python or Scala
- Minimum 3 years of experience on Cloud Data Platforms on Azure
- Experience in Databricks / Azure HDInsight / Azure Data Factory, Synapse, SQL Server DB
- Exposure to streaming solutions and message brokers like Kafka
- Experience with Unix/Linux commands and basic working experience in shell scripting
Preferred technical and professional experience:
- Certification in Azure and Databricks, or Cloudera Spark certification
Posted 2 weeks ago
4.0 - 12.0 years
0 Lacs
Karnataka
On-site
As a Big Data Lead with 7-12 years of experience, you will be responsible for leading the development of data processing systems and applications, specifically in the areas of Data Warehousing (DWH). Your role will involve utilizing your strong software development skills in multiple computing languages, with a focus on distributed data processing systems and BIDW programs. You should have a minimum of 4 years of software development experience and a proven track record in developing and testing applications, preferably on the J2EE stack. A sound understanding of best practices and concepts related to Data Warehouse applications is crucial for this role. Additionally, you should possess a strong foundation in distributed systems and computing systems, with hands-on experience in Spark & Scala, Kafka, Hadoop, HBase, Pig, and Hive. Experience with NoSQL data stores, data modeling, and data management will be beneficial for this role. Strong interpersonal communication skills are essential, along with excellent oral and written communication abilities. Knowledge of Data Lake implementation as an alternative to Data Warehousing is desirable. Hands-on experience with Spark SQL and SQL proficiency are mandatory requirements for this role. You should have a minimum of 2 end-to-end implementations in either Data Warehousing or Data Lake projects. Your role as a Big Data Lead will involve collaborating with cross-functional teams and driving data-related initiatives to meet business objectives effectively.
Posted 2 weeks ago
5.0 - 9.0 years
0 Lacs
Karnataka
On-site
You are a Senior Software Engineer at Elevance Health, a prominent health company in America dedicated to enhancing lives and simplifying healthcare. Elevance Health is the largest managed healthcare company in the Blue Cross Blue Shield (BCBS) Association, serving over 45 million lives across 14 states. This Fortune 500 company is currently ranked 20th and led by Gail Boudreaux, a prominent figure in the Fortune list of most powerful women. Your role will be within Carelon Global Solutions (CGS), a subsidiary of Elevance Health, focused on simplifying complex operational processes in the healthcare system. CGS brings together a global team of innovators across various locations, including Bengaluru and Gurugram in India, to optimize healthcare operations effectively and efficiently. As a Senior Software Engineer, your primary responsibility involves collaborating with data architects to implement data models and ensure seamless integration with AWS services. You will be responsible for supporting, monitoring, and resolving production issues to meet SLAs, being available 24/7 for business application support. You should have hands-on experience with technologies like Snowflake, Python, AWS S3, Athena, RDS, CloudWatch, Lambda, and more. Your expertise should include handling nested JSON files, analyzing daily loads/issues, working closely with admin/architect teams, and understanding complex job and data flows in the project. To qualify for this role, you need a Bachelor's degree in Information Technology/Data Engineering or equivalent education and experience, along with 5-8 years of overall IT experience and 2-9 years in AWS services. Experience in agile development processes is preferred. You are expected to have skills in Snowflake, AWS services, complex SQL queries, and technologies like Hadoop, Kafka, HBase, Sqoop, and Scala. Your ability to analyze, research, and solve technical problems will be crucial for success in this role. Carelon promises limitless opportunities for its associates, emphasizing growth, well-being, purpose, and belonging. With a focus on learning and development, an innovative culture, and comprehensive rewards, Carelon offers a supportive environment for personal and professional growth. Carelon is an equal opportunity employer that values diversity and inclusivity. If you require accommodations due to a disability, you can request the Reasonable Accommodation Request Form. This is a full-time position that offers a competitive benefits package and a conducive work environment.
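For context on the AWS S3/Athena skills mentioned above, here is a hedged boto3 sketch (not from the posting) that submits an Athena query over S3-backed data and polls for completion; the region, database, table, and result bucket are invented placeholders.

```python
# Sketch: run an Athena query with boto3 and wait for it to finish.
# Region, database, table, and S3 output location are hypothetical.
import time
import boto3

athena = boto3.client("athena", region_name="us-east-1")  # assumed region

response = athena.start_query_execution(
    QueryString="SELECT claim_id, status FROM claims WHERE load_date = DATE '2024-01-01'",
    QueryExecutionContext={"Database": "analytics_db"},
    ResultConfiguration={"OutputLocation": "s3://example-athena-results/"},
)
query_id = response["QueryExecutionId"]

# Simple polling loop; production code would add timeouts and error handling.
while True:
    status = athena.get_query_execution(QueryExecutionId=query_id)
    state = status["QueryExecution"]["Status"]["State"]
    if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
        break
    time.sleep(2)

print(f"Query {query_id} finished with state {state}")
```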
Posted 2 weeks ago
7.0 - 11.0 years
0 Lacs
Haryana
On-site
About Prospecta
Founded in 2002 in Sydney, Australia, with additional offices in India, North America, Canada, and a local presence in Europe, the UK, and Southeast Asia, Prospecta is dedicated to providing top-tier data management and automation software for enterprise clients. Our journey began with a mission to offer innovative solutions, leading us to become a prominent data management software company over the years. Our flagship product, MDO (Master Data Online), is an enterprise Master Data Management (MDM) platform designed to streamline data management processes, ensuring accurate, compliant, and relevant master data creation, as well as efficient data disposal. With a strong presence in asset-intensive industries such as Energy and Utilities, Oil and Gas, Mining, Infrastructure, and Manufacturing, we have established ourselves as a trusted partner in the field.
Culture at Prospecta
At Prospecta, our culture is centered around growth and embracing new challenges. We boast a passionate team that collaborates seamlessly to deliver value to our customers. Our diverse backgrounds create an exciting work environment that fosters a rich tapestry of perspectives and ideas. We are committed to nurturing an environment that focuses on both professional and personal development. Career progression at Prospecta is not just about climbing the corporate ladder but about encountering a continuous stream of meaningful opportunities that enhance personal growth and technical proficiency, all under the guidance of exceptional leaders. Our organizational structure emphasizes agility, responsiveness, and achieving tangible outcomes. If you thrive in a dynamic environment, enjoy taking on various roles, and are willing to go the extra mile to achieve goals, Prospecta is the ideal workplace for you. We continuously push boundaries while maintaining a sense of fun and celebrating victories, both big and small.
About the Job
Position: Jr. Platform Architect / Sr. Backend Developer
Location: Gurgaon
Role Summary: In this role, you will be responsible for implementing technology solutions in a cost-effective manner by understanding project requirements and effectively communicating them to all stakeholders and facilitators.
Key Responsibilities
- Collaborate with enterprise architects, data architects, developers & engineers, data scientists, and information designers to identify and define necessary data structures, formats, pipelines, metadata, and workload orchestration capabilities.
- Possess expertise in service architecture, development, and ensuring high performance and scalability.
- Demonstrate experience in Spark, Elastic Search, SQL performance tuning, and optimization.
- Showcase proficiency in architectural design and development of large-scale data platforms and data applications.
- Hands-on experience with AWS, Azure, and OpenShift.
- Deep understanding of Spark and its internal architecture.
- Expertise in designing and building new cloud data platforms and optimizing them at the organizational level.
- Strong hands-on experience in Big Data technologies such as Hadoop, Sqoop, Hive, and Spark, including DevOps.
- Solid SQL (Hive/Spark) skills and experience in tuning complex queries.
Must-Have
- 7+ years of experience.
- Proficiency in Java, Spring Boot, Apache Spark, AWS, OpenShift, PostgreSQL, Elastic Search, message queues, microservice architecture, and Spark.
Nice-to-Have
- Knowledge of Angular, Python, Scala, Azure, Kafka, file formats like Parquet, AVRO, CSV, and JSON, as well as Hadoop, Hive, and HBase.
What will you get
Growth Path: At Prospecta, your career journey is filled with growth and opportunities. Depending on your career trajectory, you can kickstart your career or accelerate your professional development in a dynamic work environment. Your success is our priority, and as you exhibit your abilities and achieve results, you will have the opportunity to quickly progress into leadership roles. We are dedicated to helping you enhance your experience and skills, providing you with the necessary tools, support, and opportunities to reach new heights in your career.
Benefits
- Competitive salary.
- Health insurance.
- Paid time off and holidays.
- Continuous learning and career progression.
- Opportunities to work onsite at various office locations and/or client sites.
- Participation in annual company events and workshops.
Posted 3 weeks ago
6.0 - 10.0 years
8 - 12 Lacs
Bengaluru
Work from Office
Authorize.net makes it simple to accept electronic and credit card payments in person, online, or over the phone. We've been working with merchants and small businesses since 1996. As a leading payment gateway, Authorize.net is trusted by more than 445,000 merchants, handling more than 1 billion transactions and USD 149 billion in payments every year.
As a Senior Staff Software Engineer at Authorize.net (a Visa solution), you will be a hands-on technical leader who will guide the development of major new features by translating complex business problems into technical solutions that resonate with our merchants and partners. You will also drive cross-team projects that standardize our approach to API development and data schemas, ensuring consistent implementation of best practices across the organization. Beyond features, you will also work on modernization, working across multiple teams to modernize our systems and deliver innovative online payment solutions. You will be instrumental in containerizing applications, splitting monolithic codebases into microservices, and migrating on-premises workloads to the cloud. In addition, you will enable process improvements through robust DevOps practices, incorporating comprehensive release management strategies and optimized CI/CD pipelines. Collaborating with product managers, tech leads, and engineering teams, you will define technology roadmaps, communicate architectural decisions, and mentor engineers in advanced technical approaches. This position requires a solid track record of delivering large-scale, reliable, and secure software solutions. While we prefer C# expertise, knowledge of other modern programming languages is also welcome.
This is a hybrid position. Expectation of days in office will be confirmed by your Hiring Manager.
- 15+ years of relevant work experience with a Bachelor's Degree or with an Advanced degree.
- Advanced-level coding skills in C#, .NET Core, ASP.NET. Java experience is a plus.
- Solid experience working with databases, relational/NoSQL, along with writing and optimizing SQL queries.
- Proficiency developing unit and automation scripts using JUnit, Karate, etc.
- Proficiency working with message queuing systems like IBM WebSphere MQ, Kafka, etc.
- Familiarity with continuous delivery and DevOps practices, including infrastructure automation, monitoring, logging, auditing, and security.
- Hands-on knowledge of cloud platforms (e.g., AWS, Azure, or GCP) for scalable deployment strategies.
- Understanding of integration patterns, API design, and schema standardization for enterprise systems.
- Prior exposure to NoSQL data stores (e.g., HBase, Cassandra) is beneficial.
- Experience with merchant data or payment technology is a plus.
- Excellent communication skills, with a proven ability to mentor and guide engineering teams at scale.
Essential Functions
- Drives technical direction for key cross-team and cross-product development projects (through solution design documents and hands-on coding of critical modules).
- Establishes software development standards and best practices by providing real-world examples and delivering production-ready code.
- Ensures alignment across all relevant project teams, promoting consistent technical principles, patterns, and standardized frameworks.
- Provides leadership in solution design discussions to ensure solutions align with platform principles and overarching standards.
- Mentors and builds world-class, high-performing engineering and data science teams.
- Applies solution design best practices to increase execution velocity.
- Develops solutions that are inherently secure, robust, scalable, modular, API-centric, and global.
- Influences technology choices and decisions that impact the enterprise.
- Defines deployment approaches in collaboration with peers across the technology organization.
- Optimizes the use of Visa's platforms and solutions.
- Demonstrates thought leadership through presentations and teaching, supporting broad technical knowledge sharing.
- Contributes to key technology initiatives across Visa and ANET.
Posted 3 weeks ago
3.0 - 5.0 years
5 - 7 Lacs
Bengaluru
Work from Office
B2 - 4.5 - 7 yrs PAN India CBR - 110K
Ansible (AAP) Tower: design, build, upgrades, administration, maintenance, and production support. Ansible certification preferred.
Primary Skills - Linux, Python, Ansible, DevOps tooling, Terraform, Puppet
- Strong Linux administration skills; certification preferred.
- Proficient in shell scripting, Python, and Ansible automation.
- Experience with monitoring tools (Grafana, Prometheus, etc.).
- Experience using REST APIs to integrate 3rd-party tools and provide seamless end-to-end automation.
- Develop and maintain Ansible playbooks for configuration management and automation.
- Implement and manage CI/CD pipelines to automate the building, scanning, testing, and deployment of applications, ensuring rapid and reliable delivery of software releases in adherence to DevSecOps principles.
- Manage source code repositories and version control using GitHub, including branch management, code reviews, and merge strategies.
- Experience using Jenkins, Git, shell scripts, Python, and Ansible for automation and build tasks.
- Robust understanding of security technologies (SSL/TLS, authentication and authorisation frameworks, directory services, violations and policies, ACLs).
- Experience with RDBMS (PostgreSQL, MariaDB, MySQL, Oracle, or similar) and NoSQL databases (HBase, MongoDB, Cassandra, Redis).
- Familiarity with test automation tools: JUnit, Selenium, Cucumber.
- Good technical design, problem-solving, and debugging skills.
- Agile proficient and knowledgeable/experienced in other agile methodologies; ideally certified.
Additional Skills
- Knowledge of Hadoop, Airflow, Kafka, and Data Flow technologies is an added advantage.
- Contribution to Apache open source or a public GitHub repository with a sizeable big data operations and application code base.
- Analytical approach to problem resolution.
- Enthusiastic about Big Data technologies; needs to be a proactive learner.
2. Design and execute software development and reporting
- Ensure the environment is ready for the execution process by designing test plans, developing test cases/scenarios/usage cases, and executing these cases.
- Development of technical specifications and plans and resolution of complex technical design issues.
- Participate in and conduct design activities with the development team relating to testing of the automation processes for both functional and non-functional requirements.
- Implement, track, and report key metrics to assure full coverage of functional and non-functional requirements through automation.
- Eliminate errors by owning the testing and validation of code.
- Track problems, resolutions, and bug fixes throughout the project and create a comprehensive database of defects and successful mitigation techniques.
- Provide resolutions to problems by taking the initiative to use all available resources for research.
- Design and implement automated testing tools when possible, and update tools as needed to ensure efficiency and accuracy.
- Develop and automate processes for software validation by setting up and designing test cases/scenarios/usage cases, and executing these cases.
- Develop programs that run efficiently and adhere to Wipro standards by using similar logic from existing applications, discussing best practices with team members, referencing textbooks and training manuals, documenting the code, and using accepted design patterns.
3. Ensuring smooth flow of communication with customer and internal stakeholders
- Work with Agile delivery teams to understand product vision and product backlogs; develop robust, scalable, and high-quality test automation tests for functional, regression, and performance testing.
- Assist in creating acceptance criteria for user stories and generate a test automation backlog.
- Collaborate with the development team to create/improve continuous deployment practices by developing strategies, formalizing processes, and providing tools.
- Work closely with business subject matter experts to understand requirements for automation, then design, build, and deploy the application using automation tools.
- Ensure long-term maintainability of the system by documenting projects according to Wipro guidelines.
- Ensure quality of communication by being clear and effective with test personnel, users, developers, and clients to facilitate quick resolution of problems and accurate documentation of successes.
- Provide assistance to testers and support personnel as needed to determine system problems.
- Ability to perform backend/database programming for key projects.
- Stay up to date on industry standards and incorporate them appropriately.
- Design and implement automated testing tools when possible, and update tools as needed to ensure efficiency and accuracy.
Mandatory Skills: Ansible Tower
Experience: 3-5 Years
Posted 3 weeks ago
2.0 - 5.0 years
4 - 7 Lacs
Bengaluru
Work from Office
- Research, design, develop, and modify computer vision and machine learning algorithms and models, leveraging experience with technologies such as Caffe, Torch, or TensorFlow
- Shape product strategy for highly contextualized applied ML/AI solutions by engaging with customers, solution teams, discovery workshops, and prototyping initiatives
- Help build a high-impact ML/AI team by supporting recruitment, training, and development of team members
- Serve as an evangelist by engaging in the broader ML/AI community through research, speaking/teaching, formal collaborations, and/or other channels
Knowledge & Abilities:
- Designing integrations of and tuning machine learning & computer vision algorithms
- Research and prototype techniques and algorithms for object detection and recognition
- Convolutional neural networks (CNN) for performing image classification and object detection
- Familiarity with embedded vision processing systems
- Open-source tools & platforms
- Statistical modeling, data extraction, and analysis
- Construct, train, evaluate, and tune neural networks
Mandatory Skills:
- One or more of the following: Java, C++, Python
- Deep learning frameworks such as Caffe, Torch, or TensorFlow, and an image/video vision library like OpenCV, Clarifai, Google Cloud Vision, etc.
- Supervised & unsupervised learning: developed feature learning, text mining, and prediction models (e.g., deep learning, collaborative filtering, SVM, and random forest) on a big data computation platform (Hadoop, Spark, Hive, and Tableau)
- One or more of the following: Tableau, Hadoop, Spark, HBase, Kafka
Experience:
- 2-5 years of work or educational experience in Machine Learning or Artificial Intelligence
- Creation and application of machine learning algorithms to a variety of real-world problems with large datasets
- Building scalable machine learning systems and data-driven products working with cross-functional teams
- Working with cloud services like AWS, Microsoft, IBM, and Google Cloud
- Working with one or more of the following: natural language processing, text understanding, classification, pattern recognition, recommendation systems, targeting systems, ranking systems, or similar
Nice to Have:
- Contribution to research communities and/or efforts, including publishing papers at conferences such as NIPS, ICML, ACL, CVPR, etc.
Education: BA/BS (advanced degree preferable) in Computer Science, Engineering, or a related technical field, or equivalent practical experience
Wipro is an Equal Employment Opportunity employer and makes all employment and employment-related decisions without regard to a person's race, sex, national origin, ancestry, disability, sexual orientation, or any other status protected by applicable law.
Mandatory Skills: Generative AI
Experience: 3-5 Years
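To ground the "Convolutional neural networks (CNN) for performing image classification" item above, here is a minimal, hedged TensorFlow/Keras sketch; the input shape and class count are arbitrary placeholders and no dataset from the posting is implied.

```python
# Minimal TensorFlow/Keras CNN for image classification (illustrative only).
# Input shape and number of classes are arbitrary placeholders.
import tensorflow as tf

num_classes = 10  # hypothetical label count

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(64, 64, 3)),
    tf.keras.layers.Conv2D(32, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Conv2D(64, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(128, activation="relu"),
    tf.keras.layers.Dense(num_classes, activation="softmax"),
])

model.compile(
    optimizer="adam",
    loss="sparse_categorical_crossentropy",
    metrics=["accuracy"],
)
model.summary()
# model.fit(train_ds, validation_data=val_ds, epochs=10)  # datasets not shown here
```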
Posted 3 weeks ago
7.0 - 12.0 years
11 - 15 Lacs
Gurugram
Work from Office
Project description
We are looking for an experienced Data Engineer to contribute to the design, development, and maintenance of our database systems. This role will work closely with our software development and IT teams to ensure the effective implementation and management of database solutions that align with the client's business objectives.
Responsibilities
The successful candidate will be responsible for managing technology in projects and providing technical guidance/solutions for work completion:
1. To be responsible for providing technical guidance/solutions
2. To ensure process compliance in the assigned module and participate in technical discussions/reviews
3. To prepare and submit status reports for minimizing exposure and risks on the project or closure of escalations
4. Being self-organized and focused on delivering quality software on time
Skills
Must have:
- At least 7 years of experience in development on data-specific projects
- Working knowledge of streaming data and the Kafka framework (kSQL, MirrorMaker, etc.)
- Strong programming skills in at least one of these programming languages: Groovy or Java
- Good knowledge of data structures, ETL design, and storage
- Must have worked in streaming data environments and pipelines
- Experience working in near-real-time/streaming data pipeline development using Apache Spark, StreamSets, Apache NiFi, or similar frameworks
Nice to have:
- N/A
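As a hedged sketch of the near-real-time Kafka pipeline work described above (the posting does not prescribe this exact approach), the snippet below reads a Kafka topic with Spark Structured Streaming, parses JSON events, and writes them to Parquet; the broker addresses, topic name, schema, and paths are invented.

```python
# Sketch: Spark Structured Streaming job consuming a Kafka topic.
# Requires the spark-sql-kafka connector; brokers, topic, schema, and paths
# below are hypothetical placeholders.
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.types import StructType, StructField, StringType, DoubleType, TimestampType

spark = SparkSession.builder.appName("kafka-stream-sketch").getOrCreate()

schema = StructType([
    StructField("event_id", StringType()),
    StructField("amount", DoubleType()),
    StructField("event_time", TimestampType()),
])

raw = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker1:9092")  # placeholder brokers
    .option("subscribe", "payments")                     # placeholder topic
    .option("startingOffsets", "latest")
    .load()
)

events = (
    raw.selectExpr("CAST(value AS STRING) AS json")
    .select(F.from_json("json", schema).alias("e"))
    .select("e.*")
)

query = (
    events.writeStream.format("parquet")
    .option("path", "/data/streams/payments")            # placeholder sink path
    .option("checkpointLocation", "/data/checkpoints/payments")
    .outputMode("append")
    .start()
)
query.awaitTermination()
```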
Posted 3 weeks ago
5.0 - 8.0 years
9 - 14 Lacs
Chennai
Work from Office
Who You'll Work With
CloudVision is Arista's enterprise network management and streaming telemetry SaaS offering, serving the world's largest Financials, Media and Entertainment, Health Care, and Cloud companies. As we continue to scale the service and expand into new markets, we're looking to grow the team with experienced Software Engineers anchored by our Bangalore and Pune team. CloudVision's core infrastructure is a scale-out distributed system providing real-time and historical access to the full network state, along with frameworks for building advanced analytics. It's written in Go and leverages open-source technologies like HBase, ClickHouse, ElasticSearch, and Kafka under the covers. We're constantly investing in scaling out the platform and building out richer analytics capabilities in the infrastructure. On top of this core platform we are building network management and analytics applications to fully automate today's enterprise network, from CI/CD pipelines for network automation, to advanced analytics and remediation for network assurance.
What You'll Do
As a backend software engineer at Arista, you own your project end to end. You and your project team will work with product management and customers to define the requirements and design the architecture. You'll build the backend, write automated tests, and get it deployed into production via our CD pipeline. As a senior member of the team you'll also be expected to help mentor and grow new team members. This role demands a strong and broad software engineering background, and you won't be limited to any single aspect of the product or development process.
- BS/MS degree in Computer Science and 8+ years of relevant experience.
- Strong knowledge of one or more programming languages (Go, Python, Java).
- Experience developing distributed systems or scale-out applications for a SaaS.
Posted 3 weeks ago
4.0 - 8.0 years
15 - 20 Lacs
Chennai
Work from Office
Who You'll Work With
CloudVision is Arista's enterprise network management and streaming telemetry SaaS offering, serving the world's largest Financials, Media and Entertainment, Health Care, and Cloud companies. As we continue to scale the service and expand into new markets, we're looking to grow the team with experienced Software Engineers anchored by our Bangalore and Pune team. CloudVision's core infrastructure is a scale-out distributed system providing real-time and historical access to the full network state, along with frameworks for building advanced analytics. It's written in Go and leverages open-source technologies like HBase, ClickHouse, ElasticSearch, and Kafka under the covers. We're constantly investing in scaling out the platform and building out richer analytics capabilities in the infrastructure. On top of this core platform we are building network management and analytics applications to fully automate today's enterprise network, from CI/CD pipelines for network automation, to advanced analytics and remediation for network assurance.
What You'll Do
As a backend software engineer at Arista, you own your project end to end. You and your project team will work with product management and customers to define the requirements and design the architecture. You'll build the backend, write automated tests, and get it deployed into production via our CD pipeline. As a senior member of the team you'll also be expected to help mentor and grow new team members. This role demands a strong and broad software engineering background, and you won't be limited to any single aspect of the product or development process.
- BS/MS degree in Computer Science and 8+ years of relevant experience.
- Strong knowledge of one or more programming languages (Go, Python, Java).
- Experience developing distributed systems or scale-out applications for a SaaS.
Posted 3 weeks ago
2.0 - 5.0 years
12 - 16 Lacs
Bengaluru
Work from Office
Do
- Research, design, develop, and modify computer vision and machine learning algorithms and models, leveraging experience with technologies such as Caffe, Torch, or TensorFlow
- Shape product strategy for highly contextualized applied ML/AI solutions by engaging with customers, solution teams, discovery workshops, and prototyping initiatives
- Help build a high-impact ML/AI team by supporting recruitment, training, and development of team members
- Serve as an evangelist by engaging in the broader ML/AI community through research, speaking/teaching, formal collaborations, and/or other channels
Knowledge & Abilities:
- Designing integrations of and tuning machine learning & computer vision algorithms
- Research and prototype techniques and algorithms for object detection and recognition
- Convolutional neural networks (CNN) for performing image classification and object detection
- Familiarity with embedded vision processing systems
- Open-source tools & platforms
- Statistical modeling, data extraction, and analysis
- Construct, train, evaluate, and tune neural networks
Mandatory Skills:
- One or more of the following: Java, C++, Python
- Deep learning frameworks such as Caffe, Torch, or TensorFlow, and an image/video vision library like OpenCV, Clarifai, Google Cloud Vision, etc.
- Supervised & unsupervised learning: developed feature learning, text mining, and prediction models (e.g., deep learning, collaborative filtering, SVM, and random forest) on a big data computation platform (Hadoop, Spark, Hive, and Tableau)
- One or more of the following: Tableau, Hadoop, Spark, HBase, Kafka
Experience:
- 2-5 years of work or educational experience in Machine Learning or Artificial Intelligence
- Creation and application of machine learning algorithms to a variety of real-world problems with large datasets
- Building scalable machine learning systems and data-driven products working with cross-functional teams
- Working with cloud services like AWS, Microsoft, IBM, and Google Cloud
- Working with one or more of the following: natural language processing, text understanding, classification, pattern recognition, recommendation systems, targeting systems, ranking systems, or similar
Nice to Have:
- Contribution to research communities and/or efforts, including publishing papers at conferences such as NIPS, ICML, ACL, CVPR, etc.
Education: BA/BS (advanced degree preferable) in Computer Science, Engineering, or a related technical field, or equivalent practical experience
Wipro is an Equal Employment Opportunity employer and makes all employment and employment-related decisions without regard to a person's race, sex, national origin, ancestry, disability, sexual orientation, or any other status protected by applicable law.
Mandatory Skills: Generative AI
Experience: 3-5 Years
Posted 3 weeks ago
6.0 - 10.0 years
0 Lacs
Maharashtra
On-site
Req ID: 317575. NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking a Big Data Senior Developer to join our team in Pune, Maharashtra (IN-MH), India (IN). Bachelor's degree with 6-10 years of programming experience on data platforms for the Sr. Data Developer role. Expertise in Data Engineering. Hands-on experience in the design and development of a big data platform. Deep understanding of data processing technology stacks: Spark, HBase, Hive and other Hadoop ecosystem technologies, with development using Scala or Java. Deep understanding of streaming data architectures and technologies for real-time and low-latency data processing. Experience with agile development methods, including core values, guiding principles, and key agile practices. Understanding of the theory and application of Continuous Integration/Delivery. Experience with NoSQL technologies, including column-family, graph, document, and key-value data storage technologies, is a plus. Understanding of relational databases, SQL and PL/SQL. Passion for software craftsmanship. Experience in the financial industry is a plus. About NTT DATA: NTT DATA is a $30 billion trusted global innovator of business and technology services. We serve 75% of the Fortune Global 100 and are committed to helping clients innovate, optimize and transform for long-term success. As a Global Top Employer, we have diverse experts in more than 50 countries and a robust partner ecosystem of established and start-up companies. Our services include business and technology consulting, data and artificial intelligence, industry solutions, as well as the development, implementation and management of applications, infrastructure and connectivity. We are one of the leading providers of digital and AI infrastructure in the world. NTT DATA is part of NTT Group, which invests over $3.6 billion each year in R&D to help organizations and society move confidently and sustainably into the digital future. Visit us at us.nttdata.com.
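As a rough sketch of the Spark-on-Hadoop development this role calls for (the posting names Scala or Java; PySpark is used here purely for illustration), with placeholder database, table, and column names:

```python
from pyspark.sql import SparkSession, functions as F

# Illustrative batch job: aggregate raw trade events per account per day and
# write the result to a Hive table. All names are placeholders.
spark = (
    SparkSession.builder
    .appName("daily-trade-aggregates")
    .enableHiveSupport()
    .getOrCreate()
)

raw = spark.table("raw_db.trade_events")

daily = (
    raw
    .withColumn("trade_date", F.to_date("event_ts"))
    .groupBy("account_id", "trade_date")
    .agg(
        F.count("*").alias("trade_count"),
        F.sum("notional").alias("total_notional"),
    )
)

daily.write.mode("overwrite").saveAsTable("curated_db.daily_trade_summary")
```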
Posted 3 weeks ago
8.0 - 12.0 years
0 Lacs
Chennai, Tamil Nadu
On-site
You will be responsible for managing one or more applications to achieve established goals and for handling personnel duties for your team, including hiring and training. Your role involves designing and developing real-time and batch data transformation processes using a variety of technologies such as Hadoop, Spark Streaming, Spark SQL, Python, and Hive. You will also design and develop programs to enhance functionality in the next-generation Big Data platform and ensure data redistribution is authorized. As a Big Data Developer with 8-10 years of relevant experience, you must possess strong skills in Java/J2EE, Hadoop, Scala, Hive, Impala, Kafka, and Elastic to address data concerns and implement data remediation requirements. Your role will require a good understanding of design patterns and the ability to provide solutions to complex design issues, as well as to identify and resolve code issues. You will be hands-on in managing application development using Spark (Scala, Python, or Java), SQL, and the Linux-based Hadoop ecosystem (HDFS, Impala, Hive, HBase, etc.). Your experience as a senior-level professional in an Applications Development role and your proven solution delivery skills will be essential in this position. Additionally, you should have a basic knowledge of finance industry practices and standards. Excellent analytical and process-based skills are required, including expertise in process flow diagrams, business modeling, and functional design. Being dynamic and flexible and maintaining a high energy level are crucial, as you will be working in a demanding and rapidly changing environment. Your educational background should include a Bachelor's degree/University degree or equivalent experience.
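For illustration, a minimal sketch of the real-time transformation work described above, using Spark Structured Streaming in Python; the Kafka topic, event schema, and output paths are assumed placeholders, and the spark-sql-kafka connector is assumed to be available on the classpath:

```python
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.types import StructType, StructField, StringType, DoubleType

# Parse JSON transaction events from Kafka and continuously write them out as
# Parquet files. Topic name, schema, and paths are placeholders.
spark = SparkSession.builder.appName("stream-transform-demo").getOrCreate()

schema = StructType([
    StructField("txn_id", StringType()),
    StructField("account", StringType()),
    StructField("amount", DoubleType()),
])

events = (
    spark.readStream
    .format("kafka")                      # requires the spark-sql-kafka package
    .option("kafka.bootstrap.servers", "localhost:9092")
    .option("subscribe", "transactions")
    .load()
    .select(F.from_json(F.col("value").cast("string"), schema).alias("e"))
    .select("e.*")
)

query = (
    events.writeStream
    .format("parquet")
    .option("path", "/data/curated/transactions")
    .option("checkpointLocation", "/data/checkpoints/transactions")
    .start()
)
query.awaitTermination()
```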
Posted 3 weeks ago
5.0 - 7.0 years
6 - 10 Lacs
Pune
Work from Office
Role & Responsibilities: Build robust and scalable web-based applications. You will need to think of platforms & reuse. Build abstractions and contracts with separation of concerns for a larger scope. Drive problem-solving skills for high-level business and technical problems. Do high-level design with guidance: functional modeling and break-down of a module. Make incremental changes to architecture and do impact analysis of the same. Do performance tuning and improvements in large-scale distributed systems. Mentor young minds and foster team spirit; break down execution into phases to bring predictability to overall execution. Work closely with the Product Manager to derive capability views from features/solutions, and lead execution of medium-sized projects. Work with broader stakeholders to track the impact of projects/features and proactively iterate to improve them. Requirements: Strong experience in the art of writing code and solving problems on a large scale (FinTech experience preferred). B.Tech, M.Tech, or Ph.D. in Computer Science or a related technical discipline (or equivalent). Excellent coding skills - should be able to convert the design into code fluently. Experience in at least one general programming language (e.g. Java, C, C++) and tech stack to write maintainable, scalable, unit-tested code. Experience with multi-threading, concurrency programming, object-oriented design skills, knowledge of design patterns, a huge passion and ability to design intuitive modules and class-level interfaces, and knowledge of test-driven development. Good understanding of databases (e.g. MySQL) and NoSQL (e.g. HBase, Elasticsearch, Aerospike, etc.). Experience in full life-cycle development in any programming language on a Linux platform, building highly scalable business applications that involve implementing large complex business flows and dealing with huge amounts of data. Strong desire for solving complex and interesting real-world problems. Go-getter attitude that reflects in energy and intent behind assigned tasks. An open communicator who shares thoughts and opinions frequently, listens intently and takes constructive feedback. Ability to drive the design and architecture of multiple subsystems. Ability to break down larger/fuzzier problems into smaller ones in the scope of the product. Understanding of the industry's coding standards and an ability to create appropriate technical documentation. PhonePe Full Time Employee Benefits (not applicable for intern or contract roles): Insurance Benefits - Medical Insurance, Critical Illness Insurance, Accidental Insurance, Life Insurance. Wellness Program - Employee Assistance Program, Onsite Medical Center, Emergency Support System. Parental Support - Maternity Benefit, Paternity Benefit Program, Adoption Assistance Program, Day-care Support Program. Mobility Benefits - Relocation benefits, Transfer Support Policy, Travel Policy. Retirement Benefits - Employee PF Contribution, Flexible PF Contribution, Gratuity, NPS, Leave Encashment. Other Benefits - Higher Education Assistance, Car Lease, Salary Advance Policy. Working at PhonePe is a rewarding experience! Great people, a work environment that thrives on creativity, and the opportunity to take on roles beyond a defined job description are just some of the reasons you should work with us. Read more about PhonePe on our blog. Life at PhonePe. PhonePe in the news.
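Purely as an illustration of the multi-threading, object-oriented design, and test-driven development points above (the posting names Java, C, and C++; Python is used here only to keep the sketch short, and the class is hypothetical, not a PhonePe component):

```python
import threading
import unittest

# Toy example of concurrency-safe, unit-tested code; not a real PhonePe module.
class AtomicCounter:
    """A counter that is safe to increment from multiple threads."""

    def __init__(self):
        self._value = 0
        self._lock = threading.Lock()

    def increment(self):
        with self._lock:
            self._value += 1

    @property
    def value(self):
        with self._lock:
            return self._value


class AtomicCounterTest(unittest.TestCase):
    def test_concurrent_increments(self):
        counter = AtomicCounter()

        def work():
            for _ in range(1000):
                counter.increment()

        threads = [threading.Thread(target=work) for _ in range(8)]
        for t in threads:
            t.start()
        for t in threads:
            t.join()

        self.assertEqual(counter.value, 8 * 1000)


if __name__ == "__main__":
    unittest.main()
```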
Posted 3 weeks ago
7.0 - 12.0 years
20 - 35 Lacs
Hyderabad, Chennai, Bengaluru
Hybrid
Greetings from Grid Dynamics! We are currently looking for a "Lead Full Stack Developer". Please find the JD below for your reference. Job Description - details on the tech stack: Design and implement intuitive and responsive user interfaces using React.js or similar front-end technologies. Collaborate with stakeholders to create a seamless user experience. Create mockups and UI prototypes for quick turnaround using Figma, Canva, or similar tools. Strong proficiency in HTML, CSS, JavaScript, and React.js. Experience with styling and graph libraries such as Highcharts, Material UI, and Tailwind CSS. Solid understanding of React fundamentals, including routing, the Virtual DOM, and Higher-Order Components (HOCs). Knowledge of REST API integration. Understanding of Node.js is a big advantage. Experience with REST API development, preferably using FastAPI. Proficiency in programming languages like Python and Java. Integrate APIs and services between front-end and back-end systems. Experience with Docker and containerized applications. Experience with orchestration tools such as Apache Airflow or similar. Design, develop, and manage simple data pipelines using Databricks, PySpark, and Google BigQuery. Medium-level expertise in SQL. Basic understanding of authentication methods such as JWT and OAuth. Experience working with cloud platforms such as AWS, GCP, or Azure. Familiarity with Google BigQuery and Google APIs. Hands-on experience with Kubernetes for container orchestration.
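As a hedged sketch of the REST API work this role mentions, here is a minimal FastAPI service in Python; the resource model and in-memory store are invented placeholders for whatever backing store (SQL, BigQuery) a real service would use:

```python
from fastapi import FastAPI
from pydantic import BaseModel

# Minimal REST API: list and create "metric" records. In a real service the
# in-memory list would be replaced by a database- or BigQuery-backed layer.
app = FastAPI(title="metrics-api-demo")


class Metric(BaseModel):
    name: str
    value: float


_metrics: list[Metric] = []


@app.get("/metrics")
def list_metrics() -> list[Metric]:
    return _metrics


@app.post("/metrics", status_code=201)
def add_metric(metric: Metric) -> Metric:
    _metrics.append(metric)
    return metric

# Run locally with: uvicorn main:app --reload
```

A React front end would then call GET /metrics and POST /metrics over HTTP, which is the front-end/back-end integration boundary the posting describes.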
Posted 3 weeks ago
15.0 - 20.0 years
10 - 14 Lacs
Bengaluru
Work from Office
About The Role - Project Role: Application Lead. Project Role Description: Lead the effort to design, build and configure applications, acting as the primary point of contact. Must-have skills: AIX System Administration. Good-to-have skills: NA. Minimum 7.5 year(s) of experience is required. Educational Qualification: 15 years full time education. Your Role and Responsibilities: As an AIX Administrator, you are responsible for installation, implementation, customization, operation, recovery and performance tuning, with proven knowledge of the fundamental concepts and technologies associated with the AIX operating system. Responsibilities: installation, configuration and troubleshooting of the AIX & Unix operating system; clustering and high-availability management; UNIX/AIX server support - designing, troubleshooting and storage implementation, articulating standard methodologies during implementation; systems running on UNIX/AIX platforms with OS clustering, partitioning and virtualization; handling day-to-day UNIX/AIX operating system installation, migration and break-fix support; SAN software and storage administration and integration with operating systems; raising and working with the PMR team; installing, configuring and maintaining IBM AIX and Unix servers; installation and configuration of virtualization; installation and configuration of cluster environments using HACMP; configuration and administration of the Logical Volume Manager; patch and package administration; writing shell scripts to accomplish day-to-day system administration tasks; configuring and supporting Domains, LPARs and DLPARs; administering and configuring various filesystems such as JFS, VxFS and pseudo filesystems; troubleshooting hardware and operating system related issues; capacity planning and fine-tuning systems for optimal performance; understanding of SAN and NAS storage; administration of NIS or LDAP environments. Required Technical and Professional Expertise: minimum 7 years of experience in the IT industry in Unix administration; AIX administration and Linux administration (RedHat/SUSE/Ubuntu); automation experience in OS patching and upgrades; ability to work independently with vendors to resolve issues (OS and hardware); proven working experience in installing and configuring middleware and troubleshooting Linux/AIX-based environments; knowledge of SAN and NAS storage; experience with physical, virtual and containerized environments. Preferred Technical and Professional Expertise: proactive monitoring and capacity planning experience; knowledge of Ansible or another automation tool is a must; scripting knowledge in Bash/Python/Perl to automate day-to-day activities; willingness to adopt new technology; expertise in cluster configuration and troubleshooting; working knowledge of Incident and Change Management. Qualification: 15 years full time education.
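As a small, hedged example of the day-to-day automation this role covers (the posting lists Bash/Python/Perl scripting), here is a Python housekeeping check that flags filesystems above a usage threshold; the threshold is an arbitrary placeholder, and the script parses the percentage field generically because `df -k` column layout differs between AIX and Linux:

```python
import subprocess

# Flag filesystems whose usage percentage is at or above the threshold.
THRESHOLD_PERCENT = 85

result = subprocess.run(["df", "-k"], capture_output=True, text=True, check=True)

for line in result.stdout.splitlines()[1:]:        # skip the header row
    fields = line.split()
    if not fields:
        continue
    mount_point = fields[-1]                       # last column is the mount point
    for field in fields:
        # Find the first percentage column ("Use%" on Linux, "%Used" on AIX).
        if field.endswith("%") and field[:-1].isdigit():
            if int(field[:-1]) >= THRESHOLD_PERCENT:
                print(f"WARNING: {mount_point} is at {field} capacity")
            break
```

In practice a check like this would typically be scheduled via cron or wrapped in an Ansible playbook rather than run by hand.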
Posted 3 weeks ago