6.0 - 11.0 years
15 - 25 Lacs
bengaluru
Remote
- Minimum 5 years of experience in data engineering, specifically working with Apache Airflow and AWS technologies.
- AWS services, particularly S3, Glue, EMR, Redshift, and AWS Lambda.
- Proficiency in Python, PySpark, and SQL.
- Snowflake Data Lake experience is required.
Posted 6 days ago
6.0 - 11.0 years
0 Lacs
hyderabad, pune
Work from Office
Strong technical acumen in AWS, including expert working knowledge of AWS services such as Glue, EC2, ECS, Lambda, Step Functions, IAM, Athena, Hue, Presto, S3, and Redshift. Strong technical acumen in Data Engineering enablement and working knowledge of frameworks/languages such as Python and Spark. Dremio knowledge is a plus.
Posted 6 days ago
6.0 - 10.0 years
0 Lacs
hyderabad, telangana
On-site
As a Data Engineer with expertise in Kafka, MongoDB, and OpenShift (AWS), you will play a crucial role in designing, building, and maintaining robust data pipelines to ensure the seamless flow of data across different systems. Your primary responsibility will involve collaborating with data scientists, analysts, and stakeholders to enhance data processing and storage solutions.

Your key responsibilities will include designing and implementing scalable data pipelines using Kafka for real-time data ingestion and processing. You will also be tasked with managing and optimizing MongoDB databases to ensure high availability and performance levels. Additionally, you will develop containerized Spring Boot/Java applications and deploy them using OpenShift for efficient operations. Leveraging AWS services such as S3 and Redshift for data storage, processing, and analysis will be an essential part of your role.

In this position, you will work closely with cross-functional teams to gather requirements and deliver data solutions that align with business needs. Monitoring and troubleshooting data pipeline performance and implementing necessary improvements will also be a key aspect of your responsibilities. The ideal candidate for this role should have 6 to 10 years of relevant experience; the position is based in Hyderabad.
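The ingest-transform-store loop this role centres on can be sketched in a few lines. This is a hedged illustration only, not the employer's actual stack: a plain iterator stands in for the Kafka consumer, SQLite stands in for MongoDB, and every name (topic payloads, table, fields) is hypothetical.

```python
import json
import sqlite3

def consume(messages):
    """Stand-in for a Kafka consumer: yields decoded events.

    In production this would poll a consumer subscribed to a topic."""
    for raw in messages:
        yield json.loads(raw)

def transform(event):
    """Flatten the raw event into the row shape the store expects."""
    return (event["id"], event["user"], float(event["amount"]))

def run_pipeline(messages, db):
    db.execute(
        "CREATE TABLE IF NOT EXISTS payments "
        "(id TEXT PRIMARY KEY, user TEXT, amount REAL)"
    )
    for event in consume(messages):
        # Upsert keeps the load idempotent if a message is redelivered.
        db.execute(
            "INSERT INTO payments VALUES (?, ?, ?) "
            "ON CONFLICT(id) DO UPDATE SET amount = excluded.amount",
            transform(event),
        )
    db.commit()

if __name__ == "__main__":
    db = sqlite3.connect(":memory:")
    stream = [
        '{"id": "a1", "user": "kiran", "amount": 120.5}',
        '{"id": "a1", "user": "kiran", "amount": 99.0}',  # redelivery
    ]
    run_pipeline(stream, db)
    print(db.execute("SELECT id, amount FROM payments").fetchall())
```

The upsert is the detail interviewers tend to probe: real-time brokers deliver at-least-once, so the sink must tolerate duplicates.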
Posted 1 week ago
8.0 - 12.0 years
0 Lacs
hyderabad, telangana
On-site
You are an experienced Senior Backend Software Engineer (L3) with a strong background in backend development and scalable architecture. Your primary responsibility will be to design, develop, and maintain robust backend systems that support highly scalable products. You will work closely with cross-functional teams, taking the lead on backend initiatives and ensuring that you are up to date with the latest trends and best practices in the industry. To excel in this role, you must have a minimum of 8 years of professional software engineering experience, with a focus on backend development. Your expertise should include proficiency in backend technologies such as Node.js, TypeScript, Cassandra, Redis, Elasticsearch, and MySQL. You should also have a solid understanding of designing and developing scalable, microservices-based architectures. Familiarity with cloud platforms like AWS, S3, WebRTC, FCM & APNS, and HMS push notifications is essential. Additionally, you should have experience working with CI/CD tools like Jenkins, Docker, and Kubernetes, as well as messaging and streaming tools like RabbitMQ/Kafka. Strong testing and debugging skills using tools like Jest, Mocha, and Chai are also required, along with excellent collaboration and communication abilities. While not mandatory, experience with frontend technologies like React.js, XMPP, and Scaling Sockets would be beneficial. Familiarity with Agile/Scrum methodologies and knowledge of system architecture improvements and performance optimization techniques are also considered advantageous. In terms of qualifications, you should hold a Bachelor's or Master's degree in Computer Science, Software Engineering, or a related field. You should have a minimum of 8 years of hands-on experience in backend software development, with a proven track record of building scalable products and thriving in fast-paced environments. 
If you meet these criteria and are looking for a challenging opportunity to further develop your backend software engineering skills, we encourage you to apply for this position and be part of our dynamic team.
Posted 1 week ago
6.0 - 10.0 years
0 Lacs
ahmedabad, gujarat
On-site
As a Backend Developer at ARK Infosoft, you will leverage your 6+ years of experience in backend development to build and maintain robust backend services. Your strong command of Node.js and TypeScript will be essential in designing scalable systems with PostgreSQL and Redis. You will play a key role in integrating and optimizing OpenAI APIs such as ChatGPT and GPT-4, while also developing containerized apps with AWS ECS. Your responsibilities will include managing storage and routing using S3 and API Gateway, setting up monitoring through CloudWatch and CloudTrail, and writing unit and integration tests using testing frameworks like Jest and Mocha. Collaboration with cross-functional teams to troubleshoot and optimize performance issues will be crucial, as well as ensuring adherence to best practices in code, architecture, and security.

Key Requirements:
- 6+ years of backend development experience
- Strong command of Node.js and TypeScript
- Solid experience with PostgreSQL and Redis
- Proficiency in RESTful API development
- Practical knowledge of AWS (ECS, S3, API Gateway)
- Experience with OpenAI APIs (ChatGPT, GPT-4, etc.)
- Familiarity with testing frameworks (Jest, Mocha)
- Experience working with large-scale systems
- Understanding of secure coding and data privacy
- Excellent problem-solving and communication skills

Skills:
- Node.js & TypeScript
- PostgreSQL
- Redis Cache
- RESTful APIs
- Express.js
- OpenAI API
- AWS ECS, S3, API Gateway
- CloudWatch & CloudTrail
- Jest / Mocha
- Scalable architecture
- Security compliance

We Offer:
- 5-Day Working
- Paid Leave
- All Leave Encashment
- Health Insurance
- Festival Celebrations
- Employee Engagement
- Great Workspace
- Annual Picnic

ARK Infosoft's Mission: At ARK Infosoft, our mission is to empower businesses to grow by providing them with personalized IT solutions. We achieve this by leveraging the latest technologies, prioritizing our customers, and continuously seeking innovative ways to improve.
Our goal is to be the most trusted partner in our clients' journey towards digital advancement. We are dedicated to excellence, integrity, and social responsibility, aiming to create lasting value for our clients, employees, and communities. We are committed to ensuring a sustainable and prosperous future for all.

ARK Infosoft's Vision: Our vision is to establish ARK Infosoft as a leading IT company renowned for delivering creative solutions that help businesses succeed in today's digital world. We aim to be the go-to partner for all their IT needs, fostering growth and prosperity in every industry we serve. By prioritizing innovation, collaboration, and sustainability, we envision a future where technology empowers businesses to reach their full potential. We strive to drive positive global impact, shaping a brighter tomorrow by enabling businesses to thrive in the digital age.
Posted 1 week ago
6.0 - 10.0 years
0 Lacs
kochi, kerala
On-site
The role of a Senior Node.js Developer involves leading enterprise-grade backend systems with complex business workflows in domains such as BFSI, ERP, healthcare, or logistics. As a Senior Node.js Developer, you will be responsible for architecting scalable solutions, managing cross-functional teams (React.js/iOS), and overseeing end-to-end delivery from development to deployment and client communication.

Your key responsibilities will include technical leadership, where you will design and develop enterprise Node.js applications and optimize complex business workflows. You will also manage deployment pipelines, lead developers, mentor the team on best practices, and interface directly with clients for requirements and troubleshooting. Additionally, you will be responsible for overseeing server management, monitoring, disaster recovery, and optimizing MongoDB clusters.

The ideal candidate should have strong core expertise in Node.js and MongoDB, deployment mastery in CI/CD and cloud services, as well as leadership and communication skills. Preferred skills for this role include basic knowledge of React.js and native iOS development, infrastructure-as-code experience, and familiarity with TypeScript, Redis, or Elasticsearch. Non-negotiable requirements include an enterprise application background, deployment ownership, English fluency, a valid passport, readiness to travel to Dubai, on-site work in Kochi, and immediate joining.

In return, we offer a competitive salary, performance bonuses, opportunities to work on global enterprise projects, and support for upskilling in cloud infrastructure and leadership.
Posted 1 week ago
10.0 - 14.0 years
0 Lacs
chennai, tamil nadu
On-site
As an experienced AWS Solution Architect, you will be responsible for designing, implementing, and optimizing the cloud infrastructure for our clients' enterprise-level web application. Your primary focus will be on architecting, deploying, and managing the end-to-end AWS environment to ensure it meets business requirements, security standards, and performance objectives.

Your key responsibilities will include designing and optimizing the overall AWS cloud architecture, configuring security controls, managing access, and ensuring compliance requirements. You will also need to establish high availability and resilience mechanisms, monitor and optimize infrastructure performance, and maintain CI/CD pipelines for efficient deployments. Collaboration with development, DevOps, and database teams is essential to understand their requirements and integrate cloud solutions accordingly. Additionally, you will play a crucial role in mentoring and upskilling the cross-functional team on AWS best practices, architecture patterns, and design principles. Providing ongoing support, troubleshooting, and documentation for the cloud environment will be part of your daily tasks.

To be successful in this role, you should have at least 10 years of experience as an AWS Solution Architect and a deep understanding of AWS services such as VPC, EC2, RDS, ECS, CloudWatch, and security services. Proficiency in designing highly available, scalable, and secure cloud architectures, along with knowledge of containerization, CI/CD, and Infrastructure as Code (IaC) tools, is required. Strong problem-solving skills, collaboration abilities, and communication skills are also essential. Preferred qualifications include holding an AWS Certified Solutions Architect - Professional certification.
Posted 1 week ago
12.0 - 15.0 years
30 - 45 Lacs
hyderabad
Work from Office
We are Hiring: AWS Data Architect at Coforge Ltd.
Join Coforge Ltd as a Lead AWS Data Architect.
Job Location: Hyderabad (Onsite Only)
Experience Required: 12-15 Years
Position Type: Full-Time
Company: Coforge Ltd.
How to Apply: Interested candidates can share their CV directly with Gaurav.2.Kumar@coforge.com or via WhatsApp: 9667427662.

About the Role: Coforge Ltd is seeking a visionary Lead AWS Data Architect to spearhead our cloud-first data initiatives. This is a strategic leadership role where you'll shape the future of data architecture, mentor top-tier talent, and deliver scalable, secure, and innovative solutions using cutting-edge AWS technologies.

Key Responsibilities:
- Team Leadership & Mentorship: Guide and grow a team of skilled data engineers and architects, fostering a culture of excellence and innovation.
- Cloud Data Architecture: Design and implement robust, scalable data pipelines using Python, Airflow, and AWS services like S3, Glue, and EMR.
- Real-Time Data Streaming: Architect real-time data solutions using Kafka, Amazon SQS, and EventBridge to enable responsive and intelligent systems.
- System Integration: Seamlessly connect diverse systems using AppFlow, REST APIs, and other integration tools.
- Data Warehousing & Modeling: Build optimized data warehouses with strong dimensional modeling practices to support analytics and reporting.
- Governance & Security: Ensure all solutions comply with enterprise data governance, privacy, and security standards.

What We're Looking For:
- Experience: 10-15 years in data engineering and architecture, with a minimum of 3 years in a technical leadership role.
- Technical Skills: Expert in Python, Airflow, and the AWS ecosystem; hands-on experience with Kafka, SQS, and EventBridge; strong understanding of data warehousing, ETL/ELT, and API integrations.
- Education: Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
Soft Skills: Excellent communication, stakeholder management, and problem-solving abilities. Passion for mentoring and driving team success.

Why Coforge? At Coforge, we're not just building systems; we're transforming industries. Join a team that values innovation, collaboration, and continuous learning. Be part of a data-driven revolution.

About Coforge Ltd: Coforge Ltd is a globally recognized digital services and solutions provider, headquartered in Noida, Uttar Pradesh, India. Formerly known as NIIT Technologies, the company rebranded to Coforge in August 2020, marking a strategic shift toward deeper specialization and innovation in digital transformation. With over 40 years of industry experience, Coforge operates in more than 21 countries, including the United States, United Kingdom, Australia, Singapore, and across Europe and Asia-Pacific. It maintains 30 global delivery centers and employs a workforce of over 32,000 professionals, delivering high-impact solutions across industries such as Banking & Financial Services, Insurance, Travel & Transportation, Healthcare, Manufacturing & Distribution, and Media & Government.

Coforge's core service offerings include Digital Engineering & Application Development, Cloud Infrastructure & Automation, Artificial Intelligence & Data Management, Cybersecurity & Digital Process Automation, Enterprise Applications (including SAP), and Business Process Services (BPS). The company is known for its product engineering approach, leveraging proprietary platforms and emerging technologies like Generative AI, Cloud, and Data Integration to help clients become intelligent, high-growth enterprises.
Posted 1 week ago
6.0 - 11.0 years
15 - 30 Lacs
bengaluru
Hybrid
Role: Java Developer with AWS
Primary skills: Java, Spring Boot, Microservices, SQL/PLSQL, DB, AWS, and Kafka

Job Description:
- Strong working experience as a Java backend developer
- Experience in SQL/PLSQL in any database technology, such as Sybase, DB2, DynamoDB, or MongoDB
- Implemented Docker-based microservices and RESTful API architecture
- Good understanding of core AWS services, including EC2, S3, ALB, NAT Gateway, EFS, Lambda, and API Gateway
- Ability to create AWS infrastructure designs, including DR at the AWS infra level
- Knowledge of infra provisioning using Terraform in AWS
- Strong understanding of event-driven messaging using Kafka
Posted 1 week ago
4.0 - 8.0 years
0 Lacs
pune, maharashtra
On-site
You will be responsible for designing, building, and maintaining scalable data pipelines using AWS services. Your role will involve integrating diverse data sources to ensure data consistency and reliability. Collaboration with data scientists and stakeholders to understand data requirements will be essential. Implementing data security measures and maintaining data integrity will be crucial aspects of your job. Monitoring and troubleshooting data pipelines to ensure optimal performance will also be part of your responsibilities. Additionally, you will be expected to optimize and maintain data warehouse and data lake architectures while creating and maintaining comprehensive documentation for data engineering processes.

To qualify for this role, you must hold a Bachelor's degree in Computer Science, Information Technology, or a related field. Proven experience as a Data Engineer with a focus on AWS is required. A strong understanding of data modelling, ETL processes, and data warehousing is essential. Experience with SQL and NoSQL databases is necessary, along with familiarity with data governance and data security best practices. Proficiency in AWS services such as Redshift, S3, RDS, Glue, Lambda, and API Gateway is expected. Experience with data pipeline orchestration tools like Apache Airflow and proficiency in programming languages such as Python or Java will be advantageous.
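The orchestration requirement above (Airflow-style pipelines) comes down to running tasks in dependency order. The following is only a minimal sketch of that idea using Python's standard-library `graphlib`, with an invented extract-transform-load DAG; it assumes nothing about the employer's actual pipeline.

```python
from graphlib import TopologicalSorter

# Hypothetical pipeline: extract from RDS and S3, transform with Glue,
# then load into Redshift. Each task maps to the set of tasks it depends
# on - the same dependency idea an Airflow DAG expresses with >>.
PIPELINE = {
    "extract_rds": set(),
    "extract_s3": set(),
    "transform_glue": {"extract_rds", "extract_s3"},
    "load_redshift": {"transform_glue"},
}

def execution_order(dag):
    """Return one valid run order; raises CycleError on a cyclic DAG."""
    return list(TopologicalSorter(dag).static_order())

if __name__ == "__main__":
    print(execution_order(PIPELINE))
```

A real orchestrator adds scheduling, retries, and state on top, but the topological-sort core is the same.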
Posted 1 week ago
3.0 - 7.0 years
0 Lacs
maharashtra
On-site
As a Full-Stack Developer at FYND with over 3 years of experience, your primary responsibility will be to build scalable and loosely coupled services to enhance the platform. You will be tasked with developing bulletproof API integrations with various third-party APIs to cater to different use cases. Your role will also involve evolving the infrastructure to improve overall availability.

In this position, you will have complete autonomy over your code, enabling you to choose the technologies and tools required to develop and operate large-scale applications on AWS. Moreover, you will be encouraged to contribute to the open-source community by sharing your expertise through code contributions and blog posts. Given that FYND is a startup, you should be prepared for frequent changes as the company continues to experiment with product enhancements.

Specific requirements for this role include a minimum of 3 years of development experience, particularly in consumer-facing web/app products. Proficiency in JavaScript is essential, although exceptions can be made for candidates with expertise in other languages, provided they have experience in developing web/app-based tech products. Ideally, you should have hands-on experience with Node.js and familiarity with frameworks such as Express.js, Koa.js, or Socket.IO. A strong understanding of async programming using callbacks, promises, and async/await is necessary. Additionally, you should be proficient in frontend technologies like HTML, CSS, and AJAX, along with working knowledge of databases like MongoDB, Redis, and MySQL. A good grasp of Data Structures, Algorithms, and Operating Systems is crucial for this role. Experience with AWS services, including EC2, ELB, Auto Scaling, CloudFront, and S3, is preferred. Familiarity with Vue.js is a plus.

While you may not be familiar with all the tools used at FYND, you should demonstrate a willingness to learn with the guidance and available resources. Your ability to adapt to new technologies and contribute effectively to the team's goals will be key to your success in this role.
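The async-programming requirement above (callbacks, promises, async/await) has a direct analogue in Python's `asyncio`, the language used for sketches throughout this page. This is an illustration only; the vendor names and the sleep-based stand-in for third-party HTTP calls are invented.

```python
import asyncio

async def fetch_quote(vendor, delay):
    """Stand-in for a third-party API call; sleeps instead of doing HTTP."""
    await asyncio.sleep(delay)
    return {"vendor": vendor, "price": round(100 * delay, 2)}

async def fetch_all():
    # Concurrent awaits: total latency is the slowest call, not the sum -
    # the same win async/await gives over sequential, callback-style code.
    return await asyncio.gather(
        fetch_quote("shipfast", 0.02),
        fetch_quote("bluedart", 0.01),
    )

if __name__ == "__main__":
    quotes = asyncio.run(fetch_all())
    print([q["vendor"] for q in quotes])
```

Note that `gather` preserves argument order in its results even though the calls finish in a different order.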
Posted 1 week ago
3.0 - 7.0 years
0 Lacs
karnataka
On-site
As a Data Solutions Specialist at Quantiphi, you will be part of a global and diverse culture that values transparency, diversity, integrity, learning, and growth. We take pride in fostering an environment that encourages innovation and excellence, not only in your professional endeavors but also in your personal life.

Key Responsibilities:
- Utilize your 3+ years of hands-on experience to deliver data solutions that drive business outcomes.
- Develop data pipelines using PySpark within Databricks implementations.
- Work with Databricks Workspaces, Notebooks, Delta Lake, and APIs to streamline data processes.
- Utilize your expertise in Python, Scala, and advanced SQL for effective data manipulation and optimization.
- Implement data integration projects using ETL to ensure seamless data flow.
- Build and deploy cloud-based solutions at scale by ingesting data from sources like DB2.

Preferred Skills:
- Familiarity with AWS services such as S3, Redshift, and Secrets Manager.
- Experience in implementing data integration projects using ETL, preferably with tools like Qlik Replicate and Qlik Compose.
- Proficiency in using orchestration tools like Airflow or Step Functions for workflow management.
- Exposure to Infrastructure as Code (IaC) tools like Terraform and Continuous Integration/Continuous Deployment (CI/CD) tools.
- Previous involvement in migrating on-premises data to the cloud and processing large datasets efficiently.
- Knowledge of setting up data lakes and data warehouses on cloud platforms.
- Implementation of industry best practices to ensure high-quality data solutions.

If you are someone who thrives in an environment of wild growth and enjoys collaborating with happy, enthusiastic over-achievers, Quantiphi is the place for you to nurture your career and personal development. Join us in our journey of innovation and excellence!
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
chennai, tamil nadu
On-site
The ideal candidate for this role should possess a strong skill set in Java 8/11, REST/HTTP microservices, Gradle-based CI/CD pipelines, Terraform, Docker/Kubernetes, and AWS services such as DynamoDB, S3/CloudFront, ElastiCache/Redis, OpenSearch, ECS, EC2, load balancing, ASGs, and CloudWatch. Knowledge of Kafka is also required for this position. Additionally, experience with Microsoft M365 systems, MS Graph API, and Azure services is highly desirable. Proficiency in Python is a must-have skill for this role.

As a part of this role, you will be actively involved in the complete development lifecycle of our core cloud-based email product, including designing, developing, testing, deploying, maintaining, monitoring, and enhancing the product. Collaboration with architects, engineers, and product managers to address complex challenges at a large scale will be a key aspect of this position. Your responsibilities will include delivering AWS-based Java services through CI/CD and Infrastructure as Code, contributing to reviews of new features, ensuring software quality through the creation of unit, system, and non-functional tests, and actively engaging in team collaboration and problem-solving.

Furthermore, you will be expected to stay updated with new technology trends, take initiative, and demonstrate resourcefulness in designing new features or enhancements based on high-level architectures. Leadership qualities are essential for this role, as you will lead backlog grooming, planning, design reviews, code reviews, and security reviews of designs and implementations. Mentoring and coaching team members, improving team efficiency, and fostering a culture of continuous learning and development are key aspects of this position. Your role will also involve applying technology trends and industry innovations to enhance our products and taking ownership to create accountability within the team.
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
chennai, tamil nadu
On-site
Join us as a Senior Developer at Barclays, where you will play a crucial role in supporting the successful delivery of Location Strategy projects while adhering to plan, budget, quality, and governance standards. Your primary responsibility will be to drive the evolution of our digital landscape, fostering innovation and excellence. By leveraging cutting-edge technology, you will lead the transformation of our digital offerings, ensuring unparalleled customer experiences.

To excel in this role as a Senior Developer, you should possess the following experience and skills:
- Solid hands-on development experience with Scala, Spark, Python, and Java.
- Excellent working knowledge of Hadoop components such as HDFS, Hive, Impala, HBase, and DataFrames.
- Proficiency in Jenkins build pipelines or other CI/CD tools.
- Sound understanding of Data Warehousing principles and Data Modeling.

Additionally, highly valued skills may include:
- Experience with AWS services like S3, Athena, DynamoDB, Lambda, and Databricks.
- Working knowledge of Jenkins, Git, and Unix.

Your performance may be assessed based on critical skills essential for success in this role, including risk and controls management, change and transformation capabilities, business acumen, strategic thinking, and proficiency in digital and technology aspects. This position is based in Pune.

**Purpose of the Role:** To design, develop, and enhance software solutions using various engineering methodologies to deliver business, platform, and technology capabilities for our customers and colleagues.

**Accountabilities:**
- Develop and deliver high-quality software solutions using industry-aligned programming languages, frameworks, and tools. Ensure that the code is scalable, maintainable, and optimized for performance.
- Collaborate cross-functionally with product managers, designers, and other engineers to define software requirements, devise solution strategies, and ensure seamless integration with business objectives.
- Engage in peer collaboration, participate in code reviews, and promote a culture of code quality and knowledge sharing.
- Stay updated on industry technology trends, contribute to the organization's technology communities, and foster a culture of technical excellence and growth.
- Adhere to secure coding practices to mitigate vulnerabilities, protect sensitive data, and deliver secure software solutions.
- Implement effective unit testing practices to ensure proper code design, readability, and reliability.

**Assistant Vice President Expectations:** As an Assistant Vice President, you are expected to:
- Provide consultation on complex issues, offering advice to People Leaders to resolve escalated matters.
- Identify and mitigate risks, and develop new policies/procedures to support the control and governance agenda.
- Take ownership of risk management and control strengthening related to the work undertaken.
- Engage in complex data analysis from various internal and external sources to creatively solve problems.
- Communicate complex information effectively to stakeholders.
- Influence or convince stakeholders to achieve desired outcomes.

All colleagues are expected to demonstrate the Barclays Values of Respect, Integrity, Service, Excellence, and Stewardship, as well as embrace the Barclays Mindset to Empower, Challenge, and Drive as guiding principles for our behavior.
Posted 1 week ago
3.0 - 7.0 years
0 Lacs
chennai, tamil nadu
On-site
As a skilled professional, you have hands-on experience in developing and maintaining Java-based applications. You are proficient in application development using Java and AWS microservices, with a solid understanding of microservices architecture. Your expertise includes working with AWS services such as ECS, ELB, S3, CloudWatch, App Mesh, AWS CodeBuild, and CodePipeline. Your excellent written and verbal communication skills enable you to effectively collaborate with team members and stakeholders. Additionally, you have experience in SAFe agile development practices, ensuring efficient and high-quality project delivery.
Posted 1 week ago
4.0 - 8.0 years
0 Lacs
chennai, tamil nadu
On-site
Join us as a Data Engineer responsible for supporting the successful delivery of Location Strategy projects to plan, budget, agreed quality, and governance standards. You'll spearhead the evolution of our digital landscape, driving innovation and excellence. You will harness cutting-edge technology to revolutionize our digital offerings, ensuring unparalleled customer experiences.

To be successful as a Data Engineer, you should have experience with:
- Hands-on experience in PySpark and strong knowledge of DataFrames, RDDs, and SparkSQL.
- Hands-on experience in developing, testing, and maintaining applications on AWS Cloud.
- A strong hold on the AWS Data Analytics Technology Stack (Glue, S3, Lambda, Lake Formation, Athena).
- Design and implementation of scalable and efficient data transformation/storage solutions using Snowflake.
- Experience in data ingestion to Snowflake for different storage formats such as Parquet, Iceberg, JSON, and CSV.
- Experience in using DBT (Data Build Tool) with Snowflake for ELT pipeline development.
- Experience in writing advanced SQL and PL/SQL programs.
- Hands-on experience building reusable components using Snowflake and AWS tools/technology.
- Work on at least two major project implementations.
- Exposure to data governance or lineage tools such as Immuta and Alation is an added advantage.
- Experience in using orchestration tools such as Apache Airflow or Snowflake Tasks is an added advantage.
- Knowledge of the Ab Initio ETL tool is a plus.

Some other highly valued skills may include:
- Ability to engage with stakeholders, elicit requirements/user stories, and translate requirements into ETL components.
- Ability to understand the infrastructure setup and provide solutions either individually or working with teams.
- Good knowledge of Data Marts and Data Warehousing concepts.
- Good analytical and interpersonal skills.
- Implementation of a cloud-based enterprise data warehouse with multiple data platforms alongside Snowflake and a NoSQL environment to build a data movement strategy.

You may be assessed on key critical skills relevant for success in the role, such as risk and controls, change and transformation, business acumen, strategic thinking, digital and technology, as well as job-specific technical skills. The role is based out of Chennai.

Purpose of the role: To build and maintain the systems that collect, store, process, and analyze data, such as data pipelines, data warehouses, and data lakes, ensuring that all data is accurate, accessible, and secure.

Accountabilities:
- Build and maintain data architecture pipelines that enable the transfer and processing of durable, complete, and consistent data.
- Design and implement data warehouses and data lakes that manage the appropriate data volumes and velocity and adhere to the required security measures.
- Develop processing and analysis algorithms fit for the intended data complexity and volumes.
- Collaborate with data scientists to build and deploy machine learning models.

Analyst Expectations: To perform prescribed activities in a timely manner and to a high standard, consistently driving continuous improvement. The role requires in-depth technical knowledge and experience in the assigned area of expertise, and a thorough understanding of the underlying principles and concepts within that area. They lead and supervise a team, guiding and supporting professional development, allocating work requirements, and coordinating team resources. If the position has leadership responsibilities, People Leaders are expected to demonstrate a clear set of leadership behaviors to create an environment for colleagues to thrive and deliver to a consistently excellent standard. The four LEAD behaviors are: L, Listen and be authentic; E, Energize and inspire; A, Align across the enterprise; D, Develop others.
OR, for an individual contributor, they develop technical expertise in the work area, acting as an advisor where appropriate. They will have an impact on the work of related teams within the area, partner with other functions and business areas, and take responsibility for the end results of a team's operational processing and activities. They escalate breaches of policies/procedures appropriately and take responsibility for embedding new policies/procedures adopted due to risk mitigation. They advise and influence decision-making within their own area of expertise, take ownership of managing risk and strengthening controls in relation to the work they own or contribute to, and deliver their work and areas of responsibility following relevant rules, regulations, and codes of conduct. They maintain and continually build an understanding of how their sub-function integrates with the function, alongside knowledge of the organization's products, services, and processes within the function, and demonstrate understanding of how areas coordinate and contribute to the achievement of the objectives of the organization's sub-function. They resolve problems by identifying and selecting solutions through the application of acquired technical experience, guided by precedents; guide and persuade team members and communicate complex/sensitive information; and act as a contact point for stakeholders outside of the immediate function, while building a network of contacts outside the team and external to the organization.

All colleagues will be expected to demonstrate the Barclays Values of Respect, Integrity, Service, Excellence, and Stewardship, our moral compass, helping us do what we believe is right. They will also be expected to demonstrate the Barclays Mindset to Empower, Challenge, and Drive, the operating manual for how we behave.
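The multi-format ingestion requirement in this listing (Parquet, Iceberg, JSON, CSV) usually starts with per-format readers behind a single dispatch point. Below is only a stdlib-based sketch of that shape, covering just CSV and JSON-lines and standing in for a real Snowflake loader; the function and format names are hypothetical.

```python
import csv
import io
import json

def read_csv(text):
    """Parse CSV text into a list of row dicts (all values as strings)."""
    return list(csv.DictReader(io.StringIO(text)))

def read_json_lines(text):
    """Parse newline-delimited JSON into a list of dicts."""
    return [json.loads(line) for line in text.splitlines() if line.strip()]

# One registry, one entry point: adding a format means adding a reader,
# not touching every call site.
READERS = {"csv": read_csv, "jsonl": read_json_lines}

def ingest(fmt, text):
    try:
        reader = READERS[fmt]
    except KeyError:
        raise ValueError(f"unsupported format: {fmt}") from None
    return reader(text)

if __name__ == "__main__":
    print(ingest("csv", "id,city\n1,chennai\n2,pune\n"))
    print(ingest("jsonl", '{"id": 3, "city": "kochi"}\n'))
```

In a warehouse setting the readers would hand rows (or staged files) to a COPY/bulk-load step; the dispatch pattern stays the same.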
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
haryana
On-site
As an Engineering Manager at Airtel, you will lead and mentor the engineering team to develop products and services in alignment with Airtel's growth objectives. Your responsibilities will include problem-solving, architectural design, codebase maintenance, and working on both existing and new projects. You will be expected to provide technical guidance, establish best practices, and ensure end-to-end delivery of business solutions.

In this role, you will also be responsible for people management: conducting performance reviews, providing feedback to team members, and developing succession plans. You will oversee resource allocation, recruitment, and hiring to meet organizational goals and deadlines effectively. Your duties will involve project management, contributing to the planning and execution of engineering projects, and ensuring timely delivery within budget constraints.

Effective communication and collaboration with engineering teams, stakeholders, and other departments such as quality, operations, product, and program will be essential to ensure alignment with business objectives. Addressing conflicts and fostering a positive working environment will also be part of your responsibilities. Furthermore, you will be expected to promote a culture of innovation and continuous learning within your team, encouraging the adoption of new technologies and industry trends to enhance team efficiency.

At Airtel, we work with technologies such as Java, Tomcat, Netty, Spring Boot, Hibernate, Elasticsearch, Kafka, and web services. We also use caching technologies like Redis, Aerospike, or Hazelcast, along with data storage solutions including Oracle, S3, Postgres, MySQL, or MongoDB. Our tooling stack comprises Git, command-line interfaces, Jenkins, JMeter, Postman, Gatling, Nginx/HAProxy, Jira/Confluence, Grafana, and Kibana.
Posted 1 week ago
3.0 - 7.0 years
0 - 0 Lacs
karnataka
On-site
As an AWS Operations Manager at our company, you will be responsible for leading our Network Operations Center (NOC) and Security Operations Center (SOC) teams. Your main focus will be on developing and implementing cloud operations strategies that align with our business objectives, enhancing service reliability, and optimizing cloud resources.

With a minimum of 7 years of IT experience, including at least 3 years in cloud operations (preferably AWS), you will bring expertise in production operations for globally distributed cloud infrastructure. Your proven leadership skills in managing technical teams and projects will be essential in this role. Hands-on experience with AWS services such as VPC, EC2, EBS, RDS, ALB, ASG, IAM, and S3, plus Linux, is required. You should also have strong conceptual knowledge of DevOps practices, CI/CD pipelines, and Infrastructure as Code (Terraform, CloudFormation), as well as familiarity with monitoring and managing complex production environments.

Preferred qualifications include AWS certifications (Solutions Architect, DevOps Engineer, etc.), experience in regulated industries, multi-cloud and hybrid cloud experience, and ITIL certification.

Your key responsibilities will include leading and mentoring DevOps, Site Reliability Engineering (SRE), and Quality Assurance (QA) team leads; fostering a culture of collaboration and continuous improvement within cloud operations teams; developing and implementing cloud operations strategies aligned with business objectives; driving continuous improvement in cloud infrastructure and incident management; ensuring security and compliance standards are met; optimizing cloud costs; establishing and tracking key performance indicators (KPIs); and serving as a key point of contact for cloud performance and operational metrics.

If you are passionate about cloud operations, have strong leadership skills, and are looking for a challenging role where you can make a significant impact, then this position is perfect for you. Join us and be a part of our dynamic team, where you can contribute to shaping the future of cloud operations.
Posted 1 week ago
2.0 - 6.0 years
0 Lacs
maharashtra
On-site
As a Data Engineer II at Media.net, you will be responsible for designing, executing, and managing large and complex distributed data systems. Your role will involve monitoring performance, optimizing existing projects, and researching and integrating Big Data tools and frameworks as required to meet business and data requirements. You will play a key part in implementing scalable solutions, creating reusable components and data tools, and collaborating with teams across the company to integrate with the data platform efficiently.

The team you will be a part of ensures that every web page view is seamlessly processed through high-scale services, handling a large volume of requests across 5 million unique topics. Leveraging cutting-edge Machine Learning and AI technologies on a large Hadoop cluster, you will work with a tech stack that includes Java, Elasticsearch/Solr, Kafka, Spark, Machine Learning, NLP, Deep Learning, Redis, and Big Data technologies such as Hadoop, HBase, and YARN.

To excel in this role, you should have 2 to 4 years of experience with big data technologies like Apache Hadoop and relational databases (MS SQL Server/Oracle/MySQL/Postgres). Proficiency in programming languages such as Java, Python, or Scala is required, along with expertise in SQL (T-SQL/PL-SQL/Spark SQL/HiveQL) and Apache Spark. Hands-on knowledge of DataFrames, Datasets, RDDs, and the Spark SQL/PySpark/Scala APIs, plus a deep understanding of performance optimization, is essential. Additionally, you should have a good grasp of distributed storage (HDFS/S3), strong analytical and quantitative skills, and experience with data integration across multiple sources. Experience with message queues like Apache Kafka, MPP systems such as Redshift/Snowflake, and NoSQL storage like MongoDB would be considered advantageous.

If you are passionate about working with cutting-edge technologies, collaborating with global teams, and contributing to the growth of a leading ad tech company, we encourage you to apply for this challenging and rewarding opportunity.
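A recurring task in a role like this is aggregating high-volume page-view events by key. The sketch below is a minimal, dependency-free illustration of that pattern; the event shape and topic names are hypothetical, and a real pipeline in this stack would express the same logic with Spark DataFrames (`df.groupBy("topic").count()`) rather than in-memory counters.

```python
from collections import Counter

def top_topics(page_views, k=3):
    """Count page views per topic and return the k most frequent.

    page_views: iterable of (topic, url) tuples -- a hypothetical event
    shape used only for illustration. On a cluster, Spark would shuffle
    and aggregate this by key instead of holding it all in one process.
    """
    counts = Counter(topic for topic, _url in page_views)
    return counts.most_common(k)

events = [
    ("sports", "/a"), ("finance", "/b"), ("sports", "/c"),
    ("tech", "/d"), ("sports", "/e"), ("finance", "/f"),
]
print(top_topics(events, k=2))  # [('sports', 3), ('finance', 2)]
```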
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
karnataka
On-site
You will be responsible for designing, developing, and maintaining scalable web services and APIs using Python (Django/Flask). Your role will involve integrating with various AWS services (EC2, Lambda, S3, RDS, DynamoDB, etc.) to enable highly scalable cloud-based applications. It is essential to implement security best practices, including data encryption, identity management, and secure API development. Additionally, you will design, build, and optimize database systems (SQL and NoSQL) to support high traffic and growth.

Front-End Development: You will build and maintain modern, responsive, and dynamic user interfaces using React.js and modern JavaScript (ES6+), collaborating with UX/UI designers to create pixel-perfect user experiences. You will implement state management using Redux, the Context API, or other state management libraries, and ensure seamless integration between front-end and back-end systems.

Cloud & DevOps: You will architect and deploy solutions on AWS, ensuring high availability, fault tolerance, and scalability of web applications, utilizing AWS services like EC2, Lambda, RDS, S3, ECS, and CloudFront for application deployment, monitoring, and management. You will manage cloud resources with infrastructure-as-code (IaC) tools like AWS CloudFormation, Terraform, or the AWS CDK.

You will also implement CI/CD pipelines for automated build, testing, and deployment, and provide mentorship and guidance to junior developers to ensure best practices and high code quality. Your role will involve leading code reviews and architecture discussions, providing feedback on design and implementation, and working cross-functionally with product teams, quality assurance, and other stakeholders to deliver product features in an agile environment. You will contribute to architectural decisions, system optimization, and performance improvements.

The ideal candidate should possess:
- 7+ years of experience in Python development (preferably Django, Flask, or FastAPI), expertise in RESTful API development and integration, a strong understanding of relational and NoSQL databases (PostgreSQL, MySQL, MongoDB, DynamoDB), and familiarity with asynchronous programming and task queues (e.g., Celery).
- 5+ years of experience with React.js, including hooks, Redux, and the Context API for state management; a deep understanding of front-end technologies like HTML5, CSS3, JavaScript (ES6+), and responsive design principles; experience with modern build tools (Webpack, Babel, npm/Yarn); and knowledge of testing frameworks like Jest, Mocha, or Cypress for front-end testing.
- 5+ years of hands-on experience with AWS services like EC2, S3, Lambda, RDS, DynamoDB, API Gateway, and CloudFormation; expertise in cloud architecture design, including high availability, fault tolerance, and auto-scaling; familiarity with Infrastructure as Code (IaC) tools such as Terraform or AWS CloudFormation; and a strong understanding of CI/CD concepts and tools like Jenkins, CircleCI, and GitLab CI.
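The RESTful API work described above boils down to dispatching a method and path to a JSON response. The following framework-free sketch shows that routing pattern with nothing but the standard library; the `/users/<id>` endpoint and the in-memory data are invented for illustration, and in Flask the equivalent would be an `@app.route`-decorated view.

```python
import json

# Hypothetical in-memory store standing in for a real database.
USERS = {"1": {"id": "1", "name": "Asha"}}

def handle(method, path):
    """Dispatch (method, path) to a (status, json_body) pair.

    This mirrors what a Flask/Django URL router does: match the route,
    look up the resource, and serialize the result as JSON.
    """
    if method == "GET" and path.startswith("/users/"):
        user = USERS.get(path.rsplit("/", 1)[-1])
        if user is None:
            return 404, json.dumps({"error": "not found"})
        return 200, json.dumps(user)
    return 405, json.dumps({"error": "method not allowed"})

print(handle("GET", "/users/1"))  # (200, '{"id": "1", "name": "Asha"}')
```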
Posted 1 week ago
8.0 - 12.0 years
0 Lacs
karnataka
On-site
You will be responsible for designing and implementing highly performant algorithms to process, transform, and analyze large volumes of data, applying advanced DSA concepts like Trees, Graphs, Tries, Heaps, and Hashing for data indexing, filtering, and routing. You will develop and optimize data pipelines, stream processors, and caching systems, and architect scalable systems for the ingestion, storage, and retrieval of structured and unstructured data. Collaboration with cross-functional teams to integrate and deploy performant services will be a key part of your role. You will also perform profiling, tuning, and memory optimization to ensure low-latency operations. Writing clean, modular, testable code and participating in code reviews are essential responsibilities.

To be successful in this role, you should have a strong command of core DSA concepts such as Binary Search, Heaps, Graphs, Tries, and Trees (AVL trees, B-Trees, Segment Trees), along with hands-on experience with algorithms for sorting, searching, indexing, and caching large datasets. Proficiency in one or more of Java or Python is necessary. You should also have experience working with large datasets in real time or batch, a solid grasp of time and space complexity and performance tuning, and familiarity with memory management, garbage collection, and data locality. Deep technical knowledge and hands-on experience in architecture design, development, deployment, and production operation are crucial, as is familiarity with agile software development and modern development tools and frameworks, together with strong engineering principles, including automation, quality, and best practices with a high bar. Extensive experience with the complete end-to-end software development life cycle, including production monitoring, will be beneficial.

It would be good to have a broad understanding of data lakehouse formats like Apache Hudi, Apache Iceberg, or Delta Lake. Demonstrable experience in Spark programming, and experience with Spark on dbt with AWS Glue or Apache Polaris, is a plus. A broad understanding of cloud architecture tools and services, such as S3, EMR, Kubernetes, and Lambda functions, is desirable, and experience with AWS and Azure is also highly desirable. Rich experience and deep expertise in Big Data and large-scale data platforms, especially data lakes, would be advantageous.
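One concrete instance of the heap-based indexing and caching techniques the posting lists is keeping the top-k items from an arbitrarily long stream without sorting everything. A minimal sketch (the input data is hypothetical):

```python
import heapq

def top_k(stream, k):
    """Return the k largest values from a stream, largest first.

    A min-heap of size k gives O(n log k) time and O(k) memory,
    versus O(n log n) time and O(n) memory for a full sort -- the
    difference matters when the stream does not fit in memory.
    """
    heap = []
    for value in stream:
        if len(heap) < k:
            heapq.heappush(heap, value)
        elif value > heap[0]:
            heapq.heapreplace(heap, value)  # evict current smallest
    return sorted(heap, reverse=True)

print(top_k([5, 1, 9, 3, 7, 8, 2], k=3))  # [9, 8, 7]
```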
Posted 1 week ago
6.0 - 10.0 years
0 Lacs
karnataka
On-site
Are you passionate about building multi-tenant, cloud-native platforms? Do you possess expertise in modern Java, microservices architecture, integration technologies, and frontend frameworks? Join our team at Guidewire, where we are working on the Integration Gateway, Guidewire's cloud-native integration platform. In this role, you will collaborate in a dynamic environment to develop features, enhancements, and bug fixes for the platform that enables Guidewire customers to connect their cloud systems.

What you would do:
- Design, develop, and operate a cloud-native integration platform and SaaS services
- Engage in hands-on coding for more than 90% of your time
- Take ownership of Continuous Integration (CI) and Continuous Deployment (CD) processes for your services
- Ensure the scalability, availability, and data security of your services
- Identify and resolve code defects
- Uphold secure coding practices and address application security vulnerabilities

What you would need to succeed:
- A minimum of 6+ years of relevant work experience
- Proficient programming skills in Java
- Familiarity with the Apache Camel integration framework (a bonus)
- A strong background in Java, Spring Boot, microservices, multithreading, and AWS (or any public cloud)
- Deep comprehension of algorithms, data structures, and performance optimization strategies
- Working knowledge of Kubernetes, AWS, and Docker
- Experience with AWS DynamoDB
- Additional experience with React.js, SQS, S3, and Kafka (advantageous)
- Understanding of distributed systems concepts and principles (e.g., consistency and availability, liveness and safety, durability, reliability, fault tolerance, consensus algorithms)
- Ability to thrive in an agile, fast-paced work environment
- A Bachelor's or Master's degree in Computer Science or equivalent

Guidewire is the trusted platform for P&C insurers seeking to effectively engage, innovate, and grow. Our platform, offered as a cloud service, integrates digital, core, analytics, and AI capabilities. Over 540 insurers in 40 countries, ranging from startups to the most intricate enterprises globally, rely on Guidewire. As a partner to our clients, we continuously evolve to facilitate their achievements. We take pride in our exceptional track record of over 1600 successful projects, backed by the industry's largest R&D team and partner network. Our Marketplace features numerous applications that expedite integration, localization, and innovation. For more details, visit www.guidewire.com and connect with us on Twitter: @Guidewire_PandC.
Posted 1 week ago
7.0 - 12.0 years
25 - 35 Lacs
ahmedabad
Remote
- Data Architecture Design: Develop and maintain a comprehensive data architecture strategy that aligns with the business objectives and technology landscape.
- Data Modeling: Create and manage logical, physical, and conceptual data models to support various business applications and analytics.
- Database Design: Design and implement database solutions, including data warehouses, data lakes, and operational databases.
- Data Integration: Oversee the integration of data from disparate sources into unified, accessible systems using ETL/ELT processes.
- Data Governance: Implement and enforce data governance policies and procedures to ensure data quality, consistency, and security.
- Technology Evaluation: Evaluate and recommend data management tools, technologies, and best practices to improve data infrastructure and processes.
- Collaboration: Work closely with data engineers, data scientists, business analysts, and other stakeholders to understand data requirements and deliver effective solutions.
- Documentation: Create and maintain documentation related to data architecture, data flows, data dictionaries, and system interfaces.
- Performance Tuning: Optimize database performance through tuning, indexing, and query optimization.
- Security: Ensure data security and privacy by implementing best practices for data encryption, access controls, and compliance with relevant regulations (e.g., GDPR, CCPA).

Requirements:
- Minimum 7 to 15 years of experience in data architecture or related roles.
- Helping project teams with solutions architecture, troubleshooting, and technical implementation assistance.
- Proficiency in SQL and database management systems (e.g., MySQL, PostgreSQL, Oracle, SQL Server).
- Experience with big data technologies (e.g., Hadoop, Spark, Kafka, Airflow).
- Expertise with cloud platforms (e.g., AWS, Azure, Google Cloud) and their data services.
- Knowledge of data integration tools (e.g., Informatica, Talend, Fivetran, Meltano).
- Understanding of data warehousing concepts and tools (e.g., Snowflake, Redshift, Synapse, BigQuery).
- Experience with data governance frameworks and tools.
Posted 1 week ago
10.0 - 17.0 years
27 - 35 Lacs
noida
Work from Office
Strong hands-on experience with .NET Core and C#. Design and implement scalable, robust, and secure solutions using .NET Core on AWS Cloud. Collaborate with product managers, developers, and DevOps teams to align architecture with business requirements.

Required candidate profile: Experience with AWS Cloud services such as EC2, Lambda, S3, RDS, API Gateway, and CloudFormation. Define and document architectural patterns, technical standards, and best practices. Experience with DevOps tools and CI/CD pipelines.
Posted 1 week ago
3.0 - 5.0 years
7 - 11 Lacs
bengaluru
Work from Office
Design and develop scalable web applications using JavaScript, Node.js, and React.js/Angular/Vue.js. Build robust backend APIs and microservices using Node.js/Express.js/Python. Develop and maintain cloud-native applications using AWS services like Lambda, API Gateway, S3, EC2, DynamoDB, RDS, CloudFormation, etc. Participate in code reviews and enforce best practices in JavaScript development and cloud architecture. Integrate front-end UI with backend services and ensure performance and responsiveness. Implement CI/CD pipelines using tools like AWS CodePipeline, GitLab CI/CD, Jenkins, etc. Monitor and troubleshoot production issues and ensure reliability and scalability. Collaborate with cross-functional teams including DevOps, QA, UI/UX, and Product Management.

Technical Skills:

Must have:
- Proficiency in JavaScript, Node.js, Express.js, and Python
- Hands-on experience in React.js, Angular, or Vue.js
- Solid understanding of RESTful APIs, JSON, JWT, and WebSockets
- Strong knowledge of AWS services: Lambda, API Gateway, DynamoDB, S3, CloudWatch, EC2, CloudFormation, IAM, etc.
- Experience with databases: SQL (PostgreSQL/MySQL) and NoSQL (DynamoDB/MongoDB)
- Familiarity with DevOps tools: CI/CD, Docker, Git, etc.

Good to have:
- AWS Developer or Architect certification
- Experience with serverless architecture
- Knowledge of Agile/Scrum methodologies
- Unit testing using Mocha, Jest, or Jasmine

Qualifications: Bachelor's/Master's degree in Computer Science, Engineering, or a related field. AWS Certified Developer / AWS Certified Solutions Architect - Associate (mandatory or strongly preferred).
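The Lambda-behind-API-Gateway work this posting describes follows a standard shape: API Gateway delivers the HTTP request as an `event` dict and expects a `statusCode`/`headers`/`body` dict back. A minimal Python handler sketch (the route semantics and `name` parameter are invented for illustration; the event shape follows the API Gateway proxy integration):

```python
import json

def lambda_handler(event, context):
    """Minimal AWS Lambda handler for an API Gateway proxy integration.

    `queryStringParameters` may be absent or None when the request has
    no query string, so the lookup is guarded.
    """
    name = (event.get("queryStringParameters") or {}).get("name", "world")
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({"message": f"hello, {name}"}),
    }

# Local invocation with a fake event; on AWS, API Gateway supplies this.
print(lambda_handler({"queryStringParameters": {"name": "dev"}}, None))
```

Testing the handler locally like this, with hand-built event dicts, is a common complement to the CI/CD pipelines the posting mentions.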
Posted 1 week ago