
867 Lambda Expressions Jobs - Page 29

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

2 - 6 years

8 - 12 Lacs

Bengaluru

Work from Office

NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking a Digital Engineering Sr. Staff Engineer to join our team in Bangalore, Karnataka (IN-KA), India (IN). Title: Lead Data Architect (Streaming)

Required Skills and Qualifications
- Overall 10+ years of IT experience, of which 7+ years in data architecture and engineering
- Strong expertise in AWS cloud services, particularly Lambda, SNS, S3, and EKS
- Strong experience with Confluent and Kafka
- Solid understanding of data streaming architectures and best practices
- Strong problem-solving skills and ability to think critically
- Excellent communication skills to convey complex technical concepts to both technical and non-technical stakeholders
- Knowledge of Apache Airflow for data orchestration
- Bachelor's degree in Computer Science, Engineering, or related field

Preferred Qualifications
- An understanding of cloud networking patterns and practices
- Experience working on a library or other long-term product
- Knowledge of the Flink ecosystem
- Experience with Terraform
- Deep experience with CI/CD pipelines
- Strong understanding of the JVM language family
- Understanding of GDPR and the correct handling of PII
- Expertise in technical interface design
- Use of Docker

Key Responsibilities
- Architect end-to-end data solutions using AWS services (including Lambda, SNS, S3, and EKS), Kafka and Confluent, all within a larger, overarching programme ecosystem
- Architect data processing applications using Python, Kafka, Confluent Cloud and AWS
- Develop data ingestion, processing, and storage solutions using Python, AWS Lambda, Confluent and Kafka
- Ensure data security and compliance throughout the architecture
- Collaborate with cross-functional teams to understand business requirements and translate them into technical solutions
- Optimize data flows for performance, cost-efficiency, and scalability
- Implement data governance and quality control measures
- Ensure delivery of CI, CD and IaC for NTT tooling, and as templates for downstream teams
- Provide technical leadership and mentorship to development teams and lead engineers
- Stay current with emerging technologies and industry trends
- Collaborate with data scientists and analysts to enable efficient data access and analysis
- Evaluate and recommend new technologies to improve data architecture

Position Overview: We are seeking a highly skilled and experienced Data Architect to join our dynamic team. The ideal candidate will have a strong background in designing and implementing data solutions using AWS infrastructure and a variety of core and supplementary technologies. This role requires a deep understanding of data architecture, cloud services, and the ability to drive innovative solutions to meet business needs.

About NTT DATA: NTT DATA is a $30 billion trusted global innovator of business and technology services. We serve 75% of the Fortune Global 100 and are committed to helping clients innovate, optimize and transform for long-term success. As a Global Top Employer, we have diverse experts in more than 50 countries and a robust partner ecosystem of established and start-up companies. Our services include business and technology consulting, data and artificial intelligence, industry solutions, as well as the development, implementation and management of applications, infrastructure and connectivity. We are one of the leading providers of digital and AI infrastructure in the world. NTT DATA is a part of NTT Group, which invests over $3.6 billion each year in R&D to help organizations and society move confidently and sustainably into the digital future. Visit us at us.nttdata.com. NTT DATA endeavors to make https://us.nttdata.com accessible to any and all users. If you would like to contact us regarding the accessibility of our website or need assistance completing the application process, please contact us at https://us.nttdata.com/en/contact-us. This contact information is for accommodation requests only and cannot be used to inquire about the status of applications. NTT DATA is an equal opportunity employer. Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or protected veteran status. For our EEO Policy Statement, please click here. If you'd like more information on your EEO rights under the law, please click here. For Pay Transparency information, please click here. Job Segment: Developer, Computer Science, Consulting, Technology
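For a concrete feel of the streaming pattern this listing describes, here is a minimal, illustrative Python sketch: a Confluent Kafka consumer that lands raw event batches in S3. The broker, topic, and bucket names are placeholders, not details from the posting.

```python
# Illustrative sketch only: consume Kafka events and land raw batches in S3.
import json

import boto3
from confluent_kafka import Consumer

consumer = Consumer({
    "bootstrap.servers": "broker.example.com:9092",  # placeholder broker
    "group.id": "raw-landing",
    "auto.offset.reset": "earliest",
})
consumer.subscribe(["payments.events"])  # placeholder topic

s3 = boto3.client("s3")
batch = []
try:
    while True:
        msg = consumer.poll(1.0)
        if msg is None:
            continue
        if msg.error():
            continue  # production code would log and route to a dead-letter topic
        batch.append(json.loads(msg.value()))
        if len(batch) >= 500:  # flush in modest batches to keep objects small
            s3.put_object(
                Bucket="raw-landing-bucket",  # placeholder bucket
                Key=f"payments/offset-{msg.offset()}.json",
                Body=json.dumps(batch),
            )
            batch.clear()
finally:
    consumer.close()
```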

Posted 3 months ago

Apply

2 - 6 years

8 - 12 Lacs

Bengaluru

Work from Office

NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking a Digital Engineering Sr. Staff Engineer to join our team in Bangalore, Karnataka (IN-KA), India (IN). Title: Lead Data Architect (Warehousing)

Required Skills and Qualifications
- Overall 10+ years of IT experience, of which 7+ years in data architecture and engineering
- Strong expertise in AWS cloud services, particularly Lambda, SNS, S3, and EKS
- Proficiency in Python
- Solid understanding of data warehousing architectures and best practices
- Strong Snowflake skills
- Strong data warehouse skills
- Strong problem-solving skills and ability to think critically
- Excellent communication skills to convey complex technical concepts to both technical and non-technical stakeholders
- Experience of data cataloguing
- Knowledge of Apache Airflow for data orchestration
- Experience modelling, transforming and testing data in DBT
- Bachelor's degree in Computer Science, Engineering, or related field

Preferred Qualifications
- Familiarity with Atlan for data catalog and metadata management
- Experience integrating with IBM MQ
- Familiarity with SonarQube for code quality analysis
- AWS certifications (e.g., AWS Certified Solutions Architect)
- Experience with data modeling and database design
- Knowledge of data privacy regulations and compliance requirements
- An understanding of Lakehouses
- An understanding of Apache Iceberg tables
- SnowPro Core certification

Key Responsibilities
- Architect end-to-end data solutions using AWS services (including Lambda, SNS, S3, and EKS) as well as Snowflake, DBT and Apache Airflow, all within a larger, overarching programme ecosystem
- Develop data ingestion, processing, and storage solutions using Python, AWS Lambda and Snowflake
- Architect data processing applications using Python
- Ensure data security and compliance throughout the architecture
- Collaborate with cross-functional teams to understand business requirements and translate them into technical solutions
- Optimize data flows for performance, cost-efficiency, and scalability
- Implement data governance and quality control measures
- Ensure delivery of CI, CD and IaC for NTT tooling, and as templates for downstream teams
- Provide technical leadership and mentorship to development teams and lead engineers
- Stay current with emerging technologies and industry trends
- Ensure data security and implement best practices using tools like Snyk
- Collaborate with data scientists and analysts to enable efficient data access and analysis
- Evaluate and recommend new technologies to improve data architecture

Position Overview: We are seeking a highly skilled and experienced Data Architect to join our dynamic team. The ideal candidate will have a strong background in designing and implementing data solutions using AWS infrastructure and a variety of core and supplementary technologies. This role requires a deep understanding of data architecture, cloud services, and the ability to drive innovative solutions to meet business needs.

About NTT DATA: NTT DATA is a $30 billion trusted global innovator of business and technology services. We serve 75% of the Fortune Global 100 and are committed to helping clients innovate, optimize and transform for long-term success. As a Global Top Employer, we have diverse experts in more than 50 countries and a robust partner ecosystem of established and start-up companies. Our services include business and technology consulting, data and artificial intelligence, industry solutions, as well as the development, implementation and management of applications, infrastructure and connectivity. We are one of the leading providers of digital and AI infrastructure in the world. NTT DATA is a part of NTT Group, which invests over $3.6 billion each year in R&D to help organizations and society move confidently and sustainably into the digital future. Visit us at us.nttdata.com. NTT DATA endeavors to make https://us.nttdata.com accessible to any and all users. If you would like to contact us regarding the accessibility of our website or need assistance completing the application process, please contact us at https://us.nttdata.com/en/contact-us. This contact information is for accommodation requests only and cannot be used to inquire about the status of applications. NTT DATA is an equal opportunity employer. Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or protected veteran status. For our EEO Policy Statement, please click here. If you'd like more information on your EEO rights under the law, please click here. For Pay Transparency information, please click here. Job Segment: Developer, Solution Architect, Data Warehouse, Computer Science, Database, Technology
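As a rough illustration of the orchestration layer this listing names (Apache Airflow driving DBT builds into Snowflake), a minimal DAG sketch follows. The DAG ID, schedule, and profiles directory are assumptions for the example, not details from the posting.

```python
# Minimal Airflow DAG sketch: build DBT models, then run DBT tests, daily.
# Assumes Airflow 2.x and a dbt CLI available on the worker.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="daily_warehouse_build",  # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command="dbt run --profiles-dir /opt/dbt",  # assumed path
    )
    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command="dbt test --profiles-dir /opt/dbt",
    )
    dbt_run >> dbt_test  # only test once the models have built cleanly
```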

Posted 3 months ago

Apply

2 - 6 years

8 - 12 Lacs

Bengaluru

Work from Office

Req ID: 306668. NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking a Digital Engineering Sr. Staff Engineer to join our team in Bangalore, Karnataka (IN-KA), India (IN). Job Title: Lead Data Engineer (Warehouse)

Required Skills and Qualifications
- 7+ years of experience in data engineering, of which at least 3+ years leading or managing a data engineering team of 5+
- Experience with AWS cloud services
- Expertise with Python and SQL
- Experience using Git/GitHub for source control management
- Experience with Snowflake
- Strong understanding of lakehouse architectures and best practices
- Strong problem-solving skills and ability to think critically
- Excellent communication skills to convey complex technical concepts to both technical and non-technical stakeholders
- Strong use of version control and proven ability to govern a team in the best-practice use of version control
- Strong understanding of Agile and proven ability to govern a team in the best-practice use of Agile methodologies

Preferred Skills and Qualifications
- An understanding of Lakehouses
- An understanding of Apache Iceberg tables
- An understanding of data cataloguing
- Knowledge of Apache Airflow for data orchestration
- An understanding of DBT
- SnowPro Core certification
- Bachelor's degree in Computer Science, Engineering, or related field

Key Responsibilities
Lead and direct a small team of engineers engaged in:
- Engineering end-to-end data solutions using AWS services, including Lambda, S3, Snowflake, DBT, and Apache Airflow
- Cataloguing data
- Collaborating with cross-functional teams to understand business requirements and translate them into technical solutions
- Providing best-in-class documentation for downstream teams to develop, test and run data products built using our tools
- Testing our tooling, and providing a framework for downstream teams to test their utilisation of our products
- Helping to deliver CI, CD and IaC for both our own tooling, and as templates for downstream teams
- Using DBT projects to define re-usable pipelines

Position Overview: We are seeking a highly skilled and experienced Lead Data Engineer to join our dynamic team. The ideal candidate will have a strong background in implementing data solutions using AWS infrastructure and a variety of core and supplementary technologies, leading teams and directing engineering workloads. This role requires a deep understanding of data engineering, cloud services, and the ability to implement high-quality solutions.

About NTT DATA: NTT DATA is a $30 billion trusted global innovator of business and technology services. We serve 75% of the Fortune Global 100 and are committed to helping clients innovate, optimize and transform for long-term success. As a Global Top Employer, we have diverse experts in more than 50 countries and a robust partner ecosystem of established and start-up companies. Our services include business and technology consulting, data and artificial intelligence, industry solutions, as well as the development, implementation and management of applications, infrastructure and connectivity. We are one of the leading providers of digital and AI infrastructure in the world. NTT DATA is a part of NTT Group, which invests over $3.6 billion each year in R&D to help organizations and society move confidently and sustainably into the digital future. Visit us at us.nttdata.com. NTT DATA endeavors to make https://us.nttdata.com accessible to any and all users. If you would like to contact us regarding the accessibility of our website or need assistance completing the application process, please contact us at https://us.nttdata.com/en/contact-us. This contact information is for accommodation requests only and cannot be used to inquire about the status of applications. NTT DATA is an equal opportunity employer. Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or protected veteran status. For our EEO Policy Statement, please click here. If you'd like more information on your EEO rights under the law, please click here. For Pay Transparency information, please click here. Job Segment: Computer Science, Database, SQL, Consulting, Technology
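For a flavour of the Lambda-and-S3 glue code such a team owns, here is a small, hypothetical handler: it receives S3 object-created events and fans them out to an SNS topic for downstream loaders. The topic ARN and event wiring are assumptions for the sketch.

```python
# Illustrative AWS Lambda handler: publish new-file notifications to SNS.
import json
import urllib.parse

import boto3

sns = boto3.client("sns")
TOPIC_ARN = "arn:aws:sns:ap-south-1:123456789012:new-raw-files"  # placeholder

def handler(event, context):
    """Fan S3 object-created events out to SNS for downstream loaders."""
    records = event.get("Records", [])
    for record in records:
        bucket = record["s3"]["bucket"]["name"]
        # S3 URL-encodes keys in event payloads, so decode before use
        key = urllib.parse.unquote_plus(record["s3"]["object"]["key"])
        sns.publish(TopicArn=TOPIC_ARN,
                    Message=json.dumps({"bucket": bucket, "key": key}))
    return {"processed": len(records)}
```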

Posted 3 months ago

Apply

4 - 9 years

16 - 20 Lacs

Bengaluru

Work from Office

Req ID: 301930. NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking a Digital Solution Architect Lead Advisor to join our team in Bangalore, Karnataka (IN-KA), India (IN). Title: Data Solution Architect

Position Overview: We are seeking a highly skilled and experienced Data Solution Architect to join our dynamic team. The ideal candidate will have a strong background in designing and implementing data solutions using AWS infrastructure and a variety of core and supplementary technologies. This role requires a deep understanding of data architecture, cloud services, and the ability to drive innovative solutions to meet business needs.

Required Skills and Qualifications
- Bachelor's degree in Computer Science, Engineering, or related field
- 7+ years of experience in data architecture and engineering
- Strong expertise in AWS cloud services, particularly Lambda, SNS, S3, and EKS
- Proficiency in Kafka/Confluent Kafka and Python
- Experience with Snyk for security scanning and vulnerability management
- Solid understanding of data streaming architectures and best practices
- Strong problem-solving skills and ability to think critically
- Excellent communication skills to convey complex technical concepts to both technical and non-technical stakeholders

Preferred Qualifications
- Experience with Kafka Connect and Confluent Schema Registry
- Familiarity with Atlan for data catalog and metadata management
- Knowledge of Apache Flink for stream processing
- Experience integrating with IBM MQ
- Familiarity with SonarQube for code quality analysis
- AWS certifications (e.g., AWS Certified Solutions Architect)
- Experience with data modeling and database design
- Knowledge of data privacy regulations and compliance requirements

Key Responsibilities
- Design and implement scalable data architectures using AWS services and Kafka
- Develop data ingestion, processing, and storage solutions using Python and AWS Lambda
- Ensure data security and implement best practices using tools like Snyk
- Optimize data pipelines for performance and cost-efficiency
- Collaborate with data scientists and analysts to enable efficient data access and analysis
- Implement data governance policies and procedures
- Provide technical guidance and mentorship to junior team members
- Evaluate and recommend new technologies to improve data architecture
- Architect end-to-end data solutions using AWS services, including Lambda, SNS, S3, and EKS
- Design and implement data streaming pipelines using Kafka/Confluent Kafka
- Develop data processing applications using Python
- Ensure data security and compliance throughout the architecture
- Collaborate with cross-functional teams to understand business requirements and translate them into technical solutions
- Optimize data flows for performance, cost-efficiency, and scalability
- Implement data governance and quality control measures
- Provide technical leadership and mentorship to development teams
- Stay current with emerging technologies and industry trends

About NTT DATA: NTT DATA is a $30 billion trusted global innovator of business and technology services. We serve 75% of the Fortune Global 100 and are committed to helping clients innovate, optimize and transform for long-term success. As a Global Top Employer, we have diverse experts in more than 50 countries and a robust partner ecosystem of established and start-up companies. Our services include business and technology consulting, data and artificial intelligence, industry solutions, as well as the development, implementation and management of applications, infrastructure and connectivity. We are one of the leading providers of digital and AI infrastructure in the world. NTT DATA is a part of NTT Group, which invests over $3.6 billion each year in R&D to help organizations and society move confidently and sustainably into the digital future. Visit us at us.nttdata.com. NTT DATA is an equal opportunity employer and considers all applicants without regard to race, color, religion, citizenship, national origin, ancestry, age, sex, sexual orientation, gender identity, genetic information, physical or mental disability, veteran or marital status, or any other characteristic protected by law. We are committed to creating a diverse and inclusive environment for all employees. If you need assistance or an accommodation due to a disability, please inform your recruiter so that we may connect you with the appropriate team. Job Segment: Solution Architect, Consulting, Database, Computer Science, Technology
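Complementing the consumer-side sketch earlier on this page, here is the producer side of the streaming pipelines this listing mentions, again as a hypothetical Python sketch with placeholder broker, topic, and payload.

```python
# Illustrative sketch only: publish an event to a Kafka/Confluent topic.
import json

from confluent_kafka import Producer

producer = Producer({"bootstrap.servers": "broker.example.com:9092"})  # placeholder

def on_delivery(err, msg):
    # Production code would log failures and retry or dead-letter the event
    if err is not None:
        print(f"delivery failed: {err}")

event = {"order_id": "o-123", "amount": 42.5}  # invented payload
producer.produce(
    "orders.events",  # placeholder topic
    key=event["order_id"],
    value=json.dumps(event),
    on_delivery=on_delivery,
)
producer.flush()  # block until outstanding messages are delivered
```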

Posted 3 months ago

Apply

4 - 9 years

18 - 30 Lacs

Hyderabad, India

Hybrid

Department: Software Engineering. Employment Type: Full Time. Location: India. Reporting To: Manoj Puranik.

Description: At Vitech, we believe in the power of technology to simplify complex business processes. Our mission is to bring better software solutions to market, addressing the intricacies of the insurance and retirement industries. We combine deep domain expertise with the latest technological advancements to deliver innovative, user-centric solutions that future-proof and empower our clients to thrive in an ever-changing landscape. With over 1,600 talented professionals on our team, our innovative solutions are recognized by industry leaders like Gartner, Celent, Aite-Novarica, and ISG. We offer a competitive compensation package along with comprehensive benefits that support your health, well-being, and financial security.

Location: Hyderabad (hybrid role). Role: Full-Stack Java Developer

Are you a Java Developer with 4-7+ years of experience eager to elevate your career? At Vitech, we're looking for a talented professional with a solid background in Core Java who's ready to make a significant impact. As a Full-Stack Developer at Vitech, you'll dive deep into backend development while also contributing to frontend work with ReactJS/GWT. Our small, agile pods allow you to spend up to 40% of your time on innovation and writing new software, pushing our products forward.

What you will do:
- Lead and contribute to the full software development lifecycle, from design and coding to testing, deployment, and support
- Apply advanced Core Java concepts such as inheritance, interfaces, and abstract classes to solve complex business challenges
- Develop and maintain applications across the full stack, with a strong focus on backend development in Java and frontend work using ReactJS or GWT
- Collaborate with a cross-functional, high-performing team to deliver scalable, customer-centric solutions
- Drive innovation by designing and building software that fuels product enhancements and supports business growth

What We're Looking For:
- Advanced Core Java skills with deep expertise in object-oriented programming concepts like inheritance, interfaces, abstract/concrete classes, and control structures, and the ability to apply these principles to solve complex, business-driven challenges
- Proficient SQL knowledge with the ability to write and optimize complex queries in relational databases
- Hands-on experience with Spring Boot, Spring MVC, and Hibernate for backend development
- Familiarity with REST APIs and microservices architecture
- Frontend development experience using ReactJS, Angular, or GWT, with the ability to build responsive, user-friendly interfaces and integrate them in a full-stack environment
- Experience with AWS services such as EC2, S3, RDS, Lambda, API Gateway, CloudWatch, and IAM is a plus
- Strong analytical and problem-solving skills
- Experience in technical leadership or mentoring is preferred
- Excellent communication and collaboration skills
- A commitment to clean, maintainable code and a passion for continuous learning

Join Us at Vitech!
Career Development: At Vitech, we're committed to your growth. You'll have ample opportunities to deepen your expertise in both Java and ReactJS, advancing your career in a supportive environment.
Innovative Environment: Work with cutting-edge technologies in an Agile setting where your ideas and creativity are welcomed and encouraged.
Impactful Work: Your contributions will be crucial in shaping our products and delivering exceptional solutions to our global clients. At Vitech, you're not just maintaining software but creating it.

At Vitech, you'll be part of a forward-thinking team that values collaboration, innovation, and continuous improvement. We provide a supportive and inclusive environment where you can grow as a leader while helping shape the future of our organization.

Posted 3 months ago

Apply

3 - 5 years

14 - 18 Lacs

Pune

Work from Office

In this role, you will maintain and enhance frontend services, working with AngularJS to support existing user-facing features while contributing to modernization efforts using React. Collaboration is key, as you'll work closely with designers, backend developers, and product managers to integrate systems involving XHTML/JSF-based UIs and Java backend frameworks like Spring and JBoss Seam. You'll play a pivotal role in ensuring the technical feasibility of UI/UX designs while optimizing applications for performance, scalability, and accessibility. A strong emphasis on code quality is essential, requiring you to write clean, maintainable, and well-documented code that adheres to industry best practices. Additionally, you'll stay informed about emerging technologies and industry trends, continuously learning and applying this knowledge to improve the system and processes.

Required Experience & Skills
- 5 to 8 years' experience in a relevant software development role
- Frontend Frameworks: Proficiency in AngularJS; familiarity with React is a plus
- Web Development: Expertise in HTML5, CSS3, JavaScript, and TypeScript
- Backend Integration: Experience with Node.js, Express.js, and API-driven development
- Build Tools and Testing: Familiarity with task automation tools like Grunt and unit testing frameworks like Karma
- Containerization: Understanding of Docker for application deployment
- AWS Services: Working knowledge of AWS services such as S3, Lambda, and CloudFront
- Soft Skills: Strong communication, problem-solving skills, and a collaborative mindset

Preferred Skills
- Agile Practices: Familiarity with agile development methodologies
- Testing Frameworks: Proficiency with Jest or Mocha for testing
- Advanced AWS Services: Familiarity with CloudWatch, DynamoDB, API Gateway, AppSync, Route 53, CloudTrail, WAF, and X-Ray

Posted 3 months ago

Apply

3 - 6 years

20 - 25 Lacs

Hyderabad

Work from Office

Overview
Job Title: Senior DevOps Engineer. Location: Bangalore / Hyderabad / Chennai / Coimbatore. Position: Full-time. Department: Annalect Engineering.

Position Overview
Annalect is currently seeking a Senior DevOps Engineer to join our technology team remotely. We are passionate about building distributed back-end systems in a modular and reusable way. We're looking for people who have a shared passion for data and a desire to build cool, maintainable and high-quality applications to use this data. In this role you will participate in shaping our technical architecture and the design and development of software products, collaborate with back-end developers from other tracks, and research and evaluate new technical solutions.

Responsibilities
- Build and maintain cloud infrastructure through Terraform IaC
- Cloud networking and orchestration with AWS (EKS, ECS, VPC, S3, ALB, NLB)
- Improve and automate processes and procedures
- Construct CI/CD pipelines
- Monitor and handle incident response for the infrastructure, platforms, and core engineering services
- Troubleshoot infrastructure, network, and application issues
- Help identify and troubleshoot problems within the environment

Required Skills
- 5+ years of DevOps experience
- 5+ years of hands-on experience administering cloud technologies on AWS, especially IAM, VPC, Lambda, EKS, EC2, S3, ECS, CloudFront, ALB, API Gateway, RDS, CodeBuild, SSM, Secrets Manager, etc.
- Experience with microservices, containers (Docker), container orchestration (Kubernetes), serverless computing (AWS Lambda) and distributed/scalable systems
- Demonstrable experience using Terraform to provision and configure infrastructure, and knowledge of writing Infrastructure as Code (IaC) with it
- Scripting ability: PowerShell, Python, Bash, etc.
- Comfortable working with Linux/Unix-based operating systems (Ubuntu preferred)
- Familiarity with software development, CI/CD and DevOps tools (Bitbucket, Jenkins, GitLab, CodeBuild, CodePipeline)
- Possesses a problem-solving attitude
- Creative, self-motivated, a quick study, and willing to develop new skills

Additional Skills
- Familiarity with working with data and databases (SQL, MySQL, PostgreSQL, Amazon Aurora, Redis, Amazon Redshift, Google BigQuery)
- Knowledge of database administration
- Experience with continuous deployment/continuous delivery (Jenkins, Bamboo)
- AWS/GCP/Azure certification is a plus
- Experience in Python coding is welcome
- Passion for data-driven software; all of our tools are built on top of data and require work with data
- Knowledge of IaaS/PaaS architecture with a good understanding of infrastructure and web application security
- Experience with logging/monitoring (CloudWatch, Datadog, Loggly, ELK)
- Passion for writing good documentation and creating architecture diagrams
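The posting asks for Terraform; since the sketches on this page use Python, the same infrastructure-as-code idea is shown below with AWS CDK, a different IaC tool than the one named. Resource names are placeholders.

```python
# Minimal AWS CDK (v2) sketch: declare a bucket and a VPC as code.
import aws_cdk as cdk
from aws_cdk import aws_ec2 as ec2, aws_s3 as s3

class CoreInfraStack(cdk.Stack):
    def __init__(self, scope, construct_id, **kwargs):
        super().__init__(scope, construct_id, **kwargs)
        # Versioned bucket for pipeline artifacts (name is a placeholder)
        s3.Bucket(self, "ArtifactBucket", versioned=True,
                  removal_policy=cdk.RemovalPolicy.RETAIN)
        # Two-AZ VPC with CDK's default public/private subnet layout
        ec2.Vpc(self, "CoreVpc", max_azs=2)

app = cdk.App()
CoreInfraStack(app, "core-infra")
app.synth()  # `cdk deploy` applies the synthesized CloudFormation template
```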

Posted 3 months ago

Apply

4 - 6 years

25 - 27 Lacs

Bengaluru

Work from Office

Overview
Annalect is currently seeking a senior developer to join our technology team. In this role, you will contribute to the design and development of intuitive front-end applications and distributed backend microservices. We are passionate about modular, reusable software architecture, and we are looking for people who share that passion for developing and building cool, reusable user interfaces and services. In this role you will contribute to the technical architecture of the product as well as research and evaluate new technical solutions, while coordinating between interdisciplinary teams to help shape the perfect solution for us and our agencies.

Responsibilities
- Development and unit testing of the web application, including front-end (SPA) and back-end (microservices); maintenance and support of the same
- Assist Project Managers and Technical Leads in the planning of projects (e.g., provision of estimates, risk analysis, requirements analysis, technical options)
- Involvement in the full life cycle of projects (including requirement analysis, system design, development, and support if required)
- Support and work collaboratively with teams across areas of design, development, quality assurance and operations
- Commit your knowledge and experience to team success; be a knowledge keeper for the product, its architecture, design, and implementation details
- Provide overall mentorship, coaching and on-demand training to improve and unify development style

Qualifications
- 5-7 years in application development
- Understanding of OOP/OOD/DDD and design patterns
- .NET full stack: Angular 11+, .NET Core 6+, Web API
- TypeScript, Jest, NodeJs, OAuth2
- Database experience (SQL Server) and ORM technologies (LINQ, EF or similar)
- NoSQL: MongoDB
- AWS experience: Lambda, Step Functions, S3, ECS, RDS, SQS, Elasticsearch
- Performance optimization
- Security design and implementation
- CI/CD practices
- Docker is good to have

Posted 3 months ago

Apply

2 - 5 years

14 - 17 Lacs

Hyderabad

Work from Office

As a Big Data Engineer, you will develop, maintain, evaluate, and test big data solutions. You will be involved in data engineering activities like creating pipelines/workflows from source to target and implementing solutions that tackle the client's needs.

Your primary responsibilities include:
- Design, build, optimize and support new and existing data models and ETL processes based on our clients' business requirements
- Build, deploy and manage data infrastructure that can adequately handle the needs of a rapidly growing data-driven organization
- Coordinate data access and security to enable data scientists and analysts to easily access data whenever they need to

Required education: Bachelor's Degree. Preferred education: Master's Degree.

Required technical and professional expertise
- Must have 5+ years of experience in Big Data: Hadoop, Spark, Scala, Python, HBase, Hive
- Good to have AWS: S3, Athena, DynamoDB, Lambda, Jenkins, Git
- Developed Python and PySpark programs for data analysis
- Good working experience with Python to develop a custom framework for generating rules (much like a rules engine)
- Developed Python code to gather data from HBase and designed solutions implemented using PySpark
- Used Apache Spark DataFrames/RDDs to apply business transformations and utilized Hive Context objects to perform read/write operations

Preferred technical and professional experience
- Understanding of DevOps
- Experience in building scalable end-to-end data ingestion and processing solutions
- Experience with object-oriented and/or functional programming languages, such as Python, Java and Scala
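As a small illustration of the PySpark, rules-engine-style work the posting describes, here is a sketch that applies config-driven rules as DataFrame transformations. The paths and the rule itself are invented for the example.

```python
# Illustrative PySpark sketch: apply config-driven rules to a DataFrame.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("rules-example").getOrCreate()

# Read raw events (path and schema are placeholders)
df = spark.read.parquet("s3a://raw-bucket/events/")

# Each "rule" is a named boolean predicate, the shape a rules engine might emit
rules = [("high_value", F.col("amount") > 10000)]
for name, predicate in rules:
    df = df.withColumn(name, predicate)

df.write.mode("overwrite").parquet("s3a://curated-bucket/events/")
```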

Posted 3 months ago

Apply

10 - 15 years

4 - 8 Lacs

Navi Mumbai

Work from Office

Role Overview: The Tech Architect, AWS AI (Anthropic) will be responsible for translating AI solution requirements into scalable and secure AWS-native architectures. This role combines architectural leadership with hands-on technical depth in GenAI model integration, data pipelines, and deployment using Amazon Bedrock and Claude models. The ideal candidate will bridge the gap between strategic AI vision and engineering execution while ensuring alignment with enterprise cloud and security standards.

Key Responsibilities:
- Design robust, scalable architectures for GenAI use cases using Amazon Bedrock and Anthropic Claude
- Lead architectural decisions involving model orchestration, prompt optimization, RAG pipelines, and API integration
- Define best practices for implementing AI workflows using SageMaker, Lambda, API Gateway, and Step Functions
- Review and validate implementation approaches with tech leads and developers; ensure alignment with architecture blueprints
- Contribute to client proposals, solution pitch decks, and technical sections of RFP/RFI responses
- Ensure AI solutions meet enterprise requirements for security, privacy, compliance, and performance
- Collaborate with cloud infrastructure, data engineering, and DevOps teams to ensure seamless deployment and monitoring
- Stay updated on AWS Bedrock advancements, Claude model improvements, and best practices for GenAI governance

Required Skills and Competencies:
- Deep hands-on experience with Amazon Bedrock, Claude (Anthropic), Amazon Titan, and embedding-based workflows
- Proficient in Python and cloud-native API development; experienced with JSON, RESTful integrations, and serverless orchestration
- Strong understanding of SageMaker (model training, tuning, pipelines), real-time inference, and deployment strategies
- Knowledge of RAG architectures, vector search (e.g., OpenSearch, Pinecone), and prompt engineering techniques
- Expertise in IAM, encryption, access control, and responsible AI principles for secure AI deployments
- Ability to create and communicate high-quality architectural diagrams and technical documentation

Desirable Qualifications:
- AWS Certified Machine Learning Specialty and/or AWS Certified Solutions Architect Professional
- Familiarity with LangChain, Haystack, or Semantic Kernel in an AWS context
- Experience with enterprise-grade GenAI use cases such as intelligent search, document summarization, conversational AI, and code copilots
- Exposure to integrating third-party model APIs and services available via AWS Marketplace

Soft Skills:
- Strong articulation and technical storytelling capabilities for client and executive conversations
- Proven leadership in cross-functional project environments with globally distributed teams
- Analytical mindset with a focus on delivering reliable, maintainable, and performant AI solutions
- Self-driven, curious, and continuously exploring innovations in GenAI and AWS services
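To ground the Bedrock/Claude integration this role centres on, a minimal boto3 sketch follows. It assumes a recent boto3 with the Converse API; the model ID and prompt are illustrative, and the exact Claude identifiers enabled in an account should be checked in the Bedrock console.

```python
# Illustrative sketch: one-shot call to a Claude model on Amazon Bedrock.
import boto3

bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")

resp = bedrock.converse(
    modelId="anthropic.claude-3-sonnet-20240229-v1:0",  # check availability
    messages=[{
        "role": "user",
        "content": [{"text": "Summarize the attached filing in three bullets."}],
    }],
)
print(resp["output"]["message"]["content"][0]["text"])
```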

Posted Date not available

Apply

5 - 8 years

3 - 7 Lacs

Bengaluru

Work from Office

Scope and Responsibilities: As a Senior Engineer with a focus on MFT Axway administration and development, you will:
- Set up file transfer configurations for customers on the Axway Managed File Transfer platform
- Take ownership of platform maintenance activities, which include, but are not limited to, OS and application patching and operational automation
- Diagnose and troubleshoot file transfer and platform issues
- Analyze and report platform-related issues to the respective product vendor
- Be flexible to provide regular on-call technical support (US/Pacific hours, weekdays and weekends)
- Build automations deemed necessary for improving security, reliability, maintainability, availability and performance of the MFT platform
- Participate in building self-service capabilities for customers

Required Qualifications
- Bachelor's degree in Computer Science or a related field
- 5-8 years of additional relevant professional experience
- Detailed understanding of, and experience with, the Axway Secure Transport product
- Good understanding of AWS technologies including, but not limited to, EC2, Lambda, S3, RDS, and Oracle on EC2
- Hands-on experience with Terraform for managing AWS resources
- Experience with CI/CD tools like Jenkins and Rundeck
- Strong Shell and/or Python development experience, including expertise in REST/JSON APIs
- Proficient understanding of code versioning using Git (GitHub)
- Good hands-on experience in Linux and database administration
- Strong debugging skills
- Communicating with vendors and customers in a courteous and professional manner
- Great communication skills

Desired Qualifications
- Experience participating in projects in a highly collaborative, multi-discipline development team environment
- Exposure to Agile, ideally a strong background in the SAFe methodology
- Skills with any monitoring or observability tool would be a value add
- Basic understanding of batch scheduling tools like AutoSys
- Java programming

Mandatory Skills: Axway products. Experience: 5-8 years.

Posted Date not available

Apply

7 - 12 years

19 - 22 Lacs

Bengaluru

Work from Office

Bengaluru, India. Technical Support (SL2), BCM Industry. 14/05/2025.

Project description: Luxoft is one of the leading service providers for Banking and Capital Markets customers. Luxoft has been engaged by a large Australian bank to provide L1/L2 application monitoring and production support services for business-critical applications and interfaces, on a 24/5, managed-outcome basis, in the Global Markets business area. We are looking for motivated individuals who have the relevant skills and experience and are willing to work in shifts.

Responsibilities
- Develop and maintain Unix shell scripts for automation tasks
- Write and optimize Python scripts for process automation and data handling
- Design, implement, and maintain scalable cloud infrastructure using AWS services (EC2, S3, Lambda, etc.)
- Monitor and troubleshoot cloud environments for optimal performance
- Monitor and optimize system resources; automate routine administrative and BAU tasks
- Production environment monitoring and issue resolution
- Control SLAs and notify management or the client in case of unexpected behavior
- Support end-to-end data flows and health and sanity checks of the systems and applications
- Escalate issues with environment and application health (internally to the group lead/PM)
- Review logs and perform data discovery in database tables to investigate workflow failures
- Investigate and supply analysis to fix application/configuration issues in the production environment
- Contact and chase responsible support/upstream/downstream/cross teams and ask for root cause analysis on issues preventing end-to-end flows from working as designed
- Provide regular updates on issue status until addressed, notifying the client of status changes and the expected time to resolution
- Participate in ad-hoc/regular status calls on application health with the client to discuss critical defects and health check status
- Work on business users' service requests, which includes investigation of business logic and application behavior
- Work with different data format transformation processes (XML, pipeline)
- Work with source control tools (Git/SVN) to investigate configuration or data-transformation-related issues
- Work with middleware and schedulers on data flow and batch process control
- Focus on continuous proactive service improvement and continuous learning
- Ensure customer service excellence and guaranteed response within SLA timelines by actively monitoring support emails/tickets and working on them until the issue is fully remediated
- Ensure all incident tickets are resolved in a timely and comprehensive manner
- Track and identify frequently occurring, high-impact support issues as candidates for permanent resolution
- Bachelor's degree from a reputed university with good passing scores

Skills
Must have
- 7 to 12 years as an L2/L3 production support engineer and Site Reliability Engineer, with strong knowledge of Unix shell scripting
- Develop and maintain Unix shell scripts for automation tasks
- Write and optimize Python or shell scripts for process automation and data handling; good knowledge of any scripting language is acceptable
- Basic knowledge of AWS services (EC2, S3, etc.)
- Monitor and optimize system resources and automate routine administrative and BAU tasks
- Good understanding of the Incident/Change/Problem Management process

Required Skills
- Strong experience with Unix shell scripting
- Proficiency in Python scripting for automation
- Proficiency in any scripting language, with hands-on experience in automation
- Strong knowledge of databases
- Basic understanding of AWS services and cloud; basic knowledge of and experience supporting cloud applications
- Ability to troubleshoot and resolve technical issues in a production environment

Nice to have (optional)
- Experience with containers (Docker, Kubernetes)
- Familiarity with CI/CD pipelines and version control systems (e.g., Git)
- Knowledge of Infrastructure-as-Code tools like Terraform
- Strong problem-solving and communication skills

Languages: Hindi B1 Intermediate, English C1 Advanced. Seniority: Senior.
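As a taste of the Python automation this support role calls for, here is a small, hypothetical sketch that scans an application log for failed workflow runs and prints a summary. The log format and regex are invented for the example.

```python
#!/usr/bin/env python3
"""Hypothetical BAU automation: count workflow failures in a log file."""
import re
import sys
from collections import Counter

FAILURE = re.compile(r"ERROR\s+workflow=(\S+)")  # invented log format

def summarize(path: str) -> Counter:
    counts = Counter()
    with open(path, encoding="utf-8", errors="replace") as fh:
        for line in fh:
            m = FAILURE.search(line)
            if m:
                counts[m.group(1)] += 1
    return counts

if __name__ == "__main__":
    for workflow, n in summarize(sys.argv[1]).most_common():
        print(f"{workflow}: {n} failures")
```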

Posted Date not available

Apply

4 - 9 years

6 - 15 Lacs

Bengaluru

Work from Office

As an Application Developer, you will lead IBM into the future by translating system requirements into the design and development of customized systems in an agile environment. The success of IBM is in your hands as you transform vital business needs into code and drive innovation. Your work will power IBM and its clients globally, collaborating and integrating code into enterprise systems. You will have access to the latest education, tools and technology, and a limitless career path with the world's technology leader. Come to IBM and make a global impact!

Your Role and Responsibilities: Design, develop and support application solutions focused on the HANA version of Advanced Business Application Programming (ABAP). This specialty may design, develop and/or re-engineer highly complex application components, and integrate software packages, programs and reusable objects residing on multiple platforms. This specialty may additionally have working knowledge of SAP HANA technical concepts and architecture, data modelling using HANA Studio, ABAP Development Tools (ADT), code performance rules and guidelines for SAP HANA, ADBC, Native SQL, ABAP Core Data Services, database procedures, text search, ALV on HANA, and HANA Live model consumption.

Required education: Bachelor's Degree. Preferred education: Master's Degree.

Required technical and professional expertise
- 4-12 years of experience required
- ABAP on HANA Application Developers should possess knowledge of the following topics and apply them to bring value and innovation to client engagements: SAP HANA technical concepts and architecture, data modelling using HANA Studio, ABAP Development Tools (ADT), code performance rules and guidelines for SAP HANA, ADBC, Native SQL, ABAP Core Data Services, database procedures, text search, ALV on HANA, and HANA Live model consumption
- Designing and developing data dictionary objects (data elements, domains, structures, views, lock objects, search helps) and formatting the output of SAP documents with multiple options
- Modifying standard layout sets in SAP Scripts, Smart Forms and Adobe Forms
- Development experience in RICEF (Reports, Interfaces, Conversions, Enhancements, Forms)

Preferred technical and professional experience
- Experience working on implementation, upgrade, maintenance and post-production support projects would be an advantage
- Understanding of SAP functional requirements, their conversion into technical design, and development using the ABAP language for Reports, Interfaces, Conversions, Enhancements and Forms in implementation or support projects

Posted Date not available

Apply

9 - 14 years

30 - 35 Lacs

Bengaluru

Work from Office

About The Role: Data Engineer 2 (Experience: 2-5 years)

What we offer: Our mission is simple: building trust. Our customers' trust in us is not merely about the safety of their assets but also about how dependable our digital offerings are. That's why we at Kotak Group are dedicated to transforming banking by imbibing a technology-first approach in everything we do, with an aim to enhance customer experience by providing superior banking services. We welcome and invite the best technological minds in the country to come join us in our mission to make banking seamless and swift. Here, we promise you meaningful work that positively impacts the lives of many.

About our team: DEX is the central data org for Kotak Bank, managing the entire data experience of Kotak Bank. DEX stands for Kotak's Data Exchange. This org comprises the Data Platform, Data Engineering and Data Governance charters, and sits closely with the Analytics org. DEX is primarily working on a greenfield project to revamp the entire data platform, moving from on-premise solutions to a scalable AWS cloud-based platform. The team is being built from the ground up, which provides great opportunities for technology fellows to build things from scratch and create one of the best-in-class data lakehouse solutions. The primary skills this team should encompass are software development skills (preferably Python) for platform building on AWS; data engineering with Spark (PySpark, Spark SQL, Scala) for ETL development; and advanced SQL and data modelling for analytics. The org size is expected to be a 100+ member team primarily based out of Bangalore, comprising ~10 sub-teams independently driving their charters. As a member of this team, you get the opportunity to learn the fintech space, which is the most sought-after domain in the current world; be an early member in the digital transformation journey of Kotak; learn and leverage technology to build complex data platform solutions (including real-time, micro-batch, batch and analytics solutions) in a programmatic way; and also be futuristic and build systems which can be operated by machines using AI technologies.

The data platform org is divided into 3 key verticals:

Data Platform: This vertical is responsible for building the data platform, which includes optimized storage for the entire bank and a centralized data lake; managed compute and orchestration frameworks, including concepts of serverless data solutions; managing the central data warehouse for extremely high-concurrency use cases; building connectors for different sources; building a customer feature repository; building cost-optimization solutions like EMR optimizers; performing automations; and building observability capabilities for Kotak's data platform. The team will also be the center of Data Engineering excellence, driving trainings and knowledge-sharing sessions with the large data consumer base within Kotak.

Data Engineering: This team will own data pipelines for thousands of datasets, be skilled at sourcing data from 100+ source systems, and enable data consumption for 30+ data analytics products. The team will learn and build data models in a config-based, programmatic way and think big to build one of the most leveraged data models for financial orgs. This team will also enable centralized reporting for Kotak Bank, cutting across multiple products and dimensions. Additionally, the data built by this team will be consumed by 20K+ branch consumers, RMs, branch managers and all analytics use cases.

Data Governance: The team will be the central data governance team for Kotak Bank, managing metadata platforms, data privacy, data security, data stewardship, and the data quality platform.

If you have the right data skills and are ready to build data lake solutions from scratch for high-concurrency systems involving multiple systems, then this is the team for you. Your day-to-day role will include:
- Drive business decisions with technical input and lead the team
- Design, implement, and support a data infrastructure from scratch
- Manage AWS resources, including EC2, EMR, S3, Glue, Redshift, and MWAA
- Extract, transform, and load data from various sources using SQL and AWS big data technologies
- Explore and learn the latest AWS technologies to enhance capabilities and efficiency
- Collaborate with data scientists and BI engineers to adopt best practices in reporting and analysis
- Improve ongoing reporting and analysis processes, automating or simplifying self-service support for customers
- Build data platforms, data pipelines, or data management and governance tools

Basic qualifications for Data Engineer / SDE in Data
- Bachelor's degree in Computer Science, Engineering, or a related field
- Experience in data engineering
- Strong understanding of AWS technologies, including S3, Redshift, Glue, and EMR
- Experience with data pipeline tools such as Airflow and Spark
- Experience with data modeling and data quality best practices
- Excellent problem-solving and analytical skills
- Strong communication and teamwork skills
- Experience in at least one modern scripting or programming language, such as Python, Java, or Scala
- Strong advanced SQL skills

Preferred qualifications
- AWS cloud technologies: Redshift, S3, Glue, EMR, Kinesis, Firehose, Lambda, IAM, Airflow
- Prior experience in the Indian banking segment and/or fintech is desired
- Experience with non-relational databases and data stores
- Building and operating highly available, distributed data processing systems for large datasets
- Professional software engineering and best practices for the full software development life cycle
- Designing, developing, and implementing different types of data warehousing layers
- Leading the design, implementation, and successful delivery of large-scale, critical, or complex data solutions
- Building scalable data infrastructure and understanding distributed systems concepts
- SQL, ETL, and data modelling
- Ensuring the accuracy and availability of data to customers
- Proficiency in at least one scripting or programming language for handling large-volume data processing
- Strong presentation and communication skills
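For a minimal illustration of the AWS-resource management the role describes (Glue among the services listed), the sketch below starts a Glue ETL job with boto3 and checks its state; the job name and argument are placeholders.

```python
# Illustrative sketch: trigger an AWS Glue job run and read back its state.
import boto3

glue = boto3.client("glue")

run = glue.start_job_run(
    JobName="daily_customer_features",       # placeholder job name
    Arguments={"--run_date": "2024-01-01"},  # placeholder argument
)
state = glue.get_job_run(
    JobName="daily_customer_features",
    RunId=run["JobRunId"],
)["JobRun"]["JobRunState"]
print(state)  # e.g. RUNNING, SUCCEEDED, FAILED
```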

Posted Date not available

Apply

5 - 9 years

8 - 13 Lacs

Noida

Work from Office

About the Role: Grade Level (for internal use): 11

The Team: Dividend Forecasting (DF) provides discrete forecasts for over 28,000 stocks worldwide, supported by regional insights from our team of 40 dividend analysts and leveraging our Advanced Analytics Dividend Forecasting Model. Our Dividend Forecasting service provides timely data, insights and commentary to help financial institutions price derivatives, enhance investment decisions and manage risks. The team, located in India, the US, Singapore, and the UK, is currently focused on strategic projects and BAU in a cloud-native architecture to enhance scalability and efficiency.

The Impact:
- Scalability and Performance: Your role will be key in designing the platform and project for a cloud-native architecture, supporting product growth and optimizing performance to handle increased data loads.
- Innovation and Efficiency: You will implement advanced cloud technologies and best practices, streamlining processes to enhance product features and operational efficiency and accelerate service delivery.

What's in it for you: Joining this role presents a unique opportunity for professional development and skill enhancement in a dynamic and innovative environment.
- Cloud Development Skills: Gain expertise in cloud-native technologies and frameworks, enhancing your proficiency in modern application development.
- Cross-Functional Collaboration: Work closely with diverse teams across regions, improving your collaboration and communication skills in a global environment.
- Agile Methodologies: Experience working in an Agile development environment, allowing you to adapt quickly to changes and improve project management skills.
- Data Management and Analytics: Develop skills in managing and analyzing large datasets, which is crucial for optimizing performance and driving data-driven decisions.

Responsibilities
- Design, develop, and maintain cloud-based applications using a Java-based stack
- Migrate legacy components to a modern cloud architecture, ensuring scalability, reliability, and security
- Implement and manage AWS cloud technologies, focusing on commonly used services such as EC2, S3, Lambda, and RDS
- Take ownership of projects, from concept to delivery, ensuring adherence to project timelines and objectives
- Work as part of an agile team to identify and deliver solutions to prioritized requirements
- Demonstrate strong expertise in system design, architectural patterns, and building efficient, scalable systems

What We're Looking For
- Bachelor's degree in Computer Science, Engineering, or a related field
- 5 to 9 years of experience in software development, with hands-on experience in Java, Spring and Angular
- Hands-on experience and knowledge of AWS (S3, Lambda, Step Functions, SNS, SQS, RDS, ECS, and others)
- Hands-on experience and knowledge of RDBMS (MS SQL Server, PostgreSQL, Oracle, and others)
- Excellent problem-solving skills and the ability to work independently or as part of a team
- Exceptional communication skills, with the ability to articulate technical concepts to non-technical stakeholders
- Good to have: knowledge of Python

Basic Qualifications: Bachelor's degree in Computer Science, Engineering, or a related field.
Preferred Qualifications: B.Tech in Computer Science, IT, Engineering, or a related field; MCA in Computer Science.

Posted Date not available

Apply

6 - 10 years

9 - 13 Lacs

Mumbai, Bengaluru

Work from Office

Job Title: Snowflake Developer with Oracle GoldenGate / Data Engineer

About Oracle FSGIU - Finergy: The Finergy division within Oracle FSGIU is dedicated to the Banking, Financial Services, and Insurance (BFSI) sector. We offer deep industry knowledge and expertise to address the complex financial needs of our clients. With proven methodologies that accelerate deployment and personalization tools that create loyal customers, Finergy has established itself as a leading provider of end-to-end banking solutions. Our single platform for a wide range of banking services enhances operational efficiency, and our expert consulting services ensure technology aligns with our clients' business goals.

Responsibilities:
- Snowflake Data Modeling & Architecture: Design and implement scalable Snowflake data models using best practices such as the Snowflake Data Vault methodology
- Real-Time Data Replication & Ingestion: Utilize Oracle GoldenGate for Big Data to manage real-time data streaming, and optimize Snowpipe for automated data ingestion
- Cloud Integration & Management: Work with AWS services (S3, EC2, Lambda) to integrate and manage Snowflake-based solutions
- Data Sharing & Security: Implement SnowShare for data sharing and enforce security measures such as role-based access control (RBAC), data masking, and encryption
- CI/CD Implementation: Develop and manage CI/CD pipelines for Snowflake deployment and data transformation processes
- Collaboration & Troubleshooting: Partner with cross-functional teams to address data-related challenges and optimize performance
- Documentation & Best Practices: Maintain detailed documentation for data architecture, ETL processes, and Snowflake configurations
- Performance Optimization: Continuously monitor and enhance the efficiency of Snowflake queries and data pipelines

Mandatory Skills:
- 4+ years of experience as a Data Engineer
- Strong expertise in Snowflake architecture, data modeling, and query optimization
- Proficiency in SQL for writing and optimizing complex queries
- Hands-on experience with Oracle GoldenGate for Big Data for real-time data replication
- Knowledge of Snowpipe for automated data ingestion
- Familiarity with AWS cloud services (S3, EC2, Lambda, IAM) and their integration with Snowflake
- Experience with CI/CD tools (e.g., Jenkins, GitLab) for automating workflows
- Working knowledge of the Snowflake Data Vault methodology

Good to Have Skills:
- Exposure to Databricks for data processing and analytics
- Knowledge of Python or Scala for data engineering tasks
- Familiarity with Terraform or CloudFormation for infrastructure as code (IaC)
- Experience in data governance and compliance best practices
- Understanding of ML and AI integration with data pipelines

Self-Test Questions:
- Do I have hands-on experience in designing and optimizing Snowflake data models?
- Can I confidently set up and manage real-time data replication using Oracle GoldenGate?
- Have I worked with Snowpipe to automate data ingestion processes?
- Am I proficient in SQL and capable of writing optimized queries in Snowflake?
- Do I have experience integrating Snowflake with AWS cloud services?
- Have I implemented CI/CD pipelines for Snowflake development?
- Can I troubleshoot performance issues in Snowflake and optimize queries effectively?
- Have I documented data engineering processes and best practices for team collaboration?
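To illustrate the Snowflake bulk-ingestion work the posting covers (Snowpipe automates the same COPY INTO mechanism shown here), a minimal Python connector sketch follows; the connection details, stage, and table are placeholders, and real credentials would come from a secrets manager.

```python
# Illustrative sketch: bulk-load staged files into Snowflake with COPY INTO.
import snowflake.connector

conn = snowflake.connector.connect(
    account="myorg-myaccount",  # all connection values are placeholders
    user="ETL_SVC",
    password="***",
    warehouse="LOAD_WH",
    database="RAW",
    schema="LANDING",
)
try:
    cur = conn.cursor()
    cur.execute("""
        COPY INTO raw.landing.orders
        FROM @landing_stage/orders/           -- hypothetical external stage
        FILE_FORMAT = (TYPE = PARQUET)
        MATCH_BY_COLUMN_NAME = CASE_INSENSITIVE
    """)
    print(cur.fetchall())  # one status row per loaded file
finally:
    conn.close()
```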


3.0 - 5.0 years

7 - 11 Lacs

Mumbai, Bengaluru

Work from Office

Job Title: Snowflake Developer with Oracle GoldenGate / Data Engineer. About Oracle FSGIU - Finergy: The Finergy division within Oracle FSGIU is dedicated to the Banking, Financial Services, and Insurance (BFSI) sector. We offer deep industry knowledge and expertise to address the complex financial needs of our clients. With proven methodologies that accelerate deployment and personalization tools that create loyal customers, Finergy has established itself as a leading provider of end-to-end banking solutions. Our single platform for a wide range of banking services enhances operational efficiency, and our expert consulting services ensure technology aligns with our clients' business goals. Responsibilities: Snowflake Data Modeling & Architecture: Design and implement scalable Snowflake data models using best practices such as the Snowflake Data Vault methodology. Real-Time Data Replication & Ingestion: Utilize Oracle GoldenGate for Big Data to manage real-time data streaming and optimize Snowpipe for automated data ingestion. Cloud Integration & Management: Work with AWS services (S3, EC2, Lambda) to integrate and manage Snowflake-based solutions. Data Sharing & Security: Implement SnowShare for data sharing and enforce security measures such as role-based access control (RBAC), data masking, and encryption. CI/CD Implementation: Develop and manage CI/CD pipelines for Snowflake deployment and data transformation processes. Collaboration & Troubleshooting: Partner with cross-functional teams to address data-related challenges and optimize performance. Documentation & Best Practices: Maintain detailed documentation for data architecture, ETL processes, and Snowflake configurations. Performance Optimization: Continuously monitor and enhance the efficiency of Snowflake queries and data pipelines. Mandatory Skills: At least 4 years of experience as a Data Engineer. Strong expertise in Snowflake architecture, data modeling, and query optimization. Proficiency in SQL for writing and optimizing complex queries. Hands-on experience with Oracle GoldenGate for Big Data for real-time data replication. Knowledge of Snowpipe for automated data ingestion. Familiarity with AWS cloud services (S3, EC2, Lambda, IAM) and their integration with Snowflake. Experience with CI/CD tools (e.g., Jenkins, GitLab) for automating workflows. Working knowledge of the Snowflake Data Vault methodology. Good to Have Skills: Exposure to Databricks for data processing and analytics. Knowledge of Python or Scala for data engineering tasks. Familiarity with Terraform or CloudFormation for infrastructure as code (IaC). Experience in data governance and compliance best practices. Understanding of ML and AI integration with data pipelines. Self-Test Questions: Do I have hands-on experience in designing and optimizing Snowflake data models? Can I confidently set up and manage real-time data replication using Oracle GoldenGate? Have I worked with Snowpipe to automate data ingestion processes? Am I proficient in SQL and capable of writing optimized queries in Snowflake? Do I have experience integrating Snowflake with AWS cloud services? Have I implemented CI/CD pipelines for Snowflake development? Can I troubleshoot performance issues in Snowflake and optimize queries effectively? Have I documented data engineering processes and best practices for team collaboration?
Qualifications: Career Level - IC2
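For the "Data Sharing & Security" responsibility in this posting, a hedged sketch of Snowflake column masking driven from Python; the privileged role, table, column, and connection details are hypothetical, and the masking-policy DDL follows standard Snowflake syntax.

```python
import snowflake.connector

conn = snowflake.connector.connect(  # placeholder credentials
    account="my_account", user="admin_user", password="...",
    warehouse="ADMIN_WH", database="ANALYTICS", schema="PUBLIC",
)

statements = [
    # mask email addresses for everyone outside a privileged role
    """CREATE OR REPLACE MASKING POLICY email_mask AS (val STRING)
       RETURNS STRING ->
       CASE WHEN CURRENT_ROLE() IN ('PII_READER') THEN val
            ELSE '***MASKED***' END""",
    "ALTER TABLE customers MODIFY COLUMN email SET MASKING POLICY email_mask",
]

cur = conn.cursor()
try:
    for stmt in statements:
        cur.execute(stmt)
finally:
    cur.close()
    conn.close()
```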


3.0 - 8.0 years

5 - 9 Lacs

Bengaluru

Work from Office

Educational Requirements: Bachelor of Engineering. Service Line: Data & Analytics Unit. Responsibilities: - Managing large machine learning applications and designing and implementing new frameworks to build scalable and efficient data processing workflows and machine learning pipelines. - Build the tightly integrated pipeline that optimizes and compiles models and then orchestrates their execution. - Collaborate with CPU, GPU, and Neural Engine hardware backends to push inference performance and efficiency. - Work closely with feature teams to facilitate and debug the integration of increasingly sophisticated models, including large language models. - Automate data processing and extraction. - Engage with the sales team to find opportunities, understand requirements, and translate those requirements into technical solutions. - Develop reusable ML models and assets into production. Technical and Professional Requirements: - Excellent Python programming and debugging skills (refer to the Python JD given below). - Proficiency with SQL, relational databases, and non-relational databases. - Passion for API design and software architecture. - Strong communication skills and the ability to naturally explain difficult technical topics to everyone from data scientists to engineers to business partners. - Experience with modern neural-network architectures and deep learning libraries (Keras, TensorFlow, PyTorch). - Experience with unsupervised ML algorithms. - Experience with time-series models and anomaly-detection problems. - Experience with modern large language models (ChatGPT/BERT) and their applications. - Expertise in performance optimization. - Experience or knowledge of public cloud AWS services: S3, Lambda. - Familiarity with distributed databases, such as Snowflake and Oracle. - Experience with containerization and orchestration technologies, such as Docker and Kubernetes. Preferred Skills: Technology->Big Data - Data Processing->Spark; Technology->Machine Learning->R; Technology->Machine Learning->Python
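On the time-series and anomaly-detection requirement: one common unsupervised approach is an Isolation Forest over simple rolling features. A self-contained sketch on synthetic data follows; the feature choices, contamination rate, and injected anomalies are illustrative, not a prescribed method.

```python
import numpy as np
import pandas as pd
from sklearn.ensemble import IsolationForest

# hypothetical hourly metric with two injected anomalies
rng = pd.date_range("2024-01-01", periods=500, freq="h")
values = np.random.default_rng(0).normal(100, 5, len(rng))
values[[50, 300]] = [180, 20]
series = pd.DataFrame({"ts": rng, "value": values})

# simple rolling features so the model sees local context, not just level
series["roll_mean"] = series["value"].rolling(24, min_periods=1).mean()
series["residual"] = series["value"] - series["roll_mean"]

model = IsolationForest(contamination=0.01, random_state=0)
series["anomaly"] = model.fit_predict(series[["value", "residual"]])  # -1 = anomaly
print(series.loc[series["anomaly"] == -1, ["ts", "value"]])
```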


4.0 - 9.0 years

22 - 25 Lacs

Hyderabad

Work from Office

4-5 years of experience with AWS, with Cloud Practitioner certification. Experience working with CloudFormation to create AWS components (like IAM roles, Lambdas, EventBridge, etc.). Experience working with Terraform to create cloud components (like S3-related setup, permissions, and AWS Batch-related configuration). Working experience creating Lambdas using Java and Python (this is application development using Lambdas rather than core infrastructure-level tasks). Working experience with AWS Batch using Java and Python. Good to have: Experience with AppFlow and EventBridge (writing event rules). Experience integrating with external applications like Salesforce.
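A minimal sketch of the "app development using Lambdas" flavor this posting describes: a Python Lambda handler, triggered by an EventBridge schedule, that submits an AWS Batch job via boto3. The job name, queue, and definition are hypothetical; submit_job and the scheduled-event "time" field are standard AWS.

```python
import boto3

batch = boto3.client("batch")

def handler(event, context):
    # EventBridge scheduled events carry an ISO-8601 "time" field we pass along
    resp = batch.submit_job(
        jobName="nightly-refresh",          # hypothetical
        jobQueue="etl-queue",               # hypothetical
        jobDefinition="etl-job-def",        # hypothetical
        containerOverrides={
            "environment": [{"name": "RUN_DATE", "value": event.get("time", "")}]
        },
    )
    return {"jobId": resp["jobId"]}
```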


5.0 - 10.0 years

14 - 17 Lacs

Mumbai

Work from Office

As a Big Data Engineer, you will develop, maintain, evaluate, and test big data solutions. You will be involved in data engineering activities like creating source-to-target pipelines/workflows and implementing solutions that address clients' needs. Your primary responsibilities include: Design, build, optimize, and support new and existing data models and ETL processes based on our clients' business requirements. Build, deploy, and manage data infrastructure that can adequately handle the needs of a rapidly growing data-driven organization. Coordinate data access and security to enable data scientists and analysts to easily access data whenever they need to. Required education: Bachelor's Degree. Preferred education: Master's Degree. Required technical and professional expertise: Must have 5+ years of experience in Big Data: Hadoop, Spark, Scala, Python, HBase, Hive. Good to have: AWS (S3, Athena, DynamoDB, Lambda), Jenkins, Git. Experience developing Python and PySpark programs for data analysis. Good working experience using Python to develop custom frameworks for generating rules (much like a rules engine). Experience developing Python code to gather data from HBase and designing solutions implemented with PySpark. Experience using Apache Spark DataFrames/RDDs to apply business transformations and Hive context objects to perform read/write operations. Preferred technical and professional experience: Understanding of DevOps. Experience building scalable end-to-end data ingestion and processing solutions. Experience with object-oriented and/or functional programming languages, such as Python, Java, and Scala.
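For the Spark DataFrame transformation work described above, a short PySpark sketch: read raw data, apply a business rule, and write a curated, partitioned output. The paths, columns, and the rule itself are illustrative assumptions.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders-etl").getOrCreate()

# hypothetical source path and rule: drop cancelled orders, derive revenue
orders = spark.read.parquet("s3://raw-bucket/orders/")
cleaned = (
    orders
    .filter(F.col("status") != "CANCELLED")
    .withColumn("revenue", F.col("quantity") * F.col("unit_price"))
)

# partitioned write to a curated zone (assumes an order_date column exists)
cleaned.write.mode("overwrite").partitionBy("order_date") \
    .parquet("s3://curated-bucket/orders/")
```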


4.0 - 9.0 years

5 - 9 Lacs

Hyderabad

Work from Office

Job Purpose: As a Senior Java/AWS Developer, you will be part of a team responsible for contributing to the design, development, maintenance and support of ICE Digital Trade, a suite of highly configurable enterprise applications. The ideal candidate must be results-oriented, self-motivated and able to thrive in a fast-paced environment. This role requires frequent interactions with project and product managers, developers, quality assurance and other stakeholders to ensure delivery of a world-class application to our users. Responsibilities: Reviewing application requirements and interface designs. Contributing to the design and development of enterprise Java applications. Developing and implementing highly responsive user interface components using React concepts. Writing application interface code in JavaScript following React.js workflows. Troubleshooting interface software and debugging application code. Developing and implementing front-end architecture to support user interface concepts. Monitoring and improving front-end performance. Documenting application changes and developing updates. Collaborating with the QA team to ensure quality production code. Supporting and enhancing multiple mission-critical enterprise applications. Writing unit and integration tests for new and legacy code. Taking initiative and working independently on some projects while contributing to a large team on others. Providing second-tier production support for 24/7 applications. Following team guidelines for quality and consistency within the design and development phases of the application. Identifying opportunities to improve and optimize the application. Knowledge and Experience: Bachelor's degree in computer science or information technology. 4+ years of full-stack development experience. In-depth knowledge of Java, JavaScript, CSS, HTML, and other front-end languages. Knowledge of performance-testing frameworks and proven success with test-driven development. Experience with browser-based debugging and performance-testing software. Excellent troubleshooting skills. Good object-oriented concepts and knowledge of core Java and Java EE. First-hand experience with enterprise messaging (IBM WebSphere MQ or equivalent). Practical knowledge of Java application servers (JBoss, Tomcat) preferred. Working knowledge of the Spring Framework. Experience with the core AWS services. Experience with serverless approaches using AWS resources. Experience developing infrastructure as code using CDK through efficient use of AWS services. Experience with AWS services such as API Gateway, Lambda, DynamoDB, S3, Cognito and the AWS CLI. Experience using the AWS SDK. Understanding of distributed transactions. Track record of completing assignments on time with a high degree of quality. Experience and/or knowledge of all aspects of the SDLC methodology and related concepts and practices. Experience with Agile development methodologies preferred. Knowledge of Gradle/Maven preferred. Experience working with commodity markets or financial trading environments preferred. Open to learning and willing to participate in development using new frameworks and programming languages. Good to Have: Knowledge of React tools including React.js, TypeScript, JavaScript ES6, Webpack, Enzyme, Redux, and Flux. Experience with user interface design. Experience in AWS Amplify, RDS, EventBridge, SNS, SQS and SES.
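The serverless portion of this role (API Gateway, Lambda, DynamoDB) usually reduces to handlers like the following sketch, shown in Python for brevity since the posting is Java-centric; the table name and payload are hypothetical, while the put_item call and response shape are standard.

```python
import json
import uuid
import boto3

table = boto3.resource("dynamodb").Table("trades")  # hypothetical table name

def handler(event, context):
    # API Gateway proxy integration delivers the request body as a JSON string
    body = json.loads(event.get("body") or "{}")
    item = {"tradeId": str(uuid.uuid4()), **body}
    table.put_item(Item=item)
    # proxy integrations expect statusCode / headers / body in the response
    return {
        "statusCode": 201,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps(item),
    }
```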


5.0 - 10.0 years

6 - 10 Lacs

Hyderabad

Work from Office

Job Purpose: Intercontinental Exchange, Inc. (ICE) presents a unique opportunity to work with cutting-edge technology and business challenges in the financial services sector. ICE team members work across departments and traditional boundaries to innovate and respond to industry demand. A successful candidate will be able to multitask in a dynamic team-based environment, demonstrating strong problem-solving and decision-making abilities and the highest degree of professionalism. We are seeking an experienced AWS solution design engineer/architect to join our infrastructure cloud team. The infrastructure cloud team is responsible for internal services that provide developer collaboration tools, the build and release pipeline, and a shared AWS cloud services platform. The infrastructure cloud team enables engineers to build product features and ship them into production efficiently and confidently. Responsibilities: Develop utilities, or further existing application and system management tools and processes, that reduce manual effort and increase overall efficiency. Build and maintain Terraform/CloudFormation templates and scripts to automate and deploy AWS resources and configuration changes. Review and refine design and architecture documents presented by teams for operational readiness, fault tolerance and scalability. Monitor and research cloud technologies and stay current with trends in the industry. Participate in an on-call rotation and identify opportunities for reducing toil and avoiding technical debt to reduce support and operations load. Knowledge and Experience: Essential: 1.5+ years of experience in a DevOps (preferably DevSecOps) or SRE role in an AWS cloud environment. 1.5+ years of strong experience configuring, managing, solutioning, and architecting with AWS (Lambda, EC2, ECS, ELB, EventBridge, Kinesis, Route 53, SNS, SQS, CloudTrail, API Gateway, CloudFront, VPC, Transit Gateway, IAM, Security Hub, Service Mesh). Proficiency in Python or Golang. Proven background implementing continuous integration and delivery for projects. A track record of introducing automation to solve administrative and other business-as-usual tasks. Beneficial: Proficiency in Terraform, CloudFormation, or Ansible. A history of delivering services developed with an API-first approach. A system administration, network, or security background. Prior experience working with environments of significant scale (thousands of servers).
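An example of the "utilities that reduce manual effort" idea: a small boto3 script that flags EC2 instances missing required tags. The tag policy is a hypothetical assumption; the paginator and describe_instances response shape are standard boto3.

```python
import boto3

REQUIRED_TAGS = {"owner", "cost-center"}  # hypothetical tagging policy

def untagged_instances(region="us-east-1"):
    """Return IDs of instances missing any required tag."""
    ec2 = boto3.client("ec2", region_name=region)
    offenders = []
    for page in ec2.get_paginator("describe_instances").paginate():
        for reservation in page["Reservations"]:
            for inst in reservation["Instances"]:
                tags = {t["Key"].lower() for t in inst.get("Tags", [])}
                if not REQUIRED_TAGS <= tags:  # policy set not fully present
                    offenders.append(inst["InstanceId"])
    return offenders

if __name__ == "__main__":
    print(untagged_instances())
```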


7.0 - 12.0 years

30 - 35 Lacs

Bengaluru

Work from Office

Solution Architect – Data Platforms & Solution Delivery Req number: R5967 Employment type: Full time Worksite flexibility: Remote Who we are CAI is a global technology services firm with over 8,500 associates worldwide and a yearly revenue of $1 billion+. We have over 40 years of excellence in uniting talent and technology to power the possible for our clients, colleagues, and communities. As a privately held company, we have the freedom and focus to do what is right—whatever it takes. Our tailor-made solutions create lasting results across the public and commercial sectors, and we are trailblazers in bringing neurodiversity to the enterprise. Job Summary We are seeking an experienced Solution Architect to drive the design and delivery of enterprise data and analytics solutions. You will work closely with business and technical teams to understand project requirements, develop conceptual and logical architectures, and oversee full lifecycle implementations, ensuring that solutions are scalable, secure, and aligned to business objectives. This role requires strong expertise in modern data platforms, cloud architectures, and end-to-end solution delivery in complex environments. This is a full-time, remote position. Job Description What You'll Do Work with stakeholders to gather and analyze business, data, and technical requirements for new and evolving data solutions. Develop conceptual, logical, and physical solution architectures for data platforms, data products, analytics, and AI/ML projects. Guide projects through the entire solution lifecycle, from requirements and design to build, testing, and deployment. Integrate data from multiple sources (SAP and non-SAP), ensuring interoperability and seamless data flow. Define solution-level security, access controls, and compliance requirements in partnership with data governance and security teams. Lead efforts in performance optimization, cost management, and solution scalability. Produce clear technical documentation and solution architecture diagrams to support implementation and knowledge transfer. Stay up to date with advances in cloud, data engineering, and analytics technologies and recommend improvements. Collaborate across teams, including data engineers, data scientists, DevOps, and business stakeholders, to ensure effective delivery and adoption of solutions. What You'll Need Technical Proficiency: Solution Architecture & Data Platforms: Designing data platforms and architecture (conceptual, logical, physical); cloud-native and hybrid data solutions (Databricks Lakehouse, AWS); integration of SAP and non-SAP data sources. Databricks Lakehouse Platform: Medallion Architecture, Delta Lake & DLT Pipelines, PySpark Workbooks, Spark SQL & SQL Warehouse, Unity Catalog (data governance, lineage), Genie (query performance, indexing), Security & Role-Based Access Control. Programming: Python, SQL, PySpark, Spark, Scala. AWS Cloud Services: IAM, S3, Lambda, EMR, Redshift, Bedrock. Solution Delivery: Familiarity with DevOps and CI/CD processes; solution documentation and architecture diagramming; performance tuning and cost optimization; experience with data modeling (ER, dimensional); knowledge of data security and compliance frameworks. Strong communication, presentation, and stakeholder management skills. Bachelor's or Master's degree in Computer Science, Information Systems, Engineering, or a related field.
Certifications in Databricks, AWS, or Solution Architecture. Experience with SAP ERP, S/4HANA, DataSphere, ABAP, and CDS views. Exposure to data mesh or data product management concepts. Background in manufacturing or enterprise analytics environments. Physical Demands This role involves mostly sedentary work, with occasional movement around the office to attend meetings, etc. Ability to perform repetitive tasks on a computer, using a mouse, keyboard, and monitor. Reasonable accommodation statement If you require a reasonable accommodation in completing this application, interviewing, completing any pre-employment testing, or otherwise participating in the employment selection process, please direct your inquiries to application.accommodations@cai.io or (888) 824 – 8111.
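For the Medallion/Delta Lake portion of this role: a hedged bronze-to-silver sketch in PySpark, assuming a Databricks notebook environment where spark is provided and Unity Catalog three-part names are in use; the catalog, table, and column names are hypothetical.

```python
from pyspark.sql import functions as F

# assumes a Databricks runtime where `spark` already exists and the
# bronze table is registered in Unity Catalog; all names are hypothetical
bronze = spark.read.table("main.bronze.orders")

# typical silver-layer cleanup: dedupe, type the timestamp, drop bad keys
silver = (
    bronze
    .dropDuplicates(["order_id"])
    .withColumn("order_ts", F.to_timestamp("order_ts"))
    .filter(F.col("order_id").isNotNull())
)

silver.write.format("delta").mode("overwrite").saveAsTable("main.silver.orders")
```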


1.0 - 6.0 years

3 - 6 Lacs

Kolhapur

Work from Office

We are looking for a skilled AWS Developer with 1-6 years of experience to join our team at Ecobillz Private Limited. The ideal candidate will have a strong background in software product development and proficiency in AWS technologies. Roles and Responsibilities: Design, develop, and deploy scalable and efficient software products using AWS services. Collaborate with cross-functional teams to identify and prioritize project requirements. Develop and maintain high-quality code that meets industry standards and best practices. Troubleshoot and resolve technical issues efficiently. Participate in code reviews and contribute to improving overall code quality. Stay updated with the latest trends and technologies in AWS and software product development. Job Requirements: Strong understanding of software product development principles and methodologies. Proficiency in AWS services such as EC2, S3, Lambda, and DynamoDB. Experience with agile development methodologies and version control systems like Git. Excellent problem-solving skills and attention to detail. Strong communication and teamwork skills. Ability to work in a fast-paced environment and adapt to changing priorities.


15.0 - 25.0 years

10 - 14 Lacs

Gurugram

Work from Office

Project Role: Application Lead. Project Role Description: Lead the effort to design, build and configure applications, acting as the primary point of contact. Must-have skills: AWS Core Infrastructure. Good-to-have skills: NA. Minimum 15 year(s) of experience is required. Educational Qualification: 15 years of full-time education. Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. You will be responsible for overseeing the application development process and ensuring successful project delivery. Roles & Responsibilities: - Expected to be an SME with deep knowledge and experience. - Should have influencing and advisory skills. - Responsible for team decisions. - Engage with multiple teams and contribute to key decisions. - Expected to provide solutions to problems that apply across multiple teams. - Lead the application development team in designing and implementing software solutions. - Collaborate with stakeholders to gather requirements and define project scope. - Provide technical guidance and mentorship to team members. Professional & Technical Skills: - Must-have skills: Proficiency in AWS Core Infrastructure. - Strong understanding of cloud computing principles and best practices. - Experience in designing and implementing scalable and secure AWS solutions. - Hands-on experience with AWS services such as EC2, S3, RDS, and Lambda. - Knowledge of infrastructure-as-code tools like Terraform or CloudFormation. Additional Information: - The candidate should have a minimum of 15 years of experience in AWS Core Infrastructure. - This position is based at our Gurugram office. - 15 years of full-time education is required. Qualification: 15 years of full-time education.

