7.0 - 12.0 years
9 - 13 Lacs
Chennai
Work from Office
Project Role: Data Platform Engineer
Project Role Description: Assists with the data platform blueprint and design, encompassing the relevant data platform components. Collaborates with Integration Architects and Data Architects to ensure cohesive integration between systems and data models.
Must-have skills: Databricks Unified Data Analytics Platform
Good-to-have skills: NA
Minimum 7.5 years of experience is required.
Educational Qualification: Engineering graduate, preferably in Computer Science; 15 years of full-time education.

Summary: As a Data Platform Engineer, you will be responsible for assisting with the blueprint and design of the data platform components using the Databricks Unified Data Analytics Platform. Your typical day will involve collaborating with Integration Architects and Data Architects to ensure cohesive integration between systems and data models.

Roles & Responsibilities:
- Assist with the blueprint and design of the data platform components using Databricks.
- Collaborate with Integration Architects and Data Architects to ensure cohesive integration between systems and data models.
- Develop and maintain data pipelines on Databricks.
- Design and implement data security and access controls on Databricks.
- Troubleshoot and resolve issues related to data platform components.

Professional & Technical Skills:
- Must-have: Experience with the Databricks Unified Data Analytics Platform.
- Good-to-have: Experience with other big data technologies such as Hadoop, Spark, and Kafka.
- Strong understanding of data modeling and database design principles.
- Experience with data security and access controls.
- Experience with data pipeline development and maintenance.
- Experience troubleshooting and resolving issues related to data platform components.

Additional Information:
- The candidate should have a minimum of 7.5 years of experience with the Databricks Unified Data Analytics Platform.
- The ideal candidate will possess a strong educational background in computer science or a related field, along with a proven track record of delivering impactful data-driven solutions.
- This position is based at our Bangalore, Hyderabad, Chennai, and Pune offices.
- Return to office (RTO) is mandatory for 2-3 days per week, working across 2 shifts (Shift A: 10:00 am to 8:00 pm IST; Shift B: 12:30 pm to 10:30 pm IST).

Qualification: Engineering graduate, preferably in Computer Science; 15 years of full-time education.
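The "data security and access controls" responsibility above can be illustrated with a minimal sketch. This is a hypothetical, dependency-free stand-in: on Databricks itself, table ACLs or Unity Catalog grants would normally express these rules. All role, table, and privilege names here are illustrative.

```python
# Minimal role-based access-control sketch for data platform tables.
# Hypothetical names; real Databricks deployments would typically use
# Unity Catalog grants or table ACLs rather than hand-rolled checks.

GRANTS = {
    # role -> set of (table, privilege) pairs the role holds
    "analyst": {("sales", "SELECT")},
    "engineer": {("sales", "SELECT"), ("sales", "MODIFY")},
}

def is_authorized(role: str, table: str, privilege: str) -> bool:
    """Return True if the role holds the given privilege on the table."""
    return (table, privilege) in GRANTS.get(role, set())

print(is_authorized("analyst", "sales", "SELECT"))  # True
print(is_authorized("analyst", "sales", "MODIFY"))  # False
```

In a real platform, the grant table would live in the governance layer and the check would run at query time, not in application code.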
Posted 2 weeks ago
7.0 - 12.0 years
9 - 13 Lacs
Chennai
Work from Office
Project Role: Data Platform Engineer
Project Role Description: Assists with the data platform blueprint and design, encompassing the relevant data platform components. Collaborates with Integration Architects and Data Architects to ensure cohesive integration between systems and data models.
Must-have skills: Databricks Unified Data Analytics Platform
Good-to-have skills: NA
Minimum 7.5 years of experience is required.
Educational Qualification: Engineering graduate, preferably in Computer Science; 15 years of full-time education.

Summary: As a Data Platform Engineer, you will be responsible for assisting with the blueprint and design of the data platform components using the Databricks Unified Data Analytics Platform. Your typical day will involve collaborating with Integration Architects and Data Architects to ensure cohesive integration between systems and data models.

Roles & Responsibilities:
- Assist with the blueprint and design of the data platform components using Databricks.
- Collaborate with Integration Architects and Data Architects to ensure cohesive integration between systems and data models.
- Develop and maintain data pipelines on Databricks.
- Design and implement data security and access controls on Databricks.
- Troubleshoot and resolve issues related to data platform components.

Professional & Technical Skills:
- Must-have: Experience with the Databricks Unified Data Analytics Platform.
- Must-have: Strong understanding of data platform components and architecture.
- Good-to-have: Experience with cloud-based data platforms such as AWS or Azure.
- Good-to-have: Experience with data security and access controls.
- Good-to-have: Experience with data pipeline development and maintenance.

Additional Information:
- The candidate should have a minimum of 7.5 years of experience with the Databricks Unified Data Analytics Platform.
- The ideal candidate will possess a strong educational background in computer science or a related field, along with a proven track record of delivering impactful data-driven solutions.
- This position is based at our Bangalore, Hyderabad, Chennai, and Pune offices.
- Return to office (RTO) is mandatory for 2-3 days per week, working across 2 shifts (Shift A: 10:00 am to 8:00 pm IST; Shift B: 12:30 pm to 10:30 pm IST).

Qualification: Engineering graduate, preferably in Computer Science; 15 years of full-time education.
Posted 2 weeks ago
7.0 - 12.0 years
9 - 13 Lacs
Bengaluru
Work from Office
Project Role: Data Platform Engineer
Project Role Description: Assists with the data platform blueprint and design, encompassing the relevant data platform components. Collaborates with Integration Architects and Data Architects to ensure cohesive integration between systems and data models.
Must-have skills: Databricks Unified Data Analytics Platform
Good-to-have skills: NA
Minimum 7.5 years of experience is required.
Educational Qualification: 15 years of full-time education.

Summary: As a Data Platform Engineer, you will assist with the data platform blueprint and design, collaborating with Integration Architects and Data Architects to ensure cohesive integration between systems and data models. You will play a crucial role in shaping the data platform components.

Roles & Responsibilities:
- Act as a subject matter expert (SME); collaborate with and manage the team to perform.
- Be responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Lead the implementation of data platform solutions.
- Conduct performance tuning and optimization of data platform components.

Professional & Technical Skills:
- Must-have: Proficiency in the Databricks Unified Data Analytics Platform.
- Strong understanding of cloud-based data platforms.
- Experience in designing and implementing data pipelines.
- Knowledge of data governance and security best practices.

Additional Information:
- The candidate should have a minimum of 7.5 years of experience with the Databricks Unified Data Analytics Platform.
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.

Qualification: 15 years of full-time education.
Posted 2 weeks ago
15.0 - 20.0 years
5 - 9 Lacs
Gurugram
Work from Office
Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-have skills: Kubernetes, Selenium
Good-to-have skills: NA
Minimum 7.5 years of experience is required.
Educational Qualification: 15 years of full-time education.

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with various teams to understand their needs, developing innovative solutions, and ensuring that applications are optimized for performance and usability. You will engage in problem-solving activities, contribute to key decisions, and manage the development process to deliver high-quality applications that align with business objectives.

Roles & Responsibilities:
- Act as a subject matter expert (SME).
- Collaborate with and manage the team to perform.
- Be responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Facilitate knowledge-sharing sessions to enhance team capabilities.
- Monitor project progress and ensure timely delivery of application features.

Professional & Technical Skills:
- Must-have: Proficiency in Kubernetes and Selenium.
- Strong understanding of container orchestration and management.
- Experience with cloud platforms and services.
- Familiarity with CI/CD pipelines and DevOps practices.
- Ability to troubleshoot and optimize application performance.

Additional Information:
- The candidate should have a minimum of 7.5 years of experience in Kubernetes.
- This position is based at our Gurugram office.
- 15 years of full-time education is required.

Qualification: 15 years of full-time education.
Posted 2 weeks ago
3.0 - 8.0 years
5 - 9 Lacs
Bengaluru
Work from Office
Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-have skills: AWS Administration
Good-to-have skills: Microsoft Azure Data Services
Minimum 3 years of experience is required.
Educational Qualification: 15 years of full-time education.

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. You will collaborate with teams to ensure seamless integration and functionality of applications.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Actively participate in and contribute to team discussions.
- Contribute to providing solutions to work-related problems.
- Develop and implement software solutions to meet business requirements.
- Collaborate with cross-functional teams to ensure application functionality.
- Conduct code reviews and provide feedback for continuous improvement.
- Troubleshoot and resolve technical issues in applications.
- Stay updated on industry trends and best practices for application development.

Professional & Technical Skills:
- Must-have: Proficiency in AWS Administration.
- Good-to-have: Experience with Microsoft Azure Data Services.
- Strong understanding of cloud computing principles and services.
- Knowledge of infrastructure as code and automation tools.
- Experience in deploying and managing applications on cloud platforms.

Additional Information:
- The candidate should have a minimum of 3 years of experience in AWS Administration.
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.

Qualification: 15 years of full-time education.
Posted 2 weeks ago
7.0 - 12.0 years
5 - 9 Lacs
Bengaluru
Work from Office
Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-have skills: SAP ABAP Cloud
Good-to-have skills: SAP ABAP Development for HANA
Minimum 7.5 years of experience is required.
Educational Qualification: 15 years of full-time education.

Summary: As an Application Developer, you will be responsible for designing, building, and configuring applications to meet business process and application requirements. Your typical day will involve collaborating with teams to develop innovative solutions and contribute to key decisions.

Roles & Responsibilities:
- Act as a subject matter expert (SME).
- Collaborate with and manage the team to perform.
- Be responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Lead and mentor junior professionals.
- Stay updated on industry trends and best practices.
- Conduct regular performance evaluations.

Professional & Technical Skills:
- Must-have: Proficiency in SAP ABAP Cloud.
- Good-to-have: Experience with SAP ABAP Development for HANA.
- Strong understanding of SAP ABAP Cloud architecture.
- Experience in designing and implementing SAP ABAP Cloud solutions.
- Knowledge of SAP ABAP Development for HANA.
- Ability to troubleshoot and resolve technical issues efficiently.

Additional Information:
- The candidate should have a minimum of 7.5 years of experience in SAP ABAP Cloud.
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.

Qualification: 15 years of full-time education.
Posted 2 weeks ago
5.0 - 10.0 years
13 - 17 Lacs
Pune
Work from Office
Project Role: Security Architect
Project Role Description: Define the security architecture, ensuring that it meets the business requirements and performance goals.
Must-have skills: Security Platform Engineering
Good-to-have skills: Java Enterprise Edition, Amazon Web Services (AWS), Infrastructure as Code (IaC)
Minimum 5 years of experience is required.
Educational Qualification: BE or BTech degree in Computer Science.

Summary: As a Security Architect, you will solve deep technical problems and build creative solutions in a dynamic environment, working with knowledgeable and passionate SDEs. You are experienced in building for the cloud: designing for five nines, globally distributed active-active deployments, horizontal scalability, fault tolerance, and more. You are motivated by learning, evaluating, and deploying new technologies. Our services are deployed in an Amazon Web Services environment, so you will work hands-on with many AWS components. You thrive in a truly agile, fast-paced, production-facing environment. You have a low tolerance for mediocrity. You love to write code and build extraordinary things. We are looking for coders, people who love to code, just like we do. You should be energetic, confident, and ready to contribute across the software development lifecycle, from research, design, and specs to coding and bug fixing. Our team's focus is on writing dependable code and getting high-quality products and services to market as quickly as possible.

Roles & Responsibilities:
- Act as an SME; collaborate with and manage the team to perform.
- Be responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Develop and implement security policies and procedures.
- Conduct security assessments and audits.
- Stay updated on the latest security trends and technologies.

Professional & Technical Skills:
- Experience with CI/CD pipelines (e.g., Jenkins), building and/or configuring pipelines for software build and deployment.
- Coding knowledge and experience with SQL.
- Experience writing scripts in Python, PowerShell, or Bash.
- Experience building highly available (HA) production-grade solutions in AWS.
- CI/CD and automation experience (Terraform a plus).
- Experience designing and implementing high-performance web services using SOA/REST/microservices.
- Experience in the design, build, maintenance, and refactoring of large-scale, low-latency, high-performance systems.
- Ability to quickly learn and develop expertise in existing, highly complex applications and architectures.
- Extensive knowledge of high-volume distributed application development in cloud environments.
- Strong troubleshooting and debugging skills, in both production and non-production environments.
- Experience with Agile methodologies, TDD, code review, and clear, concise documentation.
- Strong analytic, problem-solving, and troubleshooting skills.
- Uncommon ability and motivation to tackle problems and learn fast.
- Ability to perform at a high level within a technical team.
- Ability to work independently with minimal supervision.
- Excellent communication and relationship skills.
- Comfort with distributed teamwork.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Security Platform Engineering.
- Minimum 3 years of experience building AWS cloud-native services using EC2, S3, ECS, SQS, API Gateway, Lambda, etc.
- Minimum 5 years of coding knowledge and experience with Java and/or C++ and object-oriented methodologies.
- Minimum 1 year of experience with CI/CD pipelines (e.g., Jenkins), building and/or configuring pipelines for software build and deployment.
- This position is based at our Pune office.
- A BE or BTech degree in Computer Science or a related technical field, or equivalent practical knowledge, is required.

Qualification: BE or BTech degree in Computer Science.
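The scripting requirement above (Python, PowerShell, or Bash) is the kind of skill often exercised in small troubleshooting helpers. The sketch below is a hypothetical example, not part of the posting: it tallies application-log lines by severity, a common first step when debugging production incidents. The log format and all names are assumptions.

```python
import re
from collections import Counter

# Hypothetical log format: "<timestamp> <LEVEL> <message>".
LOG_LINE = re.compile(r"^\S+ (?P<level>[A-Z]+) (?P<msg>.*)$")

def count_levels(lines):
    """Tally log lines by severity level; malformed lines are skipped."""
    counts = Counter()
    for line in lines:
        m = LOG_LINE.match(line)
        if m:
            counts[m.group("level")] += 1
    return counts

logs = [
    "2024-01-01T00:00:00 INFO service started",
    "2024-01-01T00:00:01 ERROR connection refused",
    "2024-01-01T00:00:02 ERROR timeout",
    "not a log line",
]
print(count_levels(logs))  # Counter({'ERROR': 2, 'INFO': 1})
```

The same pattern extends naturally to streaming sources (e.g., tailing a file or consuming CloudWatch log events) by feeding lines into `count_levels` incrementally.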
Posted 2 weeks ago
5.0 - 10.0 years
5 - 9 Lacs
Coimbatore
Work from Office
Project Role: Application Designer
Project Role Description: Assist in defining requirements and designing applications to meet business process and application requirements.
Must-have skills: AWS Architecture
Good-to-have skills: NA
Minimum 5 years of experience is required.
Educational Qualification: Graduate; AWS certified.

Summary: As an Application Designer, you will assist in defining requirements and designing applications to meet business process and application requirements. You will play a crucial role in ensuring that the applications are designed to meet the needs of the organization and its stakeholders. Your typical day will involve collaborating with various teams, analyzing requirements, and designing innovative solutions to address business challenges.

Roles & Responsibilities:
- Act as an SME; collaborate with and manage the team to perform.
- Be responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Collaborate with stakeholders to gather requirements and understand business processes.
- Design and develop applications that meet the business process and application requirements.
- Ensure the applications are scalable, secure, and efficient.
- Conduct code reviews and provide guidance to the development team.
- Stay updated on the latest industry trends and technologies.
- Assist in troubleshooting and resolving application issues.
- Document application designs, processes, and procedures.
- Train and mentor junior team members to enhance their skills and knowledge.

Professional & Technical Skills:
- Must-have: Proficiency in AWS Architecture.
- Good-to-have: Experience with cloud platforms such as Azure or Google Cloud.
- Strong understanding of cloud computing concepts and architecture.
- Experience in designing and implementing scalable and secure cloud solutions.
- Knowledge of AWS services such as EC2, S3, Lambda, RDS, and DynamoDB.
- Familiarity with infrastructure-as-code tools like CloudFormation or Terraform.
- Experience in designing and implementing CI/CD pipelines.
- Excellent problem-solving and analytical skills.

Additional Information:
- The candidate should have a minimum of 5 years of experience in AWS Architecture.
- This position is based at our Coimbatore office.
- A graduate degree is required and AWS certification is preferred.

Qualification: Graduate and AWS certified.
Posted 2 weeks ago
8.0 - 9.0 years
11 - 12 Lacs
Hyderabad
Work from Office
We are seeking a highly skilled Java Full Stack Engineer to join our dynamic development team. In this role, you will be responsible for designing, developing, and maintaining both frontend and backend components of our applications using Java and associated technologies. You will collaborate with cross-functional teams to deliver robust, scalable, and high-performing software solutions that meet our business needs. The ideal candidate will have a strong background in Java programming, experience with modern frontend frameworks, and a passion for full-stack development.

Requirements:
- Bachelor's degree in Computer Science, Engineering, or a related field.
- 8 to 9+ years of experience in full-stack development, with a strong focus on Java.

Java Full Stack Developer Roles & Responsibilities:
- Develop scalable web applications using Java (Spring Boot) for the backend and React/Angular for the frontend.
- Implement RESTful APIs to facilitate communication between frontend and backend.
- Design and manage databases using MySQL, PostgreSQL, Oracle, or MongoDB.
- Write complex SQL queries and procedures, and perform database optimization.
- Build responsive, user-friendly interfaces using HTML, CSS, and JavaScript, and frameworks like Bootstrap, React, and Angular, with Node.js and Python integration.
- Integrate APIs with frontend components.
- Participate in designing microservices and modular architecture.
- Apply design patterns and object-oriented programming (OOP) concepts.
- Write unit and integration tests using JUnit, Mockito, Selenium, or Cypress.
- Debug and fix bugs across full-stack components.
- Use Git, Jenkins, Docker, and Kubernetes for version control, continuous integration, and deployment.
- Participate in code reviews, automation, and monitoring.
- Deploy applications on AWS, Azure, or Google Cloud platforms.
- Use Elastic Beanstalk, EC2, S3, or Cloud Run for backend hosting.
- Work in Agile/Scrum teams; attend daily stand-ups, sprints, and retrospectives; and deliver iterative enhancements.
- Document code, APIs, and configurations.
- Collaborate with QA, DevOps, Product Owners, and other stakeholders.

Must-Have Skills:
- Java Programming: Deep knowledge of the Java language, its ecosystem, and best practices.
- Frontend Technologies: Proficiency in HTML, CSS, JavaScript, and modern frontend frameworks like React or Angular.
- Backend Development: Expertise in developing and maintaining backend services using Java, Spring, and related technologies.
- Full Stack Development: Experience in both frontend and backend development, with the ability to work across the entire application stack.

Soft Skills:
- Problem-Solving: Ability to analyze complex problems and develop effective solutions.
- Communication Skills: Strong verbal and written communication skills to effectively collaborate with cross-functional teams.
- Analytical Thinking: Ability to think critically and analytically to solve technical challenges.
- Time Management: Capable of managing multiple tasks and deadlines in a fast-paced environment.
- Adaptability: Ability to quickly learn and adapt to new technologies and methodologies.

Interview Mode: Face-to-face for candidates residing in Hyderabad; Zoom for other states.
Location: 43/A, MLA Colony, Road No. 12, Banjara Hills, 500034
Time: 2 - 4 pm
Posted 2 weeks ago
3.0 - 7.0 years
0 - 0 Lacs
Hyderabad
Work from Office
Experience Required: 3+ years

Technical knowledge: AWS, Python, SQL, S3, EC2, Glue, Athena, Lambda, DynamoDB, Redshift, Step Functions, CloudFormation, CI/CD pipelines, GitHub, EMR, RDS, AWS Lake Formation, GitLab, Jenkins, and AWS CodePipeline.

Role Summary: As a Senior Data Engineer with over 3 years of expertise in Python, PySpark, and SQL, you will design, develop, and optimize complex data pipelines, support data modeling, and contribute to the architecture that supports big data processing, analytics, and cutting-edge cloud solutions that drive business growth. You will lead the design and implementation of scalable, high-performance data solutions on AWS and mentor junior team members. This role demands a deep understanding of AWS services, big data tools, and complex architectures to support large-scale data processing and advanced analytics.

Key Responsibilities:
- Design and develop robust, scalable data pipelines using AWS services, Python, PySpark, and SQL that integrate seamlessly with the broader data and product ecosystem.
- Lead the migration of legacy data warehouses and data marts to AWS cloud-based data lake and data warehouse solutions.
- Optimize data processing and storage for performance and cost.
- Implement data security and compliance best practices, in collaboration with the IT security team.
- Build flexible and scalable systems to handle the growing demands of real-time analytics and big data processing.
- Work closely with data scientists and analysts to support their data needs and assist in building complex queries and data analysis pipelines.
- Collaborate with cross-functional teams to understand their data needs and translate them into technical requirements.
- Continuously evaluate new technologies and AWS services to enhance data capabilities and performance.
- Create and maintain comprehensive documentation of data pipelines, architectures, and workflows.
- Participate in code reviews and ensure that all solutions are aligned with pre-defined architectural specifications.
- Present findings to executive leadership and recommend data-driven strategies for business growth.
- Communicate effectively with different levels of management to gather use cases and requirements, and provide designs that cater to those stakeholders.
- Handle clients in multiple industries at the same time, balancing their unique needs.
- Provide mentoring and guidance to junior data engineers and team members.

Requirements:
- 3+ years of experience in a data engineering role, with a strong focus on AWS, Python, PySpark, Hive, and SQL.
- Proven experience in designing and delivering large-scale data warehousing and data processing solutions.
- Ability to lead the design and implementation of complex, scalable data pipelines using AWS services such as S3, EC2, EMR, RDS, Redshift, Glue, Lambda, Athena, and AWS Lake Formation.
- Bachelor's or Master's degree in Computer Science, Engineering, or a related technical field.
- Deep knowledge of big data technologies and ETL tools, such as Apache Spark, PySpark, Hadoop, Kafka, and Spark Streaming.
- Experience implementing data architecture patterns, including event-driven pipelines, Lambda architectures, and data lakes.
- Experience with modern tools like Databricks, Airflow, and Terraform for orchestration and infrastructure as code.
- Experience implementing CI/CD using GitLab, Jenkins, and AWS CodePipeline.
- Ability to ensure data security, governance, and compliance by leveraging tools such as IAM, KMS, and AWS CloudTrail.
- Experience mentoring junior engineers and fostering a culture of continuous learning and improvement.
- Excellent problem-solving and analytical skills, with a strategic mindset.
- Strong communication and leadership skills, with the ability to influence stakeholders at all levels.
- Ability to work independently as well as part of a team in a fast-paced environment.
- Advanced data visualization skills and the ability to present complex data in a clear and concise manner.
- Excellent communication skills, both written and verbal, to collaborate effectively across teams and levels.

Preferred Skills:
- Experience with Databricks, Snowflake, and machine learning pipelines.
- Exposure to real-time data streaming technologies and architectures.
- Familiarity with containerization and serverless computing (Docker, Kubernetes, AWS Lambda).
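The pipeline design described above can be sketched, in miniature, as a chain of composable transform steps. This is a dependency-free Python stand-in (a real implementation would express each step as a PySpark DataFrame transformation running on EMR or Glue); the record fields and the conversion rate are illustrative assumptions.

```python
from functools import reduce

# Each step maps records -> records. Plain dicts keep the sketch runnable;
# in PySpark these would be DataFrame transformations chained the same way.

def drop_nulls(records):
    """Filter out records missing the amount field."""
    return [r for r in records if r.get("amount") is not None]

def to_usd(records, rate=0.012):  # hypothetical INR -> USD rate
    """Derive a converted-amount column."""
    return [{**r, "amount_usd": round(r["amount"] * rate, 2)} for r in records]

def run_pipeline(records, steps):
    """Apply each transform step in order."""
    return reduce(lambda acc, step: step(acc), steps, records)

raw = [{"id": 1, "amount": 1000}, {"id": 2, "amount": None}]
out = run_pipeline(raw, [drop_nulls, to_usd])
print(out)  # [{'id': 1, 'amount': 1000, 'amount_usd': 12.0}]
```

Keeping each step a pure records-to-records function makes the pipeline easy to unit-test and to reorder, which is the same property orchestrators like Airflow rely on at the task level.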
Posted 2 weeks ago
3.0 - 6.0 years
6 - 7 Lacs
Bengaluru
Work from Office
Oracle EBS with Oracle EBS Functional Manufacturing: profound understanding of Oracle EBS Functional Manufacturing for optimizing production and supply chain processes.
Posted 2 weeks ago
7.0 - 12.0 years
20 - 25 Lacs
Pune
Work from Office
Job Title: Senior Solution Architect - Cloud Migration & Modernization (AWS)
Location: [Insert Location]
Department: Digital Services
Reports To: Cloud SL

Assessment & Analysis:
- Review CAST software intelligence reports to identify technical debt, architectural flaws, and cloud readiness.
- Conduct manual assessments of applications to validate findings and prioritize migration efforts.
- Identify refactoring needs (e.g., monolithic to microservices, serverless adoption).
- Evaluate legacy systems (e.g., .NET Framework, Java EE) for compatibility with AWS services.

Solution Design:
- Develop migration strategies (rehost, replatform, refactor, retire) for each application.
- Architect AWS-native solutions using services like EC2, Lambda, RDS, S3, and EKS.
- Design modernization plans for legacy systems (e.g., .NET Framework to .NET Core, Java EE to Spring Boot).
- Ensure compliance with the AWS Well-Architected Framework (security, reliability, performance, cost optimization).

Collaboration & Leadership:
- Work with cross-functional teams (developers, DevOps, security) to validate designs.
- Partner with clients to align technical solutions with business objectives.
- Mentor junior architects and engineers on AWS best practices.
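The per-application strategy selection described above could be sketched as a simple decision rule. This is a hypothetical illustration only: real assessments weigh CAST findings, cost, risk, and business priority far more carefully, and the fields and thresholds below are invented for the sketch.

```python
def choose_strategy(app):
    """Pick a migration strategy from simplified assessment flags.
    Fields and thresholds are illustrative, not a real decision model."""
    if app.get("retiring"):
        return "retire"
    if app.get("tech_debt_score", 0) > 0.7 or app.get("monolith"):
        return "refactor"       # e.g. monolith -> microservices/serverless
    if not app.get("cloud_ready"):
        return "replatform"     # minor changes, e.g. move DB to managed RDS
    return "rehost"             # lift-and-shift, e.g. onto EC2

apps = [
    {"name": "billing", "monolith": True, "cloud_ready": False},
    {"name": "reports", "cloud_ready": True},
]
print([choose_strategy(a) for a in apps])  # ['refactor', 'rehost']
```

Encoding the rules as data rather than prose also makes the portfolio triage repeatable and auditable across hundreds of applications.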
Posted 2 weeks ago
4.0 - 7.0 years
14 - 18 Lacs
Gurugram
Work from Office
Job Overview: We are looking for a Principal Engineer who will act as both a technical lead and a hands-on contributor. This role demands deep expertise in Python and its frameworks, strong exposure to cloud-native development (AWS, Kubernetes), and a proven track record of driving engineering excellence across multiple domains such as backend services, infrastructure, and DevOps practices. You will be responsible for designing and building scalable backend services and APIs using Python frameworks, integrating relational databases, and deploying in containerized environments on premises and in the cloud.

Key Responsibilities:
- Lead technical architecture and design for scalable, resilient, and secure systems.
- Design and develop RESTful backend APIs using FastAPI and Flask.
- Build server-side logic and business functionality using Python.
- Design and integrate MySQL and PostgreSQL databases.
- Deploy and manage applications in Docker/Kubernetes environments.
- Maintain CI/CD pipelines using Git and Jenkins.
- Collaborate with DevOps and frontend teams on integration.
- Ensure code quality through peer reviews and documentation.

Technical Stack:
- Python (FastAPI, Flask)
- RESTful APIs
- MySQL, PostgreSQL
- Docker, Kubernetes
- Git, Jenkins
- Linux-based development environment
- Strong experience with AWS services (e.g., EC2, Lambda, EKS, S3, RDS, CloudWatch, DynamoDB)

Nice to Have:
- Experience with event-driven or microservices architecture.
- Exposure to serverless computing and IaC tools like Terraform or AWS CDK.
- Familiarity with security and compliance in cloud-native applications.
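The RESTful-API responsibility above can be sketched with a minimal route table. In the stack the posting names, FastAPI's `@app.get(...)` decorators register handlers in much this way; here a plain dict keeps the sketch runnable without any framework. All endpoint paths and handler names are hypothetical.

```python
# Minimal route table mimicking what a web framework's route decorators
# build internally. Paths and handlers are hypothetical examples.

ROUTES = {}

def route(method, path):
    """Decorator that registers a handler under (method, path)."""
    def register(fn):
        ROUTES[(method, path)] = fn
        return fn
    return register

@route("GET", "/health")
def health():
    return {"status": "ok"}

@route("GET", "/users/{id}")
def get_user(id):
    return {"id": id, "name": f"user-{id}"}

def dispatch(method, path, **params):
    """Look up and invoke the handler; a 404-style error if unregistered."""
    handler = ROUTES.get((method, path))
    if handler is None:
        return {"error": "not found"}
    return handler(**params)

print(dispatch("GET", "/health"))            # {'status': 'ok'}
print(dispatch("GET", "/users/{id}", id=7))  # {'id': 7, 'name': 'user-7'}
```

A real FastAPI app adds request parsing, validation, and an ASGI server on top, but the dispatch-by-(method, path) core is the same idea.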
Posted 2 weeks ago
12.0 - 15.0 years
20 - 25 Lacs
Pune, Bengaluru, Hinjewadi
Work from Office
job requisition idJR1027350 Job Summary Synechron is seeking a experienced and strategic Delivery Lead to oversee complex technology projects utilizing .NET, C#, and AWS. This role is instrumental in managing end-to-end project delivery, guiding cross-functional teams, and ensuring alignment with business objectives. The Delivery Lead will drive improvements in delivery efficiency, quality, and stakeholder satisfaction, contributing significantly to the organizations technological growth and operational excellence. Software Required Skills: Development and delivery experience with .NET (preferably version 4.7 or later) C# programming proficiency Hands-on experience with Amazon Web Services (AWS) (EC2, S3, Lambda, etc.) Project management tools Jira , SharePoint , MS Excel , PowerPoint , Power BI Agile and Waterfall project management methodologies Preferred Skills: Experience with DevOps/CI-CD pipelines Knowledge of Azure cloud platform Familiarity with software release management Overall Responsibilities End-to-End Project Delivery: Manage multiple projects from initiation to closure, ensuring delivery is on time, within scope, and within budget. Team Leadership: Lead and mentor diverse project teams, fostering collaboration and high performance. Stakeholder Management: Act as the primary point of contact for clients and internal stakeholders, translating business needs into technical solutions. Governance & Compliance: Ensure adherence to organizational policies, standards, and industry best practices. Technical Oversight: Provide guidance on architecture, technology choices, and solution design aligned with best practices. Process Optimization: Continuously identify opportunities to improve delivery processes, increase efficiency, and reduce risk. Financial Oversight: Monitor project budgets, optimize resource utilization, and report on financial performance. Risk & Issue Management: Identify, assess, and mitigate risks impacting project delivery. 
Performance Measurement: Establish metrics and KPIs to measure project success and customer satisfaction.
Technical Skills (By Category)
Programming Languages: Essential: C#, .NET Framework/Core; Preferred: Java, Python (for integration or automation). Databases/Data Management: SQL Server, AWS RDS. Cloud Technologies: AWS cloud services (EC2, S3, Lambda, CloudWatch). Frameworks & Libraries: .NET Core / .NET Framework; RESTful APIs, microservices architecture. Development Tools & Methodologies: Agile, Scrum, Kanban; DevOps tools (Jenkins, Azure DevOps, Git). Security Protocols: AWS security best practices; data privacy and compliance standards.
Experience
12 to 15 years of professional experience in managing IT projects and delivery teams. Demonstrable experience leading large-scale software development and implementation projects. Strong background in .NET/C# development, AWS cloud solutions, and cross-functional team management. Experience managing global or distributed teams. Proven stakeholder management experience with senior management and clients. Prior exposure to Agile and Waterfall project methodologies. Alternative Experience: Candidates with extensive experience in software delivery, cloud migration, or enterprise application implementation may be considered.
Day-to-Day Activities
Conduct project planning, resource allocation, and status reporting. Hold regular stand-ups, progress reviews, and stakeholder meetings. Review development progress, remove blockers, and ensure adherence to quality standards. Collaborate with technical teams on architecture design and problem resolution. Manage change requests, scope adjustments, and project adjustments. Track project KPIs, update dashboards, and communicate progress to leadership. Oversee risk registers and implement mitigation strategies. Facilitate retrospectives and process improvement initiatives.
Qualifications
Bachelor's degree in Computer Science, Engineering, or a related field; Master's preferred.
Project Management certifications such as PMP, PMI-ACP, or ScrumMaster are advantageous. Training or certification in AWS or cloud architecture is preferred. Commitment to continuous learning and professional development. Professional Competencies Strong analytical and problem-solving skills Effective leadership and team management capabilities Excellent stakeholder communication and negotiation skills Ability to adapt to evolving project requirements and technologies Strategic thinking and organizational agility Data-driven decision-making Prioritization and time management skills Change management and process improvement orientation
Posted 2 weeks ago
5.0 - 10.0 years
10 - 15 Lacs
Chennai, Bengaluru
Work from Office
Job Requisition ID: JR1027452
Overall Responsibilities:
Data Pipeline Development: Design, develop, and maintain highly scalable and optimized ETL pipelines using PySpark on the Cloudera Data Platform, ensuring data integrity and accuracy. Data Ingestion: Implement and manage data ingestion processes from a variety of sources (e.g., relational databases, APIs, file systems) to the data lake or data warehouse on CDP. Data Transformation and Processing: Use PySpark to process, cleanse, and transform large datasets into meaningful formats that support analytical needs and business requirements. Performance Optimization: Conduct performance tuning of PySpark code and Cloudera components, optimizing resource utilization and reducing runtime of ETL processes. Data Quality and Validation: Implement data quality checks, monitoring, and validation routines to ensure data accuracy and reliability throughout the pipeline. Automation and Orchestration: Automate data workflows using tools like Apache Oozie, Airflow, or similar orchestration tools within the Cloudera ecosystem. Monitoring and Maintenance: Monitor pipeline performance, troubleshoot issues, and perform routine maintenance on the Cloudera Data Platform and associated data processes. Collaboration: Work closely with other data engineers, analysts, product managers, and other stakeholders to understand data requirements and support various data-driven initiatives. Documentation: Maintain thorough documentation of data engineering processes, code, and pipeline configurations.
Category-wise Technical Skills:
PySpark: Advanced proficiency in PySpark, including working with RDDs, DataFrames, and optimization techniques. Cloudera Data Platform: Strong experience with Cloudera Data Platform (CDP) components, including Cloudera Manager, Hive, Impala, HDFS, and HBase. Data Warehousing: Knowledge of data warehousing concepts, ETL best practices, and experience with SQL-based tools (e.g., Hive, Impala). Big Data Technologies: Familiarity with Hadoop, Kafka, and other distributed computing tools. Orchestration and Scheduling: Experience with Apache Oozie, Airflow, or similar orchestration frameworks. Scripting and Automation: Strong scripting skills in Linux.
Experience:
5-12 years of experience as a Data Engineer, with a strong focus on PySpark and the Cloudera Data Platform. Proven track record of implementing data engineering best practices. Experience in data ingestion, transformation, and optimization on the Cloudera Data Platform.
Day-to-Day Activities:
Design, develop, and maintain ETL pipelines using PySpark on CDP. Implement and manage data ingestion processes from various sources. Process, cleanse, and transform large datasets using PySpark. Conduct performance tuning and optimization of ETL processes. Implement data quality checks and validation routines. Automate data workflows using orchestration tools. Monitor pipeline performance and troubleshoot issues. Collaborate with team members to understand data requirements. Maintain documentation of data engineering processes and configurations.
Qualifications:
Bachelor's or Master's degree in Computer Science, Data Engineering, Information Systems, or a related field. Relevant certifications in PySpark and Cloudera technologies are a plus.
Soft Skills:
Strong analytical and problem-solving skills. Excellent verbal and written communication abilities.
Ability to work independently and collaboratively in a team environment. Attention to detail and commitment to data quality.
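The data-quality responsibility described in this posting can be illustrated with a small sketch. This is a hypothetical, plain-Python example of the kind of validation routine a pipeline might run; in a real CDP pipeline the same rules would typically be expressed over PySpark DataFrames, and the field names here are invented for illustration.

```python
# Hypothetical illustration of "data quality checks and validation
# routines": split incoming records into clean rows and rejects,
# attaching a reason to each reject. Plain dicts keep it self-contained.

def validate_records(records, required_fields, non_negative_fields=()):
    """Return (clean, rejects); rejects are (row, reason) pairs."""
    clean, rejects = [], []
    for row in records:
        missing = [f for f in required_fields if row.get(f) in (None, "")]
        if missing:
            rejects.append((row, f"missing fields: {missing}"))
            continue
        bad = [f for f in non_negative_fields
               if not isinstance(row.get(f), (int, float)) or row[f] < 0]
        if bad:
            rejects.append((row, f"invalid numeric fields: {bad}"))
            continue
        clean.append(row)
    return clean, rejects

rows = [
    {"id": 1, "amount": 120.0},
    {"id": None, "amount": 5.0},   # fails the required-field check
    {"id": 3, "amount": -7.0},     # fails the non-negative check
]
clean, rejects = validate_records(rows, required_fields=["id"],
                                  non_negative_fields=["amount"])
```

Routing rejects to a quarantine table with their reasons, rather than dropping them silently, is what makes the "monitoring" half of the responsibility possible.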
Posted 2 weeks ago
8.0 - 13.0 years
18 - 25 Lacs
Pune
Work from Office
Job Requisition ID: JR1027352
Job Summary
Synechron is seeking an analytical and innovative Senior Data Scientist to support and advance our data-driven initiatives. The ideal candidate will have a solid understanding of data science principles, hands-on experience with AI/ML tools and techniques, and the ability to interpret complex data sets to deliver actionable insights. This role contributes to the organization's strategic decision-making and technology innovation by applying advanced analytics and machine learning models in a collaborative environment.
Software
Required Skills: Python (including libraries such as pandas, scikit-learn, TensorFlow, PyTorch) with proficiency in developing and deploying models; R (optional, but preferred); data management tools (SQL, NoSQL databases); cloud platforms (preferably AWS or Azure) for data storage and ML deployment; Jupyter Notebooks or similar interactive development environments; version control tools such as Git
Preferred Skills: Big data technologies (Spark, Hadoop); model deployment tools (MLflow, Docker, Kubernetes); data visualization tools (Tableau, Power BI)
Overall Responsibilities
Analyze and interpret large and complex data sets to generate insights for business and technology initiatives. Assist in designing, developing, and implementing AI/ML models and algorithms to solve real-world problems. Collaborate with cross-functional teams including data engineers, software developers, and business analysts to integrate models into production systems. Stay current with emerging trends, research, and best practices in AI/ML/Data Science and apply them to ongoing projects. Document methodologies, modeling approaches, and insights clearly for technical and non-technical stakeholders. Support model validation, testing, and performance monitoring to ensure accuracy and reliability. Contribute to the development of data science workflows and standards within the organization.
Performance Outcomes: Accurate and reliable data models that support strategic decision-making. Clear documentation and communication of findings and recommendations. Effective collaboration with technical teams to deploy scalable models. Continuous adoption of best practices in AI/ML and data management. Technical Skills (By Category) Programming Languages: Essential: Python (best practices in ML development), SQL Preferred: R, Java (for integration purposes) Databases/Data Management: SQL databases, NoSQL (MongoDB, Cassandra) Cloud data storage solutions (AWS S3, Azure Blob Storage) Cloud Technologies: AWS (S3, EC2, SageMaker, Lambda) Azure Machine Learning (preferred) Frameworks & Libraries: TensorFlow, PyTorch, scikit-learn, Keras, XGBoost Development Tools & Methodologies: Jupyter Notebooks, Git, CI/CD pipelines Agile and Scrum processes Security Protocols: Best practices in data security and privacy, GDPR compliance Experience 8+ years of professional experience in AI, ML, or Data Science roles. Proven hands-on experience designing and deploying ML models in real-world scenarios. Demonstrated ability to analyze complex data sets and translate findings into business insights. Previous experience working with cloud-based data science solutions is preferred. Strong portfolio showcasing data science projects, models developed, and practical impact. Alternative Pathways: Candidates with extensive research or academic experience in AI/ML can be considered, provided they demonstrate practical application of skills. Day-to-Day Activities Conduct data exploration, cleaning, feature engineering, and model development. Collaborate with data engineers to prepare data pipelines for model training. Build, validate, and refine machine learning models. Present insights, models, and recommendations to technical and business stakeholders. Support deployment of models into production environments. Monitor model performance and iterate to improve effectiveness. 
Participate in team meetings, project planning, and progress reviews. Document methodologies and maintain version control of the codebase.
Qualifications
Bachelor's degree in Computer Science, Mathematics, Statistics, Data Science, or a related field; Master's or PhD highly desirable. Evidence of relevant coursework, certifications, or professional training in AI/ML. Professional certifications (e.g., AWS Certified Machine Learning Specialty, Microsoft Certified Data Scientist) are a plus. Commitment to ongoing professional development in AI/ML methodologies.
Professional Competencies
Strong analytical and critical thinking to solve complex problems. Effective communication skills for technical and non-technical audiences. Demonstrated ability to work collaboratively in diverse teams. Aptitude for learning new tools, techniques, and technologies rapidly. Innovation mindset with a focus on applying emerging research. Strong organizational skills to manage multiple projects and priorities.
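The model validation and performance monitoring duties above boil down to computing metrics on batches of predictions. A minimal, framework-free sketch (in practice one would use scikit-learn's metrics module; the labels here are made up):

```python
# Hypothetical sketch of model performance monitoring: accuracy and
# precision for a batch of binary predictions, with no ML dependency.

def binary_metrics(y_true, y_pred):
    """Return (accuracy, precision) for two equal-length 0/1 label lists."""
    assert len(y_true) == len(y_pred) and y_true
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    correct = sum(1 for t, p in zip(y_true, y_pred) if t == p)
    accuracy = correct / len(y_true)
    precision = tp / (tp + fp) if (tp + fp) else 0.0
    return accuracy, precision

# 3 of 5 predictions correct; 2 true positives out of 3 positive calls.
acc, prec = binary_metrics([1, 0, 1, 1, 0], [1, 0, 0, 1, 1])
```

Tracking these numbers per batch over time is what turns a one-off validation into the ongoing monitoring the posting asks for.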
Posted 2 weeks ago
3.0 - 7.0 years
5 - 9 Lacs
Chennai
Work from Office
Overview
We are looking for a Full-stack Developer and Automation Engineer with knowledge of cloud, DevOps tools, and automation, plus excellent analytical, problem-solving, and communication skills.
You'll need to have:
Bachelor's degree or two or more years of work experience. Experience working with front-end and back-end technologies for building, enhancing, and managing applications. Experience with backend technologies like Python, Django, Java, and Spring Boot, and with frontend frameworks like ReactJS and NodeJS. Experience with client-side scripting technologies like JavaScript, jQuery, etc. Experience in advanced SQL/procedures on MySQL/MongoDB/MariaDB/Oracle. Experience using AWS Cloud Infrastructure services such as EC2, ALB, RDS, etc. Experience working with serverless technologies like AWS Lambda and Google/Azure Functions. Knowledge of the SDLC with DevOps tools and Agile development.
Even better if you have:
Experience in monitoring/alerting tools and platforms such as Prometheus, Grafana, Catchpoint, New Relic, etc. Experience with agile practices and tools used in development (Jira, Confluence, Jenkins, etc.). Experience in code review, quality, and performance tuning, with problem-solving and debugging skills. Experience with unit testing frameworks like JUnit and Mockito. Good communication and interpersonal skills to clearly articulate ideas and influence stakeholders. Very good problem-solving skills.
Posted 2 weeks ago
5.0 - 10.0 years
13 - 18 Lacs
Gurugram
Work from Office
Position Summary
To be a technology expert architecting solutions and mentoring people in BI/reporting processes, with prior expertise in the Pharma domain.
Job Responsibilities
o Technology Leadership – Lead and guide the team, independently or with little support, to design, implement, and deliver complex reporting and BI project assignments.
o Technical Portfolio – Expertise in a range of BI and hosting technologies like the AWS stack (Redshift, EC2), QlikView, QlikSense, Tableau, MicroStrategy, and Spotfire.
o Project Management – Get accurate briefs from the client and translate them into tasks for team members with priorities and timeline plans. Must maintain high standards of quality and thoroughness. Should be able to monitor accuracy and quality of others' work. Ability to think in advance about potential risks and mitigation plans.
o Logical Thinking – Able to think analytically, using a systematic and logical approach to analyze data, problems, and situations. Must be able to guide team members in analysis.
o Handle Client Relationship – Manage the client relationship and client expectations independently. Should be able to deliver results back to the client independently. Should have excellent communication skills.
Education
BE/B.Tech or Master of Computer Application
Work Experience
- Minimum of 5 years of relevant experience in the Pharma domain.
- Technical: Should have 10+ years of hands-on experience in the following tools. Must have working knowledge of at least 2 of the following: QlikView, QlikSense, Tableau, MicroStrategy, Spotfire / (Informatica, SSIS, Talend & Matillion) / Big Data technologies - Hadoop ecosystem. Aware of techniques such as UI design, report modeling, performance tuning, and regression testing. Basic expertise with MS Excel. Advanced expertise with SQL.
- Functional: Should have experience in the following concepts and technologies. Specifics: Pharma data sources like IMS, Veeva, Symphony, Cegedim, etc.
Business processes like alignment, market definition, segmentation, sales crediting, activity metrics calculation Calculation of all sales, activity and managed care KPIs Behavioural Competencies Teamwork & Leadership Motivation to Learn and Grow Ownership Cultural Fit Talent Management Technical Competencies Problem Solving Lifescience Knowledge Communication Project Management Attention to P&L Impact Capability Building / Thought Leadership Scale of revenues managed / delivered
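The "market definition" and sales KPI items above can be made concrete with a small sketch. This is an illustrative-only example of a market-share KPI computed within a market definition; the field names and market labels are hypothetical, not taken from IMS, Veeva, or any other source named in the posting.

```python
# Hypothetical sketch of a sales KPI: each product's share of total
# sales within its defined market.

from collections import defaultdict

def market_share(sales_rows):
    """sales_rows: iterable of (market, product, sales).
    Returns {(market, product): share_of_market_sales}."""
    market_totals = defaultdict(float)
    product_totals = defaultdict(float)
    for market, product, sales in sales_rows:
        market_totals[market] += sales
        product_totals[(market, product)] += sales
    return {key: amount / market_totals[key[0]]
            for key, amount in product_totals.items()
            if market_totals[key[0]]}

shares = market_share([
    ("statins", "BrandA", 600.0),
    ("statins", "BrandB", 400.0),
])
```

In a real BI stack this aggregation would usually live in SQL or the reporting tool's model layer; the point is only that KPIs like share are ratios over an explicitly defined market.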
Posted 2 weeks ago
5.0 - 10.0 years
30 - 35 Lacs
Noida
Work from Office
Position Summary
This position is part of the technical leadership in data warehousing and Business Intelligence areas. Someone who can work on multiple project streams and clients for better business decision making, especially in the Lifesciences/Pharmaceutical domain.
Job Responsibilities
o Technology Leadership – Lead and guide the team, independently or with little support, to design, implement, and deliver complex cloud data management and BI project assignments.
o Technical Portfolio – Expertise in a range of BI and data hosting technologies like the AWS stack (Redshift, EC2), Snowflake, Spark, full stack, Qlik, Tableau, MicroStrategy.
o Project Management – Get accurate briefs from the client and translate them into tasks for team members with priorities and timeline plans. Must maintain high standards of quality and thoroughness. Should be able to monitor accuracy and quality of others' work. Ability to think in advance about potential risks and mitigation plans.
o Logical Thinking – Able to think analytically, using a systematic and logical approach to analyze data, problems, and situations. Must be able to guide team members in analysis.
o Handle Client Relationship, P&L – Manage the client relationship and client expectations independently. Should be able to deliver results back to the client independently. Should have excellent communication skills.
Education
BE/B.Tech or Master of Computer Application
Work Experience
Minimum of 5 years of relevant experience in the Pharma domain.
Technical: Should have 15 years of hands-on experience in the following tools. Must have working knowledge of at least 2 of the following: QlikView, QlikSense, Tableau, MicroStrategy, Spotfire. Aware of techniques such as UI design, report modeling, performance tuning, and regression testing. Basic expertise with MS Excel. Advanced expertise with SQL.
Functional: Should have experience in the following concepts and technologies. Specifics: Pharma data sources like IMS, Veeva, Symphony, Cegedim, etc.
Business processes like alignment, market definition, segmentation, sales crediting, and activity metrics calculation.
Behavioural Competencies
Project Management; Communication; Attention to P&L Impact; Teamwork & Leadership; Motivation to Learn and Grow; Lifescience Knowledge; Ownership; Cultural Fit; Scale of resources managed; Scale of revenues managed/delivered; Problem solving; Talent Management; Capability Building/Thought Leadership
Technical Competencies
AWS Know-How; Formal Industry Certification: AWS Certified Cloud Practitioner; Snowflake; Data Engineering; Data Governance; Data Modelling; Data Operations (Service Management); Data Warehousing & Data Lake; Databricks; Dataiku; Formal Industry Certification: Informatica; Cloud Data Warehouse & Data Lake Modernization; Master Data Management; Patient Data Analytics Know-How; Pharma Commercial Data - US; Pharma Commercial Data - EU
Posted 2 weeks ago
4.0 - 6.0 years
6 - 8 Lacs
Hyderabad
Work from Office
What you will do In this vital role you will be responsible for designing, building, and maintaining scalable, secure, and reliable AWS cloud infrastructure. This is a hands-on engineering role requiring deep expertise in Infrastructure as Code (IaC), automation, cloud networking, and security . The ideal candidate should have strong AWS knowledge and be capable of writing and maintaining Terraform, CloudFormation, and CI/CD pipelines to streamline cloud deployments. Please note, this is an onsite role based in Hyderabad. Roles & Responsibilities: AWS Infrastructure Design & Implementation Architect, implement, and manage highly available AWS cloud environments . Design VPCs, Subnets, Security Groups, and IAM policies to enforce security standard processes. Optimize AWS costs using reserved instances, savings plans, and auto-scaling . Infrastructure as Code (IaC) & Automation Develop, maintain, and enhance Terraform & CloudFormation templates for cloud provisioning. Automate deployment, scaling, and monitoring using AWS-native tools & scripting. Implement and manage CI/CD pipelines for infrastructure and application deployments. Cloud Security & Compliance Enforce standard methodologies in IAM, encryption, and network security. Ensure compliance with SOC2, ISO27001, and NIST standards. Implement AWS Security Hub, GuardDuty, and WAF for threat detection and response. Monitoring & Performance Optimization Set up AWS CloudWatch, Prometheus, Grafana, and logging solutions for proactive monitoring. Implement autoscaling, load balancing, and caching strategies for performance optimization. Solve cloud infrastructure issues and conduct root cause analysis. Collaboration & DevOps Practices Work closely with software engineers, SREs, and DevOps teams to support deployments. Maintain GitOps standard methodologies for cloud infrastructure versioning. Support on-call rotation for high-priority cloud incidents. 
What we expect of you
We are all different, yet we all use our unique contributions to serve patients.
Basic Qualifications:
Master's degree and 4 to 6 years of experience in computer science, IT, or a related field, with hands-on cloud experience; OR Bachelor's degree and 6 to 8 years of experience in computer science, IT, or a related field, with hands-on cloud experience; OR Diploma and 10 to 12 years of experience in computer science, IT, or a related field, with hands-on cloud experience.
Must-Have Skills:
Deep hands-on experience with AWS (EC2, S3, RDS, Lambda, VPC, IAM, ECS/EKS, API Gateway, etc.). Expertise in Terraform & CloudFormation for AWS infrastructure automation. Strong knowledge of AWS networking (VPC, Direct Connect, Transit Gateway, VPN, Route 53). Experience with Linux administration, scripting (Python, Bash), and CI/CD tools (Jenkins, GitHub Actions, CodePipeline, etc.). Strong troubleshooting and debugging skills in cloud networking, storage, and security.
Preferred Qualifications:
Good-to-Have Skills: Experience with Kubernetes (EKS) and service mesh architectures. Knowledge of AWS Lambda and event-driven architectures. Familiarity with AWS CDK, Ansible, or Packer for cloud automation. Exposure to multi-cloud environments (Azure, GCP). Familiarity with HPC, DGX Cloud.
Professional Certifications (preferred): AWS Certified Solutions Architect Associate or Professional; AWS Certified DevOps Engineer Professional; Terraform Associate Certification.
Soft Skills:
Strong analytical and problem-solving skills. Ability to work effectively with global, virtual teams. Effective communication and collaboration with cross-functional teams. Ability to work in a fast-paced, cloud-first environment.
Shift Information: This position is required to be onsite and participate in 24/5 and weekend on-call in a rotation fashion, and may require you to work a later shift. Candidates must be willing and able to work off hours, as required based on business requirements.
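The Infrastructure-as-Code theme in this posting is the idea of describing resources declaratively as data. A minimal sketch of that idea, building a CloudFormation-style S3 bucket template programmatically; the bucket name is hypothetical, and real work in this role would use Terraform or CloudFormation source files rather than generating them from Python.

```python
# Hedged sketch of "resources as data": a minimal CloudFormation-style
# template for an S3 bucket with public access blocked, serialized to
# JSON. The bucket name is an assumption for illustration.

import json

def bucket_template(bucket_name):
    return {
        "AWSTemplateFormatVersion": "2010-09-09",
        "Resources": {
            "DataBucket": {
                "Type": "AWS::S3::Bucket",
                "Properties": {
                    "BucketName": bucket_name,
                    # Block public access -- mirrors the security
                    # posture the posting emphasises.
                    "PublicAccessBlockConfiguration": {
                        "BlockPublicAcls": True,
                        "RestrictPublicBuckets": True,
                    },
                },
            }
        },
    }

template_json = json.dumps(bucket_template("example-data-bucket"), indent=2)
```

Because the template is plain data, it can be diffed, reviewed, and version-controlled exactly like application code, which is the core benefit IaC brings to the compliance requirements listed above.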
Posted 2 weeks ago
6.0 - 8.0 years
7 - 11 Lacs
Bengaluru
Work from Office
What you will be doing -
30% • Manage and administer the AWS accounts for nonprod and assist in prod environments • Automate the creation of various AWS resources using Terraform, Ansible & Jenkins • Manage the existing resources that were deployed by PCF • Create various automation scripts to monitor the AWS resources and send email/Slack notifications on any spikes in the resource metrics • Ensure that the AWS accounts, including all resources, are in compliance with Maximus standards • Configure and monitor CloudTrail, CloudWatch logs, VPC flow logs, AWS Config rules, etc. • Monitor the AWS resources for cost optimization and recommend right-sizing of resources • Implement IAM strategies to ensure least-privilege access through IAM roles or service accounts • Participate in design/architecture meetings to document design decisions and cross-check with requirements
20% - Work on automating resource creation and monitoring using Terraform, Ansible, and Lambda.
20% - Support the existing infrastructure, including performing upgrades, security practices & monitoring the platform.
15% - Learn the supporting tools, including EKS, Apigee & PCF.
10% - Support non-prod environments, assist in supporting production environments, and be on call once every 7-10 weeks.
5% - Review and suggest cost-optimization tasks on AWS infrastructure usage.
Roles and Responsibilities
What we are looking for:
• Bachelor's degree from an accredited college or university required; equivalent experience considered in lieu of a degree
• Candidates will need to be senior level (6-8+ years overall experience, with at least 4-5 years of AWS experience)
• Ability to work during the EST time zone and support offshore customer teams as required
• Extensive experience provisioning and configuring various AWS resources, including EC2, load balancers, S3, KMS, Route 53, Lambda, EKS, etc.
• Extensive experience implementing security best practices on AWS, including IAM, CloudTrail, CloudWatch, AWS Config, etc.
• Good knowledge of one of the programming languages to write Lambda services for automation (preferably Python)
• Strong knowledge of networking within AWS, including VPC, subnets, Internet Gateway, Transit Gateway, peering, routing, etc.
• Strong knowledge of Terraform, including workspaces, remote state, modules, etc.
• Strong knowledge of the Linux operating system, including shell scripting
• Experience configuring AWS Elasticsearch (ELK) and configuring Logstash
• Good understanding of and experience with container technologies (Kubernetes/Docker)
• Good understanding of and experience provisioning and monitoring RDS
• Experience in monitoring tools such as Splunk, Nagios, AppDynamics, Dynatrace, or related tools
• Strong understanding of AWS Cost Explorer and cost analysis tools
• Should understand microservice architecture & the Spring Boot framework
• Should be aware of or know DevOps from a solutioning standpoint
• Should be able to independently evaluate products/services that would be deployed/integrated on/with the Platform • Should be willing to perform Production support work as On call Support • Understand and practice Agile development methodology • Willingness to do pair programming with other team members • Excellent organizational, interpersonal, verbal, and written communication skills • Ability to work as a team member, as well as independently
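The monitoring duty above, sending notifications on metric spikes, reduces to a simple rule over a time series. A hypothetical sketch of that rule: a real version would pull CloudWatch metrics via boto3 and post to Slack or email, but the spike-detection logic itself is framework-free, so the example runs on an in-memory series.

```python
# Hypothetical sketch of spike detection for resource metrics: flag any
# sample that exceeds `factor` times the mean of the preceding
# `window` samples.

def detect_spikes(samples, window=3, factor=2.0):
    """Return indices of samples that spike above the trailing baseline."""
    spikes = []
    for i in range(window, len(samples)):
        baseline = sum(samples[i - window:i]) / window
        if baseline > 0 and samples[i] > factor * baseline:
            spikes.append(i)
    return spikes

# CPU utilisation samples; index 3 jumps well above the ~21 baseline.
cpu = [20, 22, 21, 80, 23, 22]
spike_indexes = detect_spikes(cpu)
```

In practice one would also tune `window` and `factor` per metric, since a rule strict enough for CPU may be far too noisy for request counts.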
Posted 2 weeks ago
5.0 - 8.0 years
4 - 8 Lacs
Bengaluru
Work from Office
Role Purpose
The purpose of this role is to design, test, and maintain software programs for operating systems or applications which need to be deployed at a client end, and ensure they meet 100% quality assurance parameters.
Responsibilities:
Design and implement data modeling, data ingestion, and data processing for various datasets. Design, develop, and maintain an ETL framework for various new data sources. Develop data ingestion using AWS Glue/EMR and data pipelines using PySpark, Python, and Databricks. Build orchestration workflows using Airflow & Databricks Job workflows. Develop and execute ad hoc data ingestion to support business analytics. Proactively interact with vendors on any questions and report status accordingly. Explore and evaluate tools/services to support business requirements. Ability to help create a data-driven culture and impactful data strategies. Aptitude for learning new technologies and solving complex problems.
Qualifications:
Minimum of a bachelor's degree, preferably in Computer Science, Information Systems, or Information Technology. Minimum 5 years of experience on cloud platforms such as AWS, Azure, GCP. Minimum 5 years of experience in Amazon Web Services like VPC, S3, EC2, Redshift, RDS, EMR, Athena, IAM, Glue, DMS, Data Pipeline & API, Lambda, etc. Minimum of 5 years of experience in ETL and data engineering using Python, AWS Glue, AWS EMR/PySpark, and Airflow for orchestration. Minimum 2 years of experience in Databricks, including Unity Catalog, data engineering, Job workflow orchestration, and dashboard generation based on business requirements. Minimum 5 years of experience in SQL, Python, and source control such as Bitbucket, with CI/CD for code deployment. Experience in PostgreSQL, SQL Server, MySQL & Oracle databases. Experience in MPP such as AWS Redshift, AWS EMR, Databricks SQL warehouse & compute clusters.
Experience in distributed programming with Python, Unix scripting, MPP, and RDBMS databases for data integration. Experience building distributed high-performance systems using Spark/PySpark and AWS Glue, and developing applications for loading/streaming data into Databricks SQL warehouse & Redshift. Experience in Agile methodology. Proven skills in writing technical specifications for data extraction and good-quality code. Experience with big data processing techniques using Sqoop, Spark, and Hive is an additional plus. Experience in data visualization tools including Power BI and Tableau. Nice to have: experience building UIs using the Python Flask framework and Angular.
Mandatory Skills: Python for Insights.
Experience: 5-8 years.
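The Airflow orchestration work described in this posting is, at its core, running tasks in an order that respects their dependencies. A hedged sketch of that idea using only the standard library; the task names are made up, and a real DAG would be declared with Airflow's own operators rather than computed by hand.

```python
# Sketch of DAG-style orchestration: tasks plus "runs after"
# dependencies, resolved into an execution order with the standard
# library's topological sorter (Python 3.9+).

from graphlib import TopologicalSorter

# task -> set of upstream tasks that must finish first
dag = {
    "ingest_raw":      set(),
    "clean":           {"ingest_raw"},
    "load_warehouse":  {"clean"},
    "build_dashboard": {"load_warehouse"},
}
run_order = list(TopologicalSorter(dag).static_order())
```

`TopologicalSorter` also raises `CycleError` on circular dependencies, the same class of mistake an orchestrator must reject when a workflow is authored.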
Posted 2 weeks ago
7.0 - 9.0 years
11 - 12 Lacs
Hyderabad
Work from Office
We are seeking a highly skilled Java Full Stack Engineer to join our dynamic development team. In this role, you will be responsible for designing, developing, and maintaining both frontend and backend components of our applications using Java and associated technologies. You will collaborate with cross-functional teams to deliver robust, scalable, and high-performing software solutions that meet our business needs. The ideal candidate will have a strong background in Java programming, experience with modern frontend frameworks, and a passion for full-stack development.

Requirements:
- Bachelor's degree in Computer Science, Engineering, or a related field.
- 7 to 9+ years of experience in full-stack development, with a strong focus on Java.

Java Full Stack Developer Roles & Responsibilities:
- Develop scalable web applications using Java (Spring Boot) for the backend and React/Angular for the frontend.
- Implement RESTful APIs to facilitate communication between frontend and backend.
- Design and manage databases using MySQL, PostgreSQL, Oracle, or MongoDB.
- Write complex SQL queries and stored procedures, and perform database optimization.
- Build responsive, user-friendly interfaces using HTML, CSS, JavaScript, and frameworks like Bootstrap, React, Angular, and Node.js, with Python integration.
- Integrate APIs with frontend components.
- Participate in designing microservices and modular architecture.
- Apply design patterns and object-oriented programming (OOP) concepts.
- Write unit and integration tests using JUnit, Mockito, Selenium, or Cypress.
- Debug and fix bugs across full-stack components.
- Use Git, Jenkins, Docker, and Kubernetes for version control, continuous integration, and deployment.
- Participate in code reviews, automation, and monitoring.
- Deploy applications on AWS, Azure, or Google Cloud platforms.
- Use Elastic Beanstalk, EC2, S3, or Cloud Run for backend hosting.
- Work in Agile/Scrum teams; attend daily stand-ups, sprints, and retrospectives, and deliver iterative enhancements.
- Document code, APIs, and configurations.
- Collaborate with QA, DevOps, Product Owners, and other stakeholders.

Must-Have Skills:
- Java Programming: deep knowledge of the Java language, its ecosystem, and best practices.
- Frontend Technologies: proficiency in HTML, CSS, JavaScript, and modern frontend frameworks such as React or Angular.
- Backend Development: expertise in developing and maintaining backend services using Java, Spring, and related technologies.
- Full Stack Development: experience in both frontend and backend development, with the ability to work across the entire application stack.

Soft Skills:
- Problem-Solving: ability to analyze complex problems and develop effective solutions.
- Communication: strong verbal and written communication skills to collaborate effectively with cross-functional teams.
- Analytical Thinking: ability to think critically and analytically to solve technical challenges.
- Time Management: capable of managing multiple tasks and deadlines in a fast-paced environment.
- Adaptability: ability to quickly learn and adapt to new technologies and methodologies.

Interview Mode: face-to-face for candidates residing in Hyderabad; Zoom for other states.
Location: 43/A, MLA Colony, Road No. 12, Banjara Hills, 500034
Time: 2 - 4 pm
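The database duties listed above (complex queries, database optimization) apply across the listed engines. As a hedged sketch, Python's built-in sqlite3 module can stand in for MySQL/PostgreSQL to show the kind of check an engineer runs when optimizing a query: compare the query plan before and after adding an index. The table and column names here are illustrative, not from the posting:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer TEXT, total REAL)")
conn.executemany(
    "INSERT INTO orders (customer, total) VALUES (?, ?)",
    [("alice", 10.0), ("bob", 25.0), ("alice", 5.0)],
)

# Without an index, filtering on `customer` forces a full-table scan.
plan = conn.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM orders WHERE customer = 'alice'"
).fetchone()
print(plan[-1])  # plan detail mentions a scan of `orders`

# Adding an index lets the engine seek directly to matching rows.
conn.execute("CREATE INDEX idx_orders_customer ON orders (customer)")
plan = conn.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM orders WHERE customer = 'alice'"
).fetchone()
print(plan[-1])  # plan detail now references idx_orders_customer
```

The same workflow applies to the production databases named in the posting, using each engine's own `EXPLAIN` output.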
Posted 2 weeks ago
6.0 - 10.0 years
15 - 25 Lacs
Gurugram
Work from Office
Who We Are
At Kyndryl, we design, build, manage and modernize the mission-critical technology systems that the world depends on every day. So why work at Kyndryl? We are always moving forward – always pushing ourselves to go further in our efforts to build a more equitable, inclusive world for our employees, our customers and our communities.

The Role
Kyndryl is seeking a talented AWS Solutions Expert. The ideal candidate will have hands-on experience in designing, building, and migrating complex workloads to AWS. In this role, you will be instrumental in developing comprehensive cloud strategies, creating scalable and secure AWS environments, guiding clients through seamless migrations from on-premises infrastructure to the cloud, and managing Cloud BAU operations.

Your responsibilities will include developing and implementing scalable AWS architectures, focusing on modernizing VMware workloads and ensuring robust, secure, and cost-effective cloud environments. You will design and deploy AWS landing zones to establish a secure, multi-account AWS environment, adhering to best practices for compliance, security, and account management. Leading initiatives to modernize on-premises VMware environments by leveraging AWS services (e.g., VMware to EC2) and developing strategies for re-platforming and refactoring applications will be a key part of your role. Additionally, you will manage the planning and execution of cloud migration projects, including assessments, dependency mapping, migration strategy, and risk mitigation. Partnering with clients to understand their business requirements, define project scope, and ensure successful cloud transformation outcomes will be essential. Ensuring all designs and implementations comply with AWS best practices, security policies, and relevant industry standards is crucial. You will utilize Infrastructure-as-Code (IaC) tools, such as AWS CloudFormation, Terraform, and Ansible, to automate deployment and management processes.
Optimizing cloud environments for cost, performance, and scalability while following AWS best practices and governance frameworks will be part of your responsibilities. Finally, you will produce and maintain detailed architectural documentation, operational procedures, and best practice guidelines, and provide training to internal teams and clients on cloud adoption and management.

Your future at Kyndryl
This role opens the door to many career paths, both vertical and horizontal, and there may be opportunity to travel. It's a great chance for database administrators or other techs to break into the cloud. It's also a solid path to becoming an enterprise architect or a distinguished engineer! Whatever you see for yourself, you'll find the opportunity here.

Who You Are
You're good at what you do and possess the required experience to prove it. However, equally as important – you have a growth mindset, keen to drive your own personal and professional development. You are customer-focused – someone who prioritizes customer success in their work. And finally, you're open and borderless – naturally inclusive in how you work with others.

Required Technical and Professional Expertise
- Minimum of 8+ years of experience as a solutions expert or in a similar role, with at least 5+ years working specifically with AWS.
- Proficiency with AWS services (VPC, EC2, IAM, Lambda, S3, RDS, CloudTrail, etc.) and Infrastructure-as-Code (IaC) tools such as CloudFormation, Terraform, or Ansible.
- Knowledge of AWS security practices, compliance requirements, and tools (IAM, AWS Organizations, AWS Control Tower, and GuardDuty).
- Familiarity with CI/CD pipelines and automation frameworks to support continuous delivery and integration.
- Good understanding of, and hands-on experience managing, DynamoDB, Kinesis, Lambda, SQS, SNS, and DevOps services.
- Hands-on experience with DevOps and CI/CD automation tools such as Git, Jenkins, Terraform, and Ansible, and with cloud-native scripting such as CloudFormation and ARM templates.
- Experience supporting CDK development using TypeScript.
- Strong knowledge of certificate management, encryption key management, load balancers, etc.
- Ability to draw architecture diagrams, develop HLDs and LLDs, and produce other documentation as required for the solution developed.
- Hands-on experience with VMware, including familiarity with VMware-to-EC2 migrations, VMware NSX, vSphere, and vSAN.
- Proven experience in designing and building AWS landing zones to support secure, multi-account architectures.
- Strong experience with cloud migration projects, including workload assessment, migration planning, and execution, with expertise in tools such as AWS Migration Hub and AWS Server Migration Service.
- Hands-on experience with Datadog integration setup and management.
- Problem-solving and conflict-resolution skills.

Preferred Technical and Professional Experience
- AWS Certified Solutions Architect (Associate or Professional) and/or a VMware certification (e.g., VMware Certified Professional - VCP).
- Experience in hybrid cloud environments and multi-cloud strategies.
- Familiarity with serverless architecture and microservices design.
- Excellent communication and interpersonal skills, with the ability to convey complex technical concepts to both technical and non-technical stakeholders.

Being You
Diversity is a whole lot more than what we look like or where we come from; it's how we think and who we are. We welcome people of all cultures, backgrounds, and experiences. But we're not doing it single-handedly: our Kyndryl Inclusion Networks are only one of many ways we create a workplace where all Kyndryls can find and provide support and advice.
This dedication to welcoming everyone into our company means that Kyndryl gives you – and everyone next to you – the ability to bring your whole self to work, individually and collectively, and support the activation of our equitable culture. That's the Kyndryl Way.

What You Can Expect
With state-of-the-art resources and Fortune 100 clients, every day is an opportunity to innovate, build new capabilities, new relationships, new processes, and new value. Kyndryl cares about your well-being and prides itself on offering benefits that give you choice, reflect the diversity of our employees, and support you and your family through the moments that matter – wherever you are in your life journey. Our employee learning programs give you access to the best learning in the industry to receive certifications, including Microsoft, Google, Amazon, Skillsoft, and many more. Through our company-wide volunteering and giving platform, you can donate, start fundraisers, volunteer, and search over 2 million non-profit organizations. At Kyndryl, we invest heavily in you; we want you to succeed so that together, we will all succeed.

Get Referred!
If you know someone who works at Kyndryl, when asked 'How Did You Hear About Us' during the application process, select 'Employee Referral' and enter your contact's Kyndryl email address.
Posted 2 weeks ago
8.0 - 12.0 years
30 - 40 Lacs
Pune
Work from Office
Assessment & Analysis
- Review CAST software intelligence reports to identify technical debt, architectural flaws, and cloud readiness.
- Conduct manual assessments of applications to validate findings and prioritize migration efforts.
- Identify refactoring needs (e.g., monolithic to microservices, serverless adoption).
- Evaluate legacy systems (e.g., .NET Framework, Java EE) for compatibility with AWS services.

Solution Design
- Develop migration strategies (rehost, replatform, refactor, retire) for each application.
- Architect AWS-native solutions using services like EC2, Lambda, RDS, S3, and EKS.
- Design modernization plans for legacy systems (e.g., .NET Framework to .NET Core, Java EE to Spring Boot).
- Ensure compliance with the AWS Well-Architected Framework (security, reliability, performance, cost optimization).

Collaboration & Leadership
- Work with cross-functional teams (developers, DevOps, security) to validate designs.
- Partner with clients to align technical solutions with business objectives.
- Mentor junior architects and engineers on AWS best practices.

Roles and Responsibilities
Job Title: Senior Solution Architect - Cloud Migration & Modernization (AWS)
Location: [Insert Location]
Department: Digital Services
Reports To: Cloud SL
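The migration strategies named above (rehost, replatform, refactor, retire) are typically assigned per application from assessment signals. A minimal sketch of that triage logic follows; the rules and field names are illustrative assumptions, not the posting's actual methodology:

```python
def migration_strategy(app: dict) -> str:
    """Pick a coarse migration strategy from assessment signals (illustrative rules only)."""
    if not app.get("still_needed", True):
        return "retire"  # no business need -> decommission
    if app.get("cloud_incompatible"):
        return "refactor"  # e.g. deep OS/hardware coupling requires a rewrite
    if app.get("managed_service_available"):
        return "replatform"  # e.g. self-hosted database -> RDS
    return "rehost"  # default lift-and-shift

apps = [
    {"name": "legacy-reports", "still_needed": False},
    {"name": "order-db", "managed_service_available": True},
    {"name": "hr-portal"},
]
print({a["name"]: migration_strategy(a) for a in apps})
```

In practice the inputs would come from the CAST reports and manual assessments described above, and the decision table would be far richer (cost, compliance, dependencies), but the per-application classification shape is the same.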
Posted 2 weeks ago