2 - 6 years
12 - 16 Lacs
Bengaluru
Work from Office
Role and responsibilities:
- Design, construct, install, test, and maintain highly scalable data management systems using big data technologies such as Apache Spark (with a focus on Spark SQL) and Hive.
- Manage and optimize our data warehousing solutions, with a strong emphasis on SQL performance tuning.
- Implement ETL/ELT processes using tools like Talend or custom scripts, ensuring efficient data flow and transformation across our systems.
- Utilize AWS services, including S3, EC2, and EMR, to build and manage scalable, secure, and reliable cloud-based solutions.
- Develop and deploy scripts in Linux environments, demonstrating proficiency in shell scripting.
- Utilize scheduling tools such as Airflow or Control-M to automate data processes and workflows (a sketch follows below).
- Implement and maintain metadata-driven frameworks, promoting reusability, efficiency, and data governance.
- Collaborate closely with DevOps teams using SDLC tools such as Bamboo, JIRA, Bitbucket, and Confluence to ensure seamless integration of data systems into the software development lifecycle.
- Communicate effectively with both technical and non-technical stakeholders for handovers, incident management reporting, etc.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
- Demonstrated expertise in big data technologies, specifically Apache Spark (focus on Spark SQL) and Hive.
- Extensive experience with AWS services, including S3, EC2, and EMR.
- Strong expertise in data warehousing and SQL, with experience in performance optimization.
- Experience with ETL/ELT implementation (such as Talend).
- Proficiency in Linux, with a strong background in shell scripting.

Preferred technical and professional experience:
- Familiarity with scheduling tools like Airflow or Control-M.
- Experience with metadata-driven frameworks.
- Knowledge of DevOps tools such as Bamboo, JIRA, Bitbucket, and Confluence.
- Excellent communication skills and a willingness to learn.
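To illustrate the scheduling side of the role, here is a minimal sketch of an Airflow DAG chaining an extract step into a Spark SQL transform, assuming Airflow 2.x (2.4+ for the `schedule` argument); the DAG id, bucket path, and script name are hypothetical.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

# Minimal nightly pipeline: land raw files, then run a Spark SQL transform.
with DAG(
    dag_id="nightly_spark_load",        # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule="0 2 * * *",               # daily at 02:00 (Airflow 2.4+ syntax)
    catchup=False,
) as dag:
    extract = BashOperator(
        task_id="extract_to_s3",
        bash_command="echo 'pull source extracts to s3://example-lake/raw/'",
    )
    transform = BashOperator(
        task_id="spark_sql_transform",
        bash_command="spark-submit --deploy-mode cluster transform.py",
    )
    extract >> transform  # run the transform only after extraction succeeds
```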
Posted 1 month ago
1 - 5 years
6 - 10 Lacs
Pune
Work from Office
About The Role
Job Title: Associate Engineer
Corporate Title: Associate
Location: Pune, India

Role Description
The Associate Engineer is responsible for performing development work across multiple areas of the bank's overall IT platform/infrastructure, including analysis, development, and administration. It may also involve taking functional oversight of engineering delivery for specific departments. Work includes:
- Planning and developing entire engineering solutions to accomplish business goals
- Building reliability and resiliency into solutions with appropriate testing and reviewing throughout the delivery lifecycle
- Ensuring maintainability and reusability of engineering solutions
- Ensuring solutions are well architected and can be integrated successfully into the end-to-end business process flow
- Reviewing engineering plans and quality to drive re-use and improve engineering capability
- Participating in industry forums to drive adoption of innovative technologies, tools and solutions in the Bank

What we'll offer you
As part of our flexible scheme, here are just some of the benefits that you'll enjoy:
- Best-in-class leave policy
- Gender-neutral parental leave
- 100% reimbursement under the childcare assistance benefit (gender neutral)
- Sponsorship for industry-relevant certifications and education
- Employee Assistance Program for you and your family members
- Comprehensive hospitalization insurance for you and your dependents
- Accident and term life insurance
- Complimentary health screening for those 35 yrs. and above

Your key responsibilities
This role is for an engineer responsible for the design, development, and unit testing of software applications. The candidate is expected to ensure that good-quality, maintainable, scalable, and high-performing software applications are delivered to users in an Agile development environment. You should come from a strong technological background, have experience working in Google Cloud Platform, be hands-on, and be able to work independently with minimal technical/tool guidance.

Your skills and experience
- Java solution design and development experience
- Java Spring Boot development experience
- Practical, applied knowledge of design patterns (and anti-patterns) in Java in general and Java Spring Boot specifically
- Hands-on experience working with APIs and microservices, integrating external and internal web services including SOAP, XML, REST, and JSON
- Hands-on experience in Google Cloud Platform
- Experience with the cloud development platform Spring Cloud; OpenShift/Kubernetes/Docker configuration and deployment with DevOps tools, e.g., Git, TeamCity, Maven, SONAR
- Experience with software design patterns and UML design
- Experience in integration design patterns with Kafka
- Experience with an Agile/Scrum environment; familiar with Agile team management tools (JIRA, Confluence)
- Understands and promotes the Agile values FROCC (Focus, Respect, Openness, Commitment, Courage)
- Good communication skills; proactive team player; comfortable working in multi-disciplinary, self-organized teams
- Professional knowledge of English
- Differentiators: knowledge/experience about

How we'll support you
- Training and development to help you excel in your career
- Coaching and support from experts in your team
- A culture of continuous learning to aid progression
- A range of flexible benefits that you can tailor to suit your needs
Posted 1 month ago
4 - 9 years
16 - 20 Lacs
Pune
Work from Office
About The Role
Job Title: IT Application Owner, AS
Location: Pune, India

Role Description
Deutsche Bank's Strategy & Innovation Engineering team identifies, evaluates, and incubates cutting-edge technical innovation. It is part of the Chief Strategy Office of the bank's Technology, Data & Innovation (TDI) function and works globally with all business lines and infrastructure functions of the bank. A focus of the team is to create value for clients and the bank using Artificial Intelligence, Large Language Models (LLMs) and other advanced data-driven technologies. As an ITAO, you will join the innovation engineering team and contribute to the support and management of new AI products and services for the entire Deutsche Bank Group. We require technical specialists to help research, design, and implement state-of-the-art AI services, with a particular focus on performing technology evaluations of AI products. You will make a real difference for senior stakeholders across core banking functions, where computational, complexity, and efficiency challenges abound, through your own delivery and through the promotion of modern AI development best practices and techniques.

Overview
We are seeking a talented and experienced AI Engineer to join our team. The ideal candidate will be hands-on and drive the design, development, and implementation of AI-based solutions for CB Tech. This role involves working with large datasets, conducting experiments, and staying updated with the latest advancements in AI and Machine Learning. This person is expected to innovate and support the Innovation team's tech efforts in modernizing the engineering landscape by identifying AI use cases, and to provide local support by owning the ITAO role for the Bank's AI Platform. If you carry an engineering mindset, have a passion for AI, and want to be part of developing innovative products, then apply today.

What we'll offer you
As part of our flexible scheme, here are just some of the benefits that you'll enjoy:
- Best-in-class leave policy
- Gender-neutral parental leave
- 100% reimbursement under the childcare assistance benefit (gender neutral)
- Sponsorship for industry-relevant certifications and education
- Employee Assistance Program for you and your family members
- Comprehensive hospitalization insurance for you and your dependents
- Accident and term life insurance
- Complimentary health screening for those 35 yrs. and above

Your key responsibilities
The IT Application Owner (ITAO) is responsible for application management and has to ensure that the applications are enhanced and maintained in accordance with the Bank's IT Policy requirements on application lifecycle governance.
- Design, develop, and deploy solutions using advanced analytics, machine learning/AI, and cloud technologies that fulfil Deutsche Bank's innovation strategy.
- Contribute to effective and efficient technical research and experiments through technology evaluations, publishing AI research reports, and building proofs of concept (POCs).
- Engage with business stakeholders to identify and evaluate opportunities to create value through innovative solutions.
- Foster adoption of AI and ML by collaborating with cross-functional teams and educating stakeholders on AI-driven solutions.
- Stay up to date with the latest advancements in AI and data science.
- Ongoing enhancement and maintenance of the application, including management of scope; ensure that changes to the applications in scope are fully aligned with DB standards and regulations.
- The main focus is to guarantee system stability and to ensure a smooth and successful transition to a production, steady-state environment.
- Conduct strategic planning for the application; manage strategic capacity, consumption, and performance (forecasting and management based on business plans).
- Ensure policy compliance for the application; facilitate and contribute to audit activities.
- Manage software licenses, security certificates, and contracts with service providers; ensure documentation availability.
- Identify and manage technical projects necessary to ensure required and established service levels are maintained.
- Work with the development center team to estimate work effort throughout different phases of the functional domain deliverables.
- Assist with development of configuration/monitoring/packaging/deployment/automation of AI platforms.
- Identify, document, and communicate risks and issues discovered during the delivery cycle.

Your skills and experience
- Excellent communication and presentation skills; highly organized and disciplined.
- Experienced in working with multiple stakeholders; able to create and naturally maintain good business relationships with all stakeholders.
- IT Service Management, IT Governance or IT Project Management background; awareness of the ITAO and TISO roles, and of compliance, risk, and governance concepts in the financial industry.
- Comfortable working in VUCA (Volatility, Uncertainty, Complexity, Ambiguity) and highly dynamic environments.
- The ITAO will typically have rather limited hands-on technical involvement; a high-level understanding of the products/technologies below is welcomed:
  - Google Cloud: GKE, Terraform, IAM, BigQuery, Cloud Shell, Cloud Storage
  - AI/ML: AI agents, AI/ML concepts, ML models, Vertex AI, AutoML, BigQuery ML
  - MLOps & CI/CD pipelines: Kubeflow, Vertex AI Pipelines
  - Designing, deploying, and managing AI agents, e.g., chatbots and virtual assistants
  - GCP networking: networking protocols, security concepts, VPC, load balancers
  - Unix servers: very basic administration
  - Python, shell scripting, SQL
  - Familiarity with fine-tuning and deploying large language models on GCP
  - Understanding of security best practices, including data governance, encryption, and compliance with AI-related regulations
  - GCP: Cloud Logging, Cloud Monitoring, and AI model performance tracking
- 6+ years of work experience in IT (for AVP 6+, Associate 4+).
- Strong problem-solving skills and a passion for AI research.
- Good interpersonal skills, with the ability to cooperate and collaborate with other teams.

Educational Qualifications
- B.E. / B.Tech. / Master's degree in computer science or equivalent
- Added advantage: GCP certifications, Kubernetes certifications, an AI/ML educational background, certifications, or higher qualifications

How we'll support you
- Training and development to help you excel in your career
- Coaching and support from experts in your team
- A culture of continuous learning to aid progression
- A range of flexible benefits that you can tailor to suit your needs
Posted 1 month ago
2 - 6 years
5 - 9 Lacs
Hyderabad
Work from Office
AWS Data Engineer

As an AWS Data Engineer, you will contribute to our client's projects and will have the below responsibilities:
- Work with the technical development team and team lead to understand desired application capabilities.
- Develop using application development lifecycles and continuous integration/deployment practices.
- Work to integrate open-source components into data-analytic solutions.
- Continuously learn and share learnings with others.

Required: 5+ years of directly applicable experience, with key focus on Glue and Python, AWS, and data pipeline creation:
- Develop code using Python, such as:
  - Developing data pipelines from various external data sources to internal data stores.
  - Using Glue for extracting data from the design database (a sketch follows below).
  - Developing Python APIs as needed.
- Minimum 3 years of hands-on experience in Amazon Web Services, including EC2, VPC, S3, EBS, ELB, CloudFront, IAM, RDS, CloudWatch.
- Able to interpret business requirements and analyze, design, and develop applications on AWS Cloud and ETL technologies.
- Able to design and architect serverless applications using AWS Lambda, EMR, and DynamoDB.
- Ability to leverage AWS data migration tools and technologies, including Storage Gateway, Database Migration, and Import/Export services.
- Understands relational database design, stored procedures, triggers, user-defined functions, and SQL jobs.
- Familiar with CI/CD tools, e.g., Jenkins and UCD, for automated application deployments.
- Understanding of OLAP, OLTP, Star Schema, Snowflake Schema, and Logical/Physical/Dimensional data modeling.
- Ability to extract data from multiple operational sources and load it into staging, data warehouses, data marts, etc. using SCD (Type 1/Type 2/Type 3/Hybrid) loads.
- Familiar with Software Development Life Cycle (SDLC) stages in Waterfall and Agile environments.

Nice to have:
- Familiarity with source control management tools for branching, merging, labeling/tagging, and integration, such as Git and SVN.
- Experience working with UNIX/Linux environments.
- Hands-on experience with IDEs such as Jupyter Notebook.

Education & Certification: University degree or diploma and applicable years of experience.

Job Segment: Developer, Open Source, Data Warehouse, Cloud, Database, Technology
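As a rough illustration of the Glue-plus-Python work described above, a sketch of a Glue job that reads a cataloged table, filters out bad records, and writes Parquet to S3. It only runs inside an AWS Glue job environment, and the database, table, and bucket names are made up.

```python
import sys

from awsglue.context import GlueContext
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext

# Glue passes --JOB_NAME to the job at run time.
args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext.getOrCreate())

# Read a table previously crawled into the Glue Data Catalog.
source = glue_context.create_dynamic_frame.from_catalog(
    database="sales_db",        # hypothetical catalog database
    table_name="orders_raw",    # hypothetical table
)

# Drop records missing a primary key, then write curated Parquet to S3.
clean = source.filter(lambda rec: rec["order_id"] is not None)
glue_context.write_dynamic_frame.from_options(
    frame=clean,
    connection_type="s3",
    connection_options={"path": "s3://example-lake/curated/orders/"},
    format="parquet",
)
```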
Posted 1 month ago
2 - 5 years
5 - 9 Lacs
Pune
Work from Office
Req ID: 319541

NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking a SF Sales Cloud Developer to join our team in Pune, Maharashtra (IN-MH), India (IN).

Required Experience: 3-5 years of experience in Sales Cloud, covering the responsibilities and technical skills noted below.

Salesforce-specific responsibilities:
- Design, develop, and implement solutions and integrations: develop custom applications using Apex, Visualforce, and Lightning Web Components (LWC); implement automation using Salesforce Flow, Process Builder, and Apex triggers.
- Implement best practices for data management and security.
- Participate in code reviews and receive/provide constructive feedback.
- Implement analytics and tracking solutions to gather user behavior data.
- Configure and customize Salesforce environments, including workflows, security models, automation, and reporting tools; configure Salesforce security models, including roles, profiles, and sharing rules.
- Implement Salesforce DevOps strategies, including CI/CD pipelines, version control, and automated deployments; implement CI/CD pipelines using tools like Copado, Gearset, or Jenkins; manage version control using Git or similar tools.

#Salesforce

About NTT DATA
NTT DATA is a $30 billion trusted global innovator of business and technology services. We serve 75% of the Fortune Global 100 and are committed to helping clients innovate, optimize and transform for long-term success. As a Global Top Employer, we have diverse experts in more than 50 countries and a robust partner ecosystem of established and start-up companies. Our services include business and technology consulting, data and artificial intelligence, industry solutions, as well as the development, implementation and management of applications, infrastructure and connectivity. We are one of the leading providers of digital and AI infrastructure in the world. NTT DATA is a part of NTT Group, which invests over $3.6 billion each year in R&D to help organizations and society move confidently and sustainably into the digital future. Visit us at us.nttdata.com

NTT DATA endeavors to make https://us.nttdata.com accessible to any and all users. If you would like to contact us regarding the accessibility of our website or need assistance completing the application process, please contact us at https://us.nttdata.com/en/contact-us. This contact information is for accommodation requests only and cannot be used to inquire about the status of applications. NTT DATA is an equal opportunity employer. Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or protected veteran status. For our EEO Policy Statement, please click here. If you'd like more information on your EEO rights under the law, please click here. For Pay Transparency information, please click here.

Job Segment: Consulting, Cloud, Information Technology, Programmer, Technology, Sales
Posted 1 month ago
1 - 5 years
1 - 5 Lacs
Chennai
Work from Office
Req ID: 319099

NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking an AWS Developer to join our team in Chennai, Tamil Nadu (IN-TN), India (IN).

AWS Developer
- 7+ years of experience.
- Primary skills: C# and Python CDK.
- AWS services we need expertise in: CDK with Python, Amazon AppFlow, Step Functions, Lambda, S3, EventBridge, CloudWatch/CloudTrail/X-Ray, GitHub.

About NTT DATA
NTT DATA is a $30 billion trusted global innovator of business and technology services. We serve 75% of the Fortune Global 100 and are committed to helping clients innovate, optimize and transform for long-term success. As a Global Top Employer, we have diverse experts in more than 50 countries and a robust partner ecosystem of established and start-up companies. Our services include business and technology consulting, data and artificial intelligence, industry solutions, as well as the development, implementation and management of applications, infrastructure and connectivity. We are one of the leading providers of digital and AI infrastructure in the world. NTT DATA is a part of NTT Group, which invests over $3.6 billion each year in R&D to help organizations and society move confidently and sustainably into the digital future. Visit us at us.nttdata.com

NTT DATA endeavors to make https://us.nttdata.com accessible to any and all users. If you would like to contact us regarding the accessibility of our website or need assistance completing the application process, please contact us at https://us.nttdata.com/en/contact-us. This contact information is for accommodation requests only and cannot be used to inquire about the status of applications. NTT DATA is an equal opportunity employer. Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or protected veteran status. For our EEO Policy Statement, please click here. If you'd like more information on your EEO rights under the law, please click here. For Pay Transparency information, please click here.

Job Segment: Developer, Consulting, Information Technology, Programmer, Technology
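Since the posting calls out CDK with Python, S3, and Lambda specifically, a minimal sketch of what such a stack might look like, assuming CDK v2; the construct ids, asset directory, and runtime choice are illustrative assumptions.

```python
from aws_cdk import App, Duration, Stack
from aws_cdk import aws_lambda as _lambda
from aws_cdk import aws_s3 as s3
from constructs import Construct

class IngestStack(Stack):
    """Hypothetical stack pairing an S3 bucket with a processing Lambda."""

    def __init__(self, scope: Construct, construct_id: str, **kwargs) -> None:
        super().__init__(scope, construct_id, **kwargs)

        bucket = s3.Bucket(self, "LandingBucket")  # raw file drop zone

        _lambda.Function(
            self, "ProcessorFn",
            runtime=_lambda.Runtime.PYTHON_3_11,
            handler="app.handler",
            code=_lambda.Code.from_asset("lambda_src"),  # local source dir
            timeout=Duration.seconds(30),
            environment={"BUCKET_NAME": bucket.bucket_name},
        )

app = App()
IngestStack(app, "IngestStack")
app.synth()
```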
Posted 1 month ago
6 - 10 years
6 - 15 Lacs
Gurugram
Work from Office
Requirements Elicitation, Understanding, Analysis, & Management
- Understand the project's vision and requirements, and contribute to the creation of the supplemental requirements, building the low-level technical specifications for a particular platform and/or service solution.

Project Planning, Tracking, & Reporting
- Estimate the tasks and resources required to design, create (build), and test the code for assigned module(s).
- Provide inputs in creating the detailed schedule for the project.
- Support the team in project planning activities, in evaluating risks, and in shuffling priorities based on unresolved issues.
- During development and testing, ensure that assigned parts of the project/modules are on track with respect to schedules and quality.
- Note scope changes within the assigned modules and work with the team to shuffle priorities accordingly.
- Communicate regularly with the team about development changes, scheduling, and status.
- Participate in project review meetings.
- Track and report progress for assigned modules.

Design
- Create a detailed low-level design (LLD) for the assigned piece(s) with possible alternate solutions.
- Ensure that the LLD meets business requirements.
- Submit the LLD for review.
- Revise the LLD for the assigned piece(s) based on the comments received from the team.

Development & Support
- Build the code of high-priority and complex systems according to the functional specifications, detailed design, maintainability, and coding and efficiency standards.
- Use code management processes and tools to avoid versioning problems.
- Ensure that the code does not affect the functioning of any external or internal systems.
- Perform peer reviews of code to ensure it meets coding and efficiency standards.
- Act as the primary reviewer of application code created by software engineers to ensure compliance with defined standards, and recommend changes to the code as required.

Testing & Debugging
- Attend the test design walkthroughs to help verify that the plans and conditions will test all functions and features effectively.
- Perform impact analysis for issues assigned to self and software engineers.
- Actively assist with project- and code-level problem solving, such as suggesting paths to explore when testing engineers or software engineers encounter a debugging problem, and escalate urgent issues.

Documentation
- Review technical documentation for the code for accuracy, completeness, and usability.
- Document and maintain the reviews conducted and the unit test results.

Process Management
- Adhere to the project and support processes.
- Adhere to best practices and comply with approved policies, procedures, and methodologies, such as the SDLC cycle for different project sizes.
- Show responsibility for corporate funds, materials, and resources.
- Ensure adherence to SDLC and audit requirements.
Job Description

Must-have skills:
- 7+ years of MySQL DBA experience
- MySQL replication experience
- SQL coding and tuning experience
- Shell scripting and automation experience
- MySQL Enterprise Manager experience
- AWS/cloud services experience: EC2, RDS, and Aurora
- Cloud deployment experience: Jenkins, Bitbucket, Terraform
- Hands-on, technically adept DBA, capable of performing required DB tasks
- Strong troubleshooting/performance tuning skills
- Ability to assess and improve SQL performance
- Understanding of MySQL's underlying storage engines, such as InnoDB and MyISAM
- Ability to handle common database procedures, such as upgrade, backup, recovery, migration, etc.
- Ability to profile server resource usage, and optimize and tweak as necessary
- Expert-level MySQL RDBMS installation/configuration, patching, troubleshooting, performance tracking/tuning, backup/recovery, and remote monitoring skills, with hands-on experience in large and very dynamic environments
- Strong commitment to following SDLC/change-management principles
- Self-starter, able to perform work with minimal supervisory direction
- Knowledge of data quality, data management, and data testing strategies and practices
- Strong communication skills, both oral and written
- Strong experience working with ticketing tools such as ServiceNow; Zenoss or other monitoring tools; cloud monitoring tools (CloudWatch, CloudTrail); AppDynamics (or a similar APM tool)
- Strong problem-solving and troubleshooting skills
- Keen analytical and structured approach to problem solving
- Ability to accommodate flexible work schedules

Desired skills:
- Ability to diagnose problems and triage/resolve issues across various tiers (application, network, database, server, or storage)
- Ability to implement automation to reduce manual administrative tasks through the use of jobs, shell scripts, PL/SQL, cron, or other techniques
- Experience in DB2 UDB, Oracle, Postgres, Microsoft SQL Server, or MongoDB is a plus
- Experience with configuration management tools: SCCM, Puppet, or Ansible
- Development or enhancement of training and upskilling programs
- Bachelor's degree or similar required

You will:
- Be part of an on-call rotation providing 24x7x365 incident and outage response, including non-business-hours changes
- Perform initial investigation and/or troubleshooting of database and database system issues to resolve the issue and determine root cause
- Process all support requests within SLA by following procedural requirements
- Escalate to vendors, where necessary, to ensure timely resolution
- Thoroughly document steps taken to resolve incidents and fulfil catalog requests within ServiceNow tickets
- Develop, document, and update standard operating procedures and knowledge-base articles
- Identify, address, and resolve repeating alert trends to improve stability and performance
- Suggest defect fixes and product/infrastructure enhancements to improve stability and automation
- Participate in periodic skills enhancement sessions and training courses
- Be a project lead, as well as a project participant, for various activities: cloud migration, lifecycle management, application modernization, capacity planning, financial optimization, performance and stability optimization, build and design state configuration automation, etc.
Expert-level SQL Server RDBMS installation/configuration, patching, troubleshooting, performance tracking/tuning, backup/recovery, and remote monitoring skills, with hands-on experience in large and very dynamic environments. Experience with very complex database environments. Experience in SQL and/or PL/SQL. Experience working in a controlled environment, with change control and validated systems. Experience with Foglight is a plus. Experience with Oracle/MySQL/Postgres/MongoDB/DB2 UDB is a plus. Customer service orientation. Location: This position can be based in any of the following locations: Gurgaon. For internal use only: R000106310
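As a small example of the hands-on monitoring and automation this role asks for, a sketch that checks replica health with mysql-connector-python; the host and credentials are placeholders, and it assumes MySQL 8.0.22+ for the SHOW REPLICA STATUS syntax.

```python
import mysql.connector  # pip install mysql-connector-python

# Connection details are placeholders; in practice, pull them from a vault.
conn = mysql.connector.connect(
    host="replica-1.example.internal", user="monitor", password="***"
)
cur = conn.cursor(dictionary=True)
cur.execute("SHOW REPLICA STATUS")  # "SHOW SLAVE STATUS" before MySQL 8.0.22
row = cur.fetchone()

if row is None:
    print("Not configured as a replica")
elif row["Replica_IO_Running"] == "Yes" and row["Replica_SQL_Running"] == "Yes":
    print(f"Healthy, lag = {row['Seconds_Behind_Source']}s")
else:
    print("Replication broken:", row["Last_Error"])
conn.close()
```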
Posted 1 month ago
7 - 10 years
15 - 25 Lacs
Pune
Remote
8+ years of experience architecting microservices-based applications using Java. AWS services: EKS, Lambda, S3, RDS, DynamoDB, CloudFormation. Proficiency in Java programming with Spring Boot. AWS Certified Solutions Architect certification is mandatory.
Posted 1 month ago
4 - 9 years
18 - 30 Lacs
Hyderabad, India
Hybrid
Department: Software Engineering
Employment Type: Full Time
Location: India
Reporting To: Manoj Puranik

Description
At Vitech, we believe in the power of technology to simplify complex business processes. Our mission is to bring better software solutions to market, addressing the intricacies of the insurance and retirement industries. We combine deep domain expertise with the latest technological advancements to deliver innovative, user-centric solutions that future-proof and empower our clients to thrive in an ever-changing landscape. With over 1,600 talented professionals on our team, our innovative solutions are recognized by industry leaders like Gartner, Celent, Aite-Novarica, and ISG. We offer a competitive compensation package along with comprehensive benefits that support your health, well-being, and financial security.

Location: Hyderabad (hybrid role)
Role: Full-Stack Java Developer

Are you a Java Developer with 4-7+ years of experience eager to elevate your career? At Vitech, we're looking for a talented professional with a solid background in Core Java who's ready to make a significant impact. As a Full-Stack Developer at Vitech, you'll dive deep into backend development while also contributing to frontend work with ReactJS/GWT. Our small, agile pods allow you to spend up to 40% of your time on innovation and writing new software, pushing our products forward.

What you will do:
- Lead and contribute to the full software development lifecycle, from design and coding to testing, deployment, and support.
- Apply advanced Core Java concepts such as inheritance, interfaces, and abstract classes to solve complex business challenges.
- Develop and maintain applications across the full stack, with a strong focus on backend development in Java and frontend work using ReactJS or GWT.
- Collaborate with a cross-functional, high-performing team to deliver scalable, customer-centric solutions.
- Drive innovation by designing and building software that fuels product enhancements and supports business growth.

What we're looking for:
- Advanced Core Java skills, with deep expertise in object-oriented programming concepts like inheritance, interfaces, abstract/concrete classes, and control structures, and the ability to apply these principles to solve complex, business-driven challenges.
- Proficient SQL knowledge, with the ability to write and optimize complex queries in relational databases.
- Hands-on experience with Spring Boot, Spring MVC, and Hibernate for backend development; familiarity with REST APIs and microservices architecture.
- Frontend development experience using ReactJS, Angular, or GWT, with the ability to build responsive, user-friendly interfaces and integrate them in a full-stack environment.
- Experience with AWS services such as EC2, S3, RDS, Lambda, API Gateway, CloudWatch, and IAM is a plus.
- Strong analytical and problem-solving skills; experience in technical leadership or mentoring is preferred.
- Excellent communication and collaboration skills, a commitment to clean, maintainable code, and a passion for continuous learning.

Join Us at Vitech!
Career Development: At Vitech, we're committed to your growth. You'll have ample opportunities to deepen your expertise in both Java and ReactJS, advancing your career in a supportive environment.
Innovative Environment: Work with cutting-edge technologies in an Agile setting where your ideas and creativity are welcomed and encouraged.
Impactful Work: Your contributions will be crucial in shaping our products and delivering exceptional solutions to our global clients. At Vitech, you’re not just maintaining software but creating it. At Vitech, you’ll be part of a forward-thinking team that values collaboration, innovation, and continuous improvement. We provide a supportive and inclusive environment where you can grow as a leader while helping shape the future of our organization.
Posted 1 month ago
5 - 10 years
20 - 27 Lacs
Gurugram
Work from Office
Description:

Requirements: A candidate should have at least 5+ years of experience.
1. Experience working with GCP or any public cloud
2. Experience in developing CI/CD pipelines
3. Experience in IaC such as Terraform
4. Experience in scripting languages such as Bash or Python
5. Experience in monitoring tools
6. Experience working with containers like Docker or Kubernetes

Job Responsibilities:

What We Offer:
Exciting Projects: We focus on industries like high-tech, communication, media, healthcare, retail and telecom. Our customer list is full of fantastic global brands and leaders who love what we build for them.
Collaborative Environment: You can expand your skills by collaborating with a diverse team of highly talented people in an open, laid-back environment, or even abroad in one of our global centers or client facilities!
Work-Life Balance: GlobalLogic prioritizes work-life balance, which is why we offer flexible work schedules, opportunities to work from home, and paid time off and holidays.
Professional Development: Our dedicated Learning & Development team regularly organizes communication skills training (GL Vantage, Toastmasters), stress management programs, professional certifications, and technical and soft-skill trainings.
Excellent Benefits: We provide our employees with competitive salaries, family medical insurance, Group Term Life Insurance, Group Personal Accident Insurance, NPS (National Pension Scheme), periodic health awareness programs, extended maternity leave, annual performance bonuses, and referral bonuses.
Fun Perks: We want you to love where you work, which is why we host sports events and cultural activities, offer food at subsidized rates, and throw corporate parties. Our vibrant offices also include dedicated GL Zones, rooftop decks, and the GL Club, where you can drink coffee or tea with your colleagues over a game of table tennis, and we offer discounts for popular stores and restaurants!
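One flavor of the Python scripting plus CI/CD work this role mentions: a post-deploy health gate that a pipeline stage might run after a rollout. A minimal sketch using only the standard library; the endpoint URL and retry budget are hypothetical.

```python
import sys
import time
import urllib.request

URL = "https://service.example.com/healthz"  # hypothetical endpoint

# Poll the health endpoint; fail the pipeline stage if it never comes up.
for attempt in range(10):
    try:
        with urllib.request.urlopen(URL, timeout=5) as resp:
            if resp.status == 200:
                print("service healthy")
                sys.exit(0)
    except OSError as exc:  # URLError/HTTPError subclass OSError
        print(f"attempt {attempt + 1}: {exc}")
    time.sleep(6)

print("service failed to become healthy")
sys.exit(1)
```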
Posted 1 month ago
0 - 1 years
3 - 7 Lacs
Bengaluru
Work from Office
Required Experience: 0 - 1 years

Skills: DataOps
- Strong proficiency in MySQL 5.x database management
- Decent experience with recent versions of MySQL
- Understanding of MySQL's underlying storage engines, such as InnoDB and MyISAM
- Tuning of MySQL parameters
- Administration of MySQL and monitoring of performance
- Experience with master-master replication configuration in MySQL and troubleshooting replication
- Proficiency in writing complex queries, stored procedures, triggers, and the event scheduler
- Strong Unix/Linux shell scripting skills
- Familiarity with other SQL/NoSQL databases such as MongoDB is desirable
- Install, deploy, and manage MongoDB on physical, virtual, and AWS EC2 instances
- Experience with MongoDB active-active sharded cluster setup with high availability
- Experience administering MongoDB on the Linux platform
- Experience with MongoDB version upgrades, preferably from version 4.0 to 4.4, in a production environment with zero or very minimal application downtime, either with Ops Manager or a custom script
- Good understanding of, and experience with, MongoDB sharding and disaster recovery plans
- Knowledge of cloud technologies is an added advantage
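For the MongoDB administration side, a tiny sketch using pymongo to inspect replica-set member state; the connection string is a placeholder, and listShards would only succeed against a mongos router in a sharded cluster.

```python
from pymongo import MongoClient  # pip install pymongo

# Hypothetical connection string for a replica-set member.
client = MongoClient("mongodb://dba:***@mongo-0.example.internal:27017/")

# Requires a user with the clusterMonitor role (or similar).
status = client.admin.command("replSetGetStatus")
for member in status["members"]:
    print(member["name"], member["stateStr"])  # e.g. PRIMARY / SECONDARY

# On a mongos router, a shard overview would come from:
# print(client.admin.command("listShards"))
```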
Posted 1 month ago
2 - 5 years
4 - 8 Lacs
Pune
Work from Office
About The Role
The candidate must possess knowledge relevant to the functional area, act as a subject matter expert in providing advice in the area of expertise, and focus on continuous improvement for maximum efficiency. It is vital to focus on a high standard of delivery excellence, provide top-notch service quality, and develop successful long-term business partnerships with internal/external customers by identifying and fulfilling customer needs. He/she should be able to break down complex problems into logical and manageable parts in a systematic way, generate and compare multiple options, and set priorities to resolve problems. The ideal candidate must be proactive and go beyond expectations to achieve job results and create new opportunities. He/she must positively influence the team, motivate high performance, promote a friendly climate, give constructive feedback, provide development opportunities, and manage career aspirations of direct reports. Communication skills are key here: to explain organizational objectives, assignments, and the big picture to the team, and to articulate team vision and clear objectives.

Process Manager roles and responsibilities:
- Designing and implementing scalable, reliable, and maintainable data architectures on AWS.
- Developing data pipelines to extract, transform, and load (ETL) data from various sources into AWS environments.
- Creating and optimizing data models and schemas for performance and scalability using AWS services like Redshift, Glue, Athena, etc. (an Athena sketch follows below).
- Integrating AWS data solutions with existing systems and third-party services.
- Monitoring and optimizing the performance of AWS data solutions, ensuring efficient query execution and data retrieval.
- Implementing data security and encryption best practices in AWS environments.
- Documenting data engineering processes, maintaining data pipeline infrastructure, and providing support as needed.
- Working closely with cross-functional teams, including data scientists, analysts, and stakeholders, to understand data requirements and deliver solutions.

Technical and functional skills:
- Typically, a bachelor's degree in Computer Science, Engineering, or a related field is required, along with 5+ years of experience in data engineering and AWS cloud environments.
- Strong experience with AWS data services such as S3, EC2, Redshift, Glue, Athena, EMR, etc.
- Proficiency in programming languages commonly used in data engineering such as Python, SQL, Scala, or Java.
- Experience in designing, implementing, and optimizing data warehouse solutions on Snowflake/Amazon Redshift.
- Familiarity with ETL tools and frameworks (e.g., Apache Airflow, AWS Glue) for building and managing data pipelines.
- Knowledge of database management systems (e.g., PostgreSQL, MySQL, Amazon Redshift) and data lake concepts.
- Understanding of big data technologies such as Hadoop, Spark, Kafka, etc., and their integration with AWS.
- Proficiency in version control tools like Git for managing code and infrastructure as code (e.g., CloudFormation, Terraform).
- Ability to analyze complex technical problems and propose effective solutions.
- Strong verbal and written communication skills for documenting processes and collaborating with team members and stakeholders.
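To make the Athena piece concrete, a minimal boto3 sketch that kicks off a query over a cataloged table; the region, database, table, and results bucket are illustrative.

```python
import boto3

athena = boto3.client("athena", region_name="ap-south-1")

# Bucket, database, and table names here are hypothetical.
run = athena.start_query_execution(
    QueryString="SELECT order_date, SUM(amount) FROM orders GROUP BY order_date",
    QueryExecutionContext={"Database": "curated"},
    ResultConfiguration={"OutputLocation": "s3://example-athena-results/"},
)
print("query id:", run["QueryExecutionId"])

# Athena runs asynchronously; a real pipeline would poll
# get_query_execution(QueryExecutionId=...) until the state is SUCCEEDED.
```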
Posted 1 month ago
1 - 4 years
2 - 6 Lacs
Pune
Work from Office
About The Role
The candidate must possess knowledge relevant to the functional area, act as a subject matter expert in providing advice in the area of expertise, and focus on continuous improvement for maximum efficiency. It is vital to focus on a high standard of delivery excellence, provide top-notch service quality, and develop successful long-term business partnerships with internal/external customers by identifying and fulfilling customer needs. He/she should be able to break down complex problems into logical and manageable parts in a systematic way, generate and compare multiple options, and set priorities to resolve problems. The ideal candidate must be proactive and go beyond expectations to achieve job results and create new opportunities. He/she must positively influence the team, motivate high performance, promote a friendly climate, give constructive feedback, provide development opportunities, and manage career aspirations of direct reports. Communication skills are key here: to explain organizational objectives, assignments, and the big picture to the team, and to articulate team vision and clear objectives.

Process Manager roles and responsibilities:
- Designing and implementing scalable, reliable, and maintainable data architectures on AWS.
- Developing data pipelines to extract, transform, and load (ETL) data from various sources into AWS environments.
- Creating and optimizing data models and schemas for performance and scalability using AWS services like Redshift, Glue, Athena, etc.
- Integrating AWS data solutions with existing systems and third-party services.
- Monitoring and optimizing the performance of AWS data solutions, ensuring efficient query execution and data retrieval.
- Implementing data security and encryption best practices in AWS environments.
- Documenting data engineering processes, maintaining data pipeline infrastructure, and providing support as needed.
- Working closely with cross-functional teams, including data scientists, analysts, and stakeholders, to understand data requirements and deliver solutions.

Technical and functional skills:
- Typically, a bachelor's degree in Computer Science, Engineering, or a related field is required, along with 5+ years of experience in data engineering and AWS cloud environments.
- Strong experience with AWS data services such as S3, EC2, Redshift, Glue, Athena, EMR, etc.
- Proficiency in programming languages commonly used in data engineering such as Python, SQL, Scala, or Java.
- Experience in designing, implementing, and optimizing data warehouse solutions on Snowflake/Amazon Redshift.
- Familiarity with ETL tools and frameworks (e.g., Apache Airflow, AWS Glue) for building and managing data pipelines.
- Knowledge of database management systems (e.g., PostgreSQL, MySQL, Amazon Redshift) and data lake concepts.
- Understanding of big data technologies such as Hadoop, Spark, Kafka, etc., and their integration with AWS.
- Proficiency in version control tools like Git for managing code and infrastructure as code (e.g., CloudFormation, Terraform).
- Ability to analyze complex technical problems and propose effective solutions.
- Strong verbal and written communication skills for documenting processes and collaborating with team members and stakeholders.
Posted 1 month ago
2 - 5 years
4 - 8 Lacs
Pune
Work from Office
About The Role
Process Manager - AWS Data Engineer
Mumbai/Pune | Full-time (FT) | Technology Services
Shift timings: EMEA (1pm-9pm) | Management level: PM | Travel requirements: NA

The ideal candidate must possess in-depth functional knowledge of the process area and apply it to operational scenarios to provide effective solutions. The role requires identifying discrepancies and proposing optimal solutions using a logical, systematic, and sequential methodology. It is vital to be open-minded towards inputs and views from team members and to effectively lead, control, and motivate groups towards company objectives. Additionally, the candidate must be self-directed, proactive, and seize every opportunity to meet internal and external customer needs and achieve customer satisfaction by effectively auditing processes, implementing best practices and process improvements, and utilizing the frameworks and tools available. Goals and thoughts must be clearly and concisely articulated and conveyed, verbally and in writing, to clients, colleagues, subordinates, and supervisors.

Process Manager roles and responsibilities:
- Understand clients' requirements and provide effective and efficient solutions in AWS using Snowflake.
- Assemble large, complex sets of data that meet non-functional and functional business requirements.
- Use Snowflake/Redshift architecture and design to create data pipelines and consolidate data in the data lake and data warehouse.
- Demonstrated strength and experience in data modeling, ETL development, and data warehousing concepts.
- Understand data pipelines and modern ways of automating data pipelines using cloud-based tooling.
- Test and clearly document implementations, so others can easily understand the requirements, implementation, and test conditions.
- Perform data quality testing and assurance as part of designing, building, and implementing scalable data solutions in SQL.

Technical and functional skills:
- AWS services: strong experience with AWS data services such as S3, EC2, Redshift, Glue, Athena, EMR, etc.
- Programming languages: proficiency in languages commonly used in data engineering such as Python, SQL, Scala, or Java.
- Data warehousing: experience in designing, implementing, and optimizing data warehouse solutions on Snowflake/Amazon Redshift.
- ETL tools: familiarity with ETL tools and frameworks (e.g., Apache Airflow, AWS Glue) for building and managing data pipelines.
- Database management: knowledge of database management systems (e.g., PostgreSQL, MySQL, Amazon Redshift) and data lake concepts.
- Big data technologies: understanding of big data technologies such as Hadoop, Spark, Kafka, etc., and their integration with AWS.
- Version control: proficiency in version control tools like Git for managing code and infrastructure as code (e.g., CloudFormation, Terraform).
- Problem-solving skills: ability to analyze complex technical problems and propose effective solutions.
- Communication skills: strong verbal and written communication skills for documenting processes and collaborating with team members and stakeholders.
- Education and experience: typically, a bachelor's degree in Computer Science, Engineering, or a related field is required, along with 5+ years of experience in data engineering and AWS cloud environments.

About eClerx
eClerx is a global leader in productized services, bringing together people, technology, and domain expertise to amplify business results. Our mission is to set the benchmark for client service and success in our industry. Our vision is to be the innovation partner of choice for technology, data analytics, and process management services. Since our inception in 2000, we've partnered with top companies across various industries, including financial services, telecommunications, retail, and high-tech. Our innovative solutions and domain expertise help businesses optimize operations, improve efficiency, and drive growth. With over 18,000 employees worldwide, eClerx is dedicated to delivering excellence through smart automation and data-driven insights. At eClerx, we believe in nurturing talent and providing hands-on experience.

About eClerx Technology
eClerx's Technology Group collaboratively delivers Analytics, RPA, AI, and Machine Learning digital technologies that enable our consultants to help businesses thrive in a connected world. Our consultants and specialists partner with our global clients and colleagues to build and implement digital solutions through a broad spectrum of activities. To know more about us, visit https://eclerx.com

eClerx is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, disability or protected veteran status, or any other legally protected basis, per applicable law.
Posted 1 month ago
4 - 9 years
4 - 6 Lacs
Sholinganallur
Remote
You will be responsible for ensuring the technical reliability, integration, and optimization of our systems in line with client needs and strategic goals. Skills: C# & .NET, SQL & PL/SQL, API integrations, HTML integration and development, AWS (EC2 & S3).
Posted 1 month ago
3 - 6 years
20 - 25 Lacs
Hyderabad
Work from Office
Overview
Job Title: Senior DevOps Engineer
Location: Bangalore / Hyderabad / Chennai / Coimbatore
Position: Full-time
Department: Annalect Engineering

Position Overview
Annalect is currently seeking a Senior DevOps Engineer to join our technology team remotely. We are passionate about building distributed back-end systems in a modular and reusable way. We're looking for people who share our passion for data and the desire to build cool, maintainable, high-quality applications to use this data. In this role you will participate in shaping our technical architecture and the design and development of software products, collaborate with back-end developers from other tracks, and research and evaluate new technical solutions.

Responsibilities
- Build and maintain cloud infrastructure through Terraform IaC.
- Cloud networking and orchestration with AWS (EKS, ECS, VPC, S3, ALB, NLB).
- Improve and automate processes and procedures.
- Construct CI/CD pipelines.
- Monitor and handle incident response for the infrastructure, platforms, and core engineering services.
- Troubleshoot infrastructure, network, and application issues; help identify and troubleshoot problems within the environment.

Required Skills
- 5+ years of DevOps experience.
- 5+ years of hands-on experience administering cloud technologies on AWS, especially IAM, VPC, Lambda, EKS, EC2, S3, ECS, CloudFront, ALB, API Gateway, RDS, CodeBuild, SSM, Secrets Manager, etc.
- Demonstrable experience using Terraform to provision and configure infrastructure, and knowledge of writing Infrastructure as Code (IaC) with it.
- Scripting ability: PowerShell, Python, Bash, etc.
- Comfortable working with Linux/Unix-based operating systems (Ubuntu preferred).
- Familiarity with software development, CI/CD, and DevOps tools (Bitbucket, Jenkins, GitLab, CodeBuild, CodePipeline).
- Experience with microservices, containers (Docker), container orchestration (Kubernetes), serverless computing (AWS Lambda), and distributed/scalable systems.
- A problem-solving attitude; creative, self-motivated, a quick study, and willing to develop new skills.

Additional Skills
- Familiarity with working with data and databases (SQL, MySQL, PostgreSQL, Amazon Aurora, Redis, Amazon Redshift, Google BigQuery); knowledge of database administration.
- Experience with continuous deployment/continuous delivery (Jenkins, Bamboo).
- AWS/GCP/Azure certification is a plus.
- Experience in Python coding is welcome; passion for data-driven software, since all of our tools are built on top of data and require work with data.
- Knowledge of IaaS/PaaS architecture with a good understanding of infrastructure and web application security.
- Experience with logging/monitoring (CloudWatch, Datadog, Loggly, ELK).
- Passion for writing good documentation and creating architecture diagrams.
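A small example of the kind of automation scripting this role involves, sketched with boto3: reporting (not deleting) self-owned EBS snapshots older than 30 days. The region and retention window are assumptions.

```python
from datetime import datetime, timedelta, timezone

import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")
cutoff = datetime.now(timezone.utc) - timedelta(days=30)

# Paginate through self-owned snapshots and flag the stale ones.
paginator = ec2.get_paginator("describe_snapshots")
for page in paginator.paginate(OwnerIds=["self"]):
    for snap in page["Snapshots"]:
        if snap["StartTime"] < cutoff:  # StartTime is timezone-aware
            print(snap["SnapshotId"], snap["StartTime"].date(),
                  snap.get("Description", ""))
```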
Posted 1 month ago
5 - 10 years
15 - 30 Lacs
Chennai, Bengaluru, Hyderabad
Hybrid
AWS Cloud Engineer - CEBE

Why HCLTech? HCLTech is a next-generation global technology company that helps enterprises reimagine their businesses for the digital age. Our belief in the values of trust, transparency, flexibility and value-centricity, fueled by our philosophy of 'Employees First', ensures the continued pursuit of our customers' best interests.

What is HCLSoftware? HCLSoftware is the software business division of HCLTech, fueling the Digital+ Economy by developing, sharing and supporting solutions in five key areas: Business & Industry Applications, Intelligent Operations, Total Experience, Data Analytics and Cybersecurity. We develop, market, sell, and support over 20 product families. We have offices and labs around the world to serve thousands of customers. Our mission is to drive customer success with our relentless product innovation at more than 20,000 organizations in every region of the world, including more than half of the Fortune 1000 and Global 2000 companies.

Which team will you be working in? You will be working in the Cloud Engineering and Business Experience (CeBe) team within HCLSoftware. The HCLSoftware CeBe team drives the cloud-native strategy for HCL Software. We innovate with new technologies and apply them to the HCLSoftware portfolio. The team is distributed across several locations, in India, Europe and the USA.

Senior Software Engineer III
We are looking for an AWS Cloud Engineer who designs, implements, and manages cloud infrastructure on Amazon Web Services (AWS), ensuring high availability, scalability, and performance.
- Should be familiar with a wide range of AWS services, including compute (EC2), storage (S3, EBS), databases (RDS, DynamoDB), networking (VPC), and security (IAM, WAF).
- Should have strong hands-on experience with and understanding of Node.js, AWS Lambda, DynamoDB, and S3 storage; will also need to work on other technologies on an as-needed basis.
- Should have experience with infrastructure-as-code tools like CloudFormation or Terraform.
- Should mentor and guide large development teams from the technology perspective, suggesting multiple solutions to developer issues with a problem-solving mindset.
- Should be responsible for translating business requirements into technical solutions, focusing on serverless architectures and IaC practices.
- Should have cross-functional group coordination experience (e.g., QA, AppOps, Release Engineering).
- Should have strong oral and written communication skills, a good attitude, and eagerness to learn.
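The team works primarily in Node.js, but purely to illustrate the Lambda-plus-DynamoDB pattern the posting highlights, here is an equivalent sketch in Python (kept in the same language as the other examples on this page); the table name and event shape are assumptions.

```python
import json
import os

import boto3

# Hypothetical table; real code would inject TABLE_NAME via the Lambda config.
table = boto3.resource("dynamodb").Table(os.environ.get("TABLE_NAME", "orders"))

def handler(event, context):
    """Persist an API Gateway request body and echo back the item's id."""
    item = json.loads(event["body"])  # note: float values need Decimal for DynamoDB
    table.put_item(Item=item)
    return {"statusCode": 201, "body": json.dumps({"id": item["id"]})}
```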
Posted 1 month ago
4 - 8 years
15 - 25 Lacs
Chennai, Bengaluru, Hyderabad
Hybrid
We are looking for a hands-on AWS Data Engineer (permanent role).
Experience: 4 to 8 years
Location: Hyderabad / Chennai / Noida / Pune / Bangalore
Notice period: immediate

Skills:
- Expertise in data warehousing and ETL design and implementation
- Hands-on experience with a programming language like Python
- Good understanding of Spark architecture along with its internals
- Hands-on experience using AWS services like Glue (PySpark), Lambda, S3, and Athena
- Experience with Snowflake is good to have
- Hands-on experience implementing different loading strategies like SCD1 and SCD2, table/partition refresh, insert update, and swap partitions (an SCD2 sketch follows below)
- Experience in parallel loading and dependency orchestration
- Awareness of scheduling and orchestration tools
- Experience with RDBMS systems and concepts
- Expertise in writing complex SQL queries and developing database components, including creating views, stored procedures, triggers, etc.
- Create test cases and perform unit testing of ETL jobs
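Since the posting names SCD2 loading explicitly, a PySpark sketch of the core idea: expire the superseded dimension row and open a new current one. Table names and the tracked attribute are hypothetical, and the final merge/write step is omitted.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("scd2_sketch").getOrCreate()

# Hypothetical tables: the current dimension and today's staged extract.
dim = spark.table("dw.customer_dim").alias("c")
stg = spark.table("staging.customer_extract").alias("i")

# Current dimension rows whose tracked attribute changed in the extract.
changed = (
    dim.join(stg, F.col("c.customer_id") == F.col("i.customer_id"))
       .where(F.col("c.is_current") & (F.col("c.address") != F.col("i.address")))
)

# Expire the superseded version of each changed row.
expired = (changed.select("c.*")
           .withColumn("is_current", F.lit(False))
           .withColumn("end_date", F.current_date()))

# Open the incoming attributes as the new, open-ended current version.
fresh = (changed.select("i.*")
         .withColumn("is_current", F.lit(True))
         .withColumn("start_date", F.current_date())
         .withColumn("end_date", F.lit(None).cast("date")))
```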
Posted 1 month ago
5 - 10 years
14 - 18 Lacs
Bengaluru
Work from Office
Job Title: Storage and Backup Architect

Responsibilities
A day in the life of an Infoscion: as part of the Infosys delivery team, your primary role would be to interface with the client for quality assurance, issue resolution and ensuring high customer satisfaction. You will understand requirements, create and review designs, validate the architecture and ensure high levels of service offerings to clients in the technology domain. You will participate in project estimation, provide inputs for solution delivery, conduct technical risk planning, perform code reviews and unit test plan reviews. You will lead and guide your teams towards developing optimized, high-quality code deliverables, continual knowledge management, and adherence to the organizational guidelines and processes.
- Design, deployment, and implementation of FSx for NetApp ONTAP (FSxN) storage (Multi-AZ) in both SAN and NAS configurations.
- Automate the end-to-end migration process, using data migration tools like Cloud Sync or DataSync (a sketch follows below).
- Well versed with scripting or tools like Python or Terraform (preferred).
- Drive the storage strategy for optimization and modernization in terms of cost and efficiency.
- Good understanding of AWS storage services like EFS, S3, EBS, and FSx; should be able to modernize these services and applications; can suggest how to optimize cost, since these storage services consume a large share of spend, including whether archiving is an option; can help in the integration of storage services in AWS.

Technical and Professional Requirements: Amazon EBS, Amazon EFS and FSx, AWS Application Migration Service (MGN), AWS CloudWatch, AWS Cloud Migration Factory, AWS Step Functions, Amazon EBS Multi-Attach

Preferred Skills: Storage Technology->Backup Administration->Backup Technologies; Technology->Cloud Platform->AWS App Development->CloudWatch; Technology->Infrastructure-Storage-Administration->Cisco-Storage Admin->Storage

Generic Skills: Technology->Cloud Platform->AWS Core Services->Amazon Elastic Compute Cloud (EC2)

Additional Responsibilities: Storage Architect, with a good understanding of AWS storage services like EFS, S3, EBS, and FSx.

Educational Requirements: Master of Engineering, Master of Technology, Bachelor of Computer Applications, Bachelor of Science, Bachelor of Engineering, Bachelor of Technology

Service Line: Cloud & Infrastructure Services

* Location of posting is subject to business requirements
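For the DataSync-driven migration automation mentioned above, a boto3 sketch that starts a pre-created task and checks its state once; the task ARN and region are placeholders, and the task itself (e.g., source NFS share to an FSx for NetApp ONTAP location) would be configured ahead of time.

```python
import boto3

datasync = boto3.client("datasync", region_name="ap-south-1")

# Placeholder ARN; a real script would look the task up or take it as input.
task_arn = "arn:aws:datasync:ap-south-1:123456789012:task/task-EXAMPLE"

run = datasync.start_task_execution(TaskArn=task_arn)
print("execution:", run["TaskExecutionArn"])

# A real automation would poll this until Status reaches SUCCESS or ERROR.
status = datasync.describe_task_execution(
    TaskExecutionArn=run["TaskExecutionArn"]
)
print("state:", status["Status"])
```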
Posted 1 month ago
6 - 11 years
15 - 30 Lacs
Bengaluru, Hyderabad, Gurgaon
Work from Office
We're Hiring: Sr. AWS Data Engineer at GSPANN Technologies
Locations: Bangalore, Pune, Hyderabad, Gurugram
Experience: 6+ years | Immediate joiners only
Looking for experts in:
- AWS services: Glue, Redshift, S3, Lambda, Athena
- Big data: Spark, Hadoop, Kafka
- Languages: Python, SQL, Scala
- ETL & data engineering
Apply now: heena.ruchwani@gspann.com
#AWSDataEngineer #HiringNow #DataEngineering #GSPANN
Posted 1 month ago