3.0 - 5.0 years
5 - 8 Lacs
Chennai
Work from Office
"PLEASE READ THE JOB DESCRIPTION AND APPLY"

Data Engineer Job Description

Position Overview
"Yesterday is history, tomorrow is a mystery, but today is a gift. That's why we call it the present." - Master Oogway

Join CustomerLabs' dynamic data team as a Data Engineer and play a pivotal role in transforming raw marketing data into actionable insights that power our digital marketing platform. As a key member of our data infrastructure team, you will design, develop, and maintain robust data pipelines, data warehouses, and analytics platforms that serve as the backbone of our digital marketing product development.

"Sometimes the hardest choices require the strongest wills." - Thanos (but we promise, our data decisions are much easier!)

In this role, you will collaborate with cross-functional teams including Data Scientists, Product Managers, and Marketing Technology specialists to ensure seamless data flow from various marketing channels, ad platforms, and customer touchpoints to our analytics dashboards and reporting systems. You'll be responsible for building scalable, reliable, and efficient data solutions that can handle high-volume marketing data processing and real-time campaign analytics.

What You'll Do:
- Design and implement enterprise-grade data pipelines for marketing data ingestion and processing
- Build and optimize data warehouses and data lakes to support digital marketing analytics
- Ensure data quality, security, and compliance across all marketing data systems
- Create data models and schemas that support marketing attribution, customer journey analysis, and campaign performance tracking
- Develop monitoring and alerting systems to maintain data pipeline reliability for critical marketing operations
- Collaborate with product teams to understand digital marketing requirements and translate them into technical solutions

Why This Role Matters:
"I can do this all day." - Captain America (and you'll want to, because this role is that rewarding!)
You'll be the backbone behind the data infrastructure that powers CustomerLabs' digital marketing platform, making marketers' lives easier and better. Your work directly translates to smarter automation, clearer insights, and more successful campaigns - helping marketers focus on what they do best while we handle the complex data heavy lifting.

"Sometimes you gotta run before you can walk." - Iron Man (and sometimes you gotta build the data pipeline before you can analyze the data!)

Our Philosophy:
We believe in the power of data to transform lives, just like the Dragon Warrior transformed the Valley of Peace. Every line of code you write, every pipeline you build, and every insight you enable has the potential to change how marketers work and succeed. We're not just building data systems - we're building the future of digital marketing, one insight at a time.

"Your story may not have such a happy beginning, but that doesn't make you who you are. It is the rest of your story, who you choose to be." - Soothsayer

What Makes You Special:
We're looking for someone who embodies the spirit of both Captain America's unwavering dedication and Iron Man's innovative genius. You'll need the patience to build robust systems (like Cap's shield) and the creativity to solve complex problems (like Tony's suit). Most importantly, you'll have the heart to make a real difference in marketers' lives.

"Inner peace... Inner peace... Inner peace..." - Po (because we know data engineering can be challenging, but we've got your back!)

Key Responsibilities

Data Pipeline Development
- Design, build, and maintain robust, scalable data pipelines and ETL/ELT processes
- Develop data ingestion frameworks to collect data from various sources (databases, APIs, files, streaming sources)
- Implement data transformation and cleaning processes to ensure data quality and consistency
- Optimize data pipeline performance and reliability

Data Infrastructure Management
- Design and implement data warehouse architectures
- Manage and optimize database systems (SQL and NoSQL)
- Implement data lake solutions and data governance frameworks
- Ensure data security, privacy, and compliance with regulatory requirements

Data Modeling and Architecture
- Design and implement data models for analytics and reporting
- Create and maintain data dictionaries and documentation
- Develop data schemas and database structures
- Implement data versioning and lineage tracking

Data Quality, Security, and Compliance
- Ensure data quality, integrity, and consistency across all marketing data systems
- Implement and monitor data security measures to protect sensitive information
- Ensure privacy and compliance with regulatory requirements (e.g., GDPR, CCPA)
- Develop and enforce data governance policies and best practices

Collaboration and Support
- Work closely with Data Scientists, Analysts, and business stakeholders
- Provide technical support for data-related issues and queries

Monitoring and Maintenance
- Implement monitoring and alerting systems for data pipelines
- Perform regular maintenance and optimization of data systems
- Troubleshoot and resolve data pipeline issues
- Conduct performance tuning and capacity planning

Required Qualifications

Experience
- 2+ years of experience in data engineering or related roles
- Proven experience with ETL/ELT pipeline development
- Experience with a cloud data platform (GCP)
- Experience with big data technologies

Technical Skills
- Programming Languages: Python, SQL, Golang (preferred)
- Databases: PostgreSQL, MySQL, Redis
- Big Data Tools: Apache Spark, Apache Kafka, Apache Airflow, dbt, Dataform
- Cloud Platforms: GCP (BigQuery, Dataflow, Cloud Run, Cloud SQL, Cloud Storage, Pub/Sub, App Engine, Compute Engine, etc.)
- Data Warehousing: Google BigQuery
- Data Visualization: Superset, Looker, Metabase, Tableau
- Version Control: Git, GitHub
- Containerization: Docker

Soft Skills
- Strong problem-solving and analytical thinking
- Excellent communication and collaboration skills
- Ability to work independently and in team environments
- Strong attention to detail and data quality
- Continuous learning mindset

Preferred Qualifications

Additional Experience
- Experience with real-time data processing and streaming
- Knowledge of machine learning pipelines and MLOps
- Experience with data governance and data catalog tools
- Familiarity with business intelligence tools (Tableau, Power BI, Looker, etc.)
- Experience using AI-powered tools (such as Cursor, Claude, Copilot, ChatGPT, Gemini, etc.) to accelerate coding, automate tasks, or assist in system design (we believe in running with the machine, not against it)

Interview Process
1. Initial Screening: Phone/video call with HR
2. Technical Interview: Deep dive into data engineering concepts
3. Final Interview: Discussion with senior leadership

Note: This job description is intended to provide a general overview of the position and may be modified based on organizational needs and candidate qualifications.

Our Team Culture
"We are Groot." - We work together, we grow together, we succeed together.

We believe in:
- Innovation First - Like Iron Man, we're always pushing the boundaries of what's possible
- Team Over Individual - Like the Avengers, we're stronger together than apart
- Continuous Learning - Like Po learning Kung Fu, we're always evolving and improving
- Making a Difference - Like Captain America, we fight for what's right (in this case, better marketing!)
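The transformation and data-quality responsibilities described above can be sketched in miniature. A minimal, self-contained Python example of an ETL "transform" step with a data-quality gate (the event fields and cleaning rules here are illustrative assumptions, not CustomerLabs' actual schema):

```python
from datetime import datetime, timezone

REQUIRED = {"event_id", "channel", "timestamp", "spend"}

def transform(raw_events):
    """Clean and normalize raw marketing events: drop malformed rows,
    deduplicate on event_id, and coerce types for warehouse loading."""
    seen, clean = set(), []
    for e in raw_events:
        # Data-quality gate: reject incomplete or duplicate rows.
        if not REQUIRED.issubset(e) or e["event_id"] in seen:
            continue
        seen.add(e["event_id"])
        clean.append({
            "event_id": e["event_id"],
            "channel": e["channel"].strip().lower(),
            "ts": datetime.fromtimestamp(e["timestamp"], tz=timezone.utc).isoformat(),
            "spend": round(float(e["spend"]), 2),
        })
    return clean

rows = transform([
    {"event_id": "a1", "channel": " Google Ads ", "timestamp": 1700000000, "spend": "12.5"},
    {"event_id": "a1", "channel": "Google Ads", "timestamp": 1700000000, "spend": "12.5"},  # duplicate
    {"event_id": "b2", "channel": "Meta"},  # missing fields
])
print(len(rows), rows[0]["channel"])  # prints: 1 google ads
```

In a production pipeline the same shape of logic would typically live inside an Airflow task or a Dataflow/Beam transform rather than a plain function.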
Growth Journey
"There is no charge for awesomeness... or attractiveness." - Po

Your journey with us will be like Po's transformation from noodle maker to Dragon Warrior:
- Level 1: Master the basics of our data infrastructure
- Level 2: Build and optimize data pipelines
- Level 3: Lead complex data projects and mentor others
- Level 4: Become a data engineering legend (with your own theme music!)

What We Promise
"I am Iron Man." - We promise you'll feel like a superhero every day!
- Work that matters - Every pipeline you build helps real marketers succeed
- Growth opportunities - Learn new technologies and advance your career
- Supportive team - We've got your back, just like the Avengers
- Work-life balance - Because even superheroes need rest!
Posted 1 week ago
1.0 - 4.0 years
9 - 13 Lacs
Pune
Work from Office
Overview
The Data Technology group at MSCI is responsible for building and maintaining a state-of-the-art data management platform that delivers Reference, Market, and other critical data points to various products of the firm. The platform, hosted in the firm's data centers and on the Azure and GCP public clouds, processes 100 TB+ of data and is expected to run 24x7. With an increased focus on automation around systems development and operations, data-science-based quality control, and cloud migration, several tech stack modernization initiatives are currently in progress. To accomplish these initiatives, we are seeking a highly motivated and innovative individual to join the Data Engineering team to support our next generation of developer tools and infrastructure. The team is the hub around which the Engineering and Operations teams revolve for automation, and it is committed to providing self-serve tools to our internal customers.

Responsibilities
- Implement & Maintain Data Catalogs: Deploy and manage the data catalog tool Collibra to improve data discoverability and governance.
- Metadata & Lineage Management: Automate metadata collection, establish data lineage, and maintain consistent data definitions across systems.
- Enable Data Governance: Collaborate with governance teams to apply data policies, classifications, and ownership structures in the catalog.
- Support Self-Service & Adoption: Promote catalog usage across teams through training, documentation, and continuous support.
- Cross-Team Collaboration: Work closely with data engineers, analysts, and stewards to align catalog content with business needs.
- Tooling & Automation: Build scripts and workflows for metadata ingestion, tagging, and monitoring of catalog health. Leverage AI tools to automate cataloging activities.
- Reporting & Documentation: Maintain documentation and generate usage metrics, ensuring transparency and operational efficiency.
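The metadata-ingestion-and-tagging workflow described above can be sketched generically. A toy Python illustration of registering a table and tagging sensitive columns (the entry shape, field names, and tag values are our assumptions for illustration, not Collibra's actual import API):

```python
def register_table(catalog, table, columns, owner, classification="internal"):
    """Create a minimal catalog entry - a stand-in for the payload a
    catalog tool's import API would receive (fields are illustrative)."""
    entry = {
        "name": table,
        "owner": owner,
        "classification": classification,
        "columns": [{"name": c, "tags": []} for c in columns],
    }
    catalog[table] = entry
    return entry

catalog = {}
register_table(catalog, "sales.orders", ["order_id", "customer_email"], owner="data-eng")

# Tag PII columns during ingestion so governance policies can key off them.
for col in catalog["sales.orders"]["columns"]:
    if "email" in col["name"]:
        col["tags"].append("pii")

print(catalog["sales.orders"]["columns"][1]["tags"])  # prints: ['pii']
```

A real integration would push such entries through the catalog platform's REST API and schedule the script from the automation tooling mentioned above.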
Qualifications
- Self-motivated, collaborative individual with a passion for excellence
- Degree in Computer Science or equivalent, with 5+ years of total experience and at least 2 years of experience working with Azure DevOps tools and technologies
- Good working knowledge of source control applications like Git, with prior experience building deployment workflows using this tool
- Good working knowledge of Snowflake, YAML, and Python
- Tools: Experience with data catalog platforms (e.g., Collibra, Alation, DataHub)
- Metadata & Lineage: Understanding of metadata management and data lineage
- Scripting: Proficient in SQL and Python for automation and integration
- APIs & Integration: Ability to connect catalog tools with data sources using APIs
- Cloud Knowledge: Familiar with cloud data services (Azure, GCP)
- Data Governance: Basic knowledge of data stewardship, classification, and compliance
- Collaboration: Strong communication skills to work across data and business teams

What we offer you
- Transparent compensation schemes and comprehensive employee benefits, tailored to your location, ensuring your financial security, health, and overall wellbeing.
- Flexible working arrangements, advanced technology, and collaborative workspaces.
- A culture of high performance and innovation where we experiment with new ideas and take responsibility for achieving results.
- A global network of talented colleagues who inspire, support, and share their expertise to innovate and deliver for our clients.
- A Global Orientation program to kickstart your journey, followed by access to our Learning@MSCI platform, LinkedIn Learning Pro, and tailored learning opportunities for ongoing skills development.
- Multi-directional career paths that offer professional growth and development through new challenges, internal mobility, and expanded roles.
- An environment that builds a sense of inclusion, belonging, and connection, including eight Employee Resource Groups: All Abilities, Asian Support Network, Black Leadership Network, Climate Action Network, Hola! MSCI, Pride & Allies, Women in Tech, and Women's Leadership Forum.

At MSCI we are passionate about what we do, and we are inspired by our purpose - to power better investment decisions. You'll be part of an industry-leading network of creative, curious, and entrepreneurial pioneers. This is a space where you can challenge yourself, set new standards, and perform beyond expectations for yourself, our clients, and our industry.

MSCI is a leading provider of critical decision support tools and services for the global investment community. With over 50 years of expertise in research, data, and technology, we power better investment decisions by enabling clients to understand and analyze key drivers of risk and return and confidently build more effective portfolios. We create industry-leading research-enhanced solutions that clients use to gain insight into and improve transparency across the investment process.

MSCI Inc. is an equal opportunity employer. It is the policy of the firm to ensure equal employment opportunity without discrimination or harassment on the basis of race, color, religion, creed, age, sex, gender, gender identity, sexual orientation, national origin, citizenship, disability, marital and civil partnership/union status, pregnancy (including unlawful discrimination on the basis of a legally protected parental leave), veteran status, or any other characteristic protected by law. MSCI is also committed to working with and providing reasonable accommodations to individuals with disabilities. If you are an individual with a disability and would like to request a reasonable accommodation for any part of the application process, please email Disability.Assistance@msci.com and indicate the specifics of the assistance needed.
Please note, this e-mail is intended only for individuals who are requesting a reasonable workplace accommodation; it is not intended for other inquiries.

To all recruitment agencies: MSCI does not accept unsolicited CVs/resumes. Please do not forward CVs/resumes to any MSCI employee, location, or website. MSCI is not responsible for any fees related to unsolicited CVs/resumes.

Note on recruitment scams: We are aware of recruitment scams where fraudsters impersonating MSCI personnel may try to elicit personal information from job seekers. Read our full note on careers.msci.com.
Posted 2 weeks ago
3.0 - 5.0 years
13 - 17 Lacs
Gurugram
Work from Office
Senior Analyst - GCP Data Engineer: Elevate Your Impact Through Innovation and Learning

Evalueserve is a global leader in delivering innovative and sustainable solutions to a diverse range of clients, including over 30% of Fortune 500 companies. With a presence in more than 45 countries across five continents, we excel in leveraging state-of-the-art technology, artificial intelligence, and unparalleled subject matter expertise to elevate our clients' business impact and strategic decision-making. Our team of over 4,500 talented professionals operates in countries such as India, China, Chile, Romania, the US, and Canada. Our global network also extends to emerging markets like Colombia, the Middle East, and the rest of Asia-Pacific. Recognized by Great Place to Work in India, Chile, Romania, the US, and the UK in 2022, we offer a dynamic, growth-oriented, meritocracy-based culture that prioritizes continuous learning, skill development, and work-life balance.

About Data Analytics (DA)
Data Analytics is one of the highest-growth practices within Evalueserve, providing rewarding career opportunities. Established in 2014, the global DA team extends beyond 1,000 (and growing) data science professionals across data engineering, business intelligence, digital marketing, advanced analytics, technology, and product engineering. Our more tenured teammates, some of whom have been with Evalueserve since it started more than 20 years ago, have enjoyed leadership opportunities in different regions of the world across our seven business lines.

What you will be doing at Evalueserve
- Data Pipeline Development: Design and implement scalable ETL (Extract, Transform, Load) pipelines using tools like Cloud Dataflow, Apache Beam or Spark, and BigQuery.
- Data Integration: Integrate various data sources into unified data warehouses or lakes, ensuring seamless data flow.
- Data Transformation: Transform raw data into analyzable formats using tools like dbt (data build tool) and Dataflow.
- Performance Optimization: Continuously monitor and optimize data pipelines for speed, scalability, and cost-efficiency.
- Data Governance: Implement data quality standards, validation checks, and anomaly detection mechanisms.
- Collaboration: Work closely with data scientists, analysts, and business stakeholders to align data solutions with organizational goals.
- Documentation: Maintain detailed documentation of workflows and adhere to coding standards.

What we're looking for
- Proficiency in Python/PySpark and SQL for data processing and querying.
- Expertise in GCP services like BigQuery, Cloud Storage, Pub/Sub, Cloud Composer, and Dataflow.
- Familiarity with data warehouse and lakehouse principles and distributed data architectures.
- Strong problem-solving skills and the ability to handle complex projects under tight deadlines.
- Knowledge of data security and compliance best practices.
- Certification: GCP Professional Data Engineer.

Follow us on https://www.linkedin.com/company/evalueserve/

Learn more about what our leaders are talking about: our AI-powered supply chain optimization solution built on Google Cloud, how Evalueserve is now leveraging NVIDIA NIM to enhance our AI and digital transformation solutions and accelerate AI capabilities, and how Evalueserve has climbed 16 places on the 50 Best Firms for Data Scientists in 2024!

Want to learn more about our culture and what it's like to work with us? Write to us at: careers@evalueserve.com

Disclaimer: The following job description serves as an informative reference for the tasks you may be required to perform. However, it does not constitute an integral component of your employment agreement and is subject to periodic modifications to align with evolving circumstances.
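The validation and anomaly-detection checks mentioned under Data Governance can be as simple as a z-score gate run before loading data downstream. A minimal sketch (the threshold and sample row counts are illustrative assumptions):

```python
from statistics import mean, stdev

def flag_anomalies(values, threshold=3.0):
    """Return indices of points whose z-score exceeds the threshold - a
    minimal stand-in for the validation checks a pipeline runs pre-load."""
    if len(values) < 2:
        return []  # not enough data to estimate spread
    mu, sigma = mean(values), stdev(values)
    if sigma == 0:
        return []  # all values identical; nothing to flag
    return [i for i, v in enumerate(values) if abs(v - mu) / sigma > threshold]

# Daily row counts from a (hypothetical) ingestion job; one bad load stands out.
daily_rows = [1010, 998, 1005, 1002, 995, 40000]
print(flag_anomalies(daily_rows, threshold=2.0))  # prints: [5]
```

In practice the flagged partitions would be quarantined or re-run rather than silently loaded; managed alternatives exist (e.g., dbt tests or Dataplex data-quality rules), but the underlying check looks like this.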
Please Note: We appreciate the accuracy and authenticity of the information you provide, as it plays a key role in your candidacy. As part of the Background Verification Process, we verify your employment, education, and personal details. Please ensure all information is factual and submitted on time. For any assistance, your TA SPOC is available to support you.
Posted 3 weeks ago
0.0 - 5.0 years
2 - 7 Lacs
Bengaluru
Work from Office
- Teach English during school hours
- Teach Science during school hours
- Become a mentor for students
- Tutor 10th grade students after school
- Provide career counseling to 10th grade students
- Conduct extra-curricular activities at school and VF study centre
Posted 3 weeks ago
0.0 - 2.0 years
2 - 4 Lacs
Hyderabad
Work from Office
- Experience with containerization (Docker, Kubernetes).
- Knowledge of cloud platforms (AWS, Azure, GCP).
- Hands-on experience implementing GenAI RAG solutions using LangChain, LangGraph, and LlamaIndex in Python.
- Experience in developing AI/ML solutions utilizing Cloud APIs.

Responsibilities:
As a Full Stack Developer, you will be responsible for:
1. Backend Development:
- Design, develop, and maintain server-side logic using Python.
- Collaborate with other team members to integrate user-facing elements with server-side logic.
2. Frontend Development:
- Develop efficient and responsive user interfaces using HTML, CSS, and JavaScript frameworks.
- Ensure the technical feasibility of UI/UX designs.
3. Database Management:
- Design and implement database models and schemas.
- Optimize database performance and ensure data security.
4. API Development:
- Create robust and scalable RESTful APIs.
- Collaborate with frontend developers to integrate user-facing elements with server-side logic.
5. AI/ML Solutions:
- Implement GenAI RAG solutions using LangChain, LangGraph, and LlamaIndex in Python.
- Develop AI/ML solutions utilizing Cloud APIs.
6. Testing and Debugging:
- Conduct thorough testing of applications; identify and resolve bugs and performance bottlenecks.
7. Collaboration:
- Work closely with cross-functional teams to understand project requirements and deliver high-quality solutions.

Requirements:
- Bachelor's degree in Computer Science, Engineering, or a related field.
- Strong knowledge of Python web frameworks (Django, Flask).
- Experience with frontend technologies such as HTML, CSS, JavaScript, and related frameworks (React, Angular, or Vue.js).
- Proficiency in database systems (SQL, MongoDB, etc.).
- Familiarity with version control systems (Git).
- Excellent problem-solving and communication skills.
- Ability to work independently and collaborate effectively in a team environment.
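The core of a RAG solution is retrieve-then-prompt: fetch the most relevant documents, then feed them to the model alongside the question. Frameworks like LangChain and LlamaIndex do this with embedding-based retrievers; the toy sketch below uses simple word overlap instead, purely to illustrate the flow (the corpus and prompt shape are made up, and none of this is the frameworks' actual API):

```python
def retrieve(query, docs, k=2):
    """Rank documents by word overlap with the query - a toy stand-in
    for the embedding-based retriever a RAG framework provides."""
    q = set(query.lower().split())
    return sorted(docs, key=lambda d: len(q & set(d.lower().split())), reverse=True)[:k]

def build_prompt(query, docs):
    """Assemble the augmented prompt that would be sent to the LLM."""
    context = "\n".join(retrieve(query, docs))
    return f"Answer using only this context:\n{context}\n\nQuestion: {query}"

corpus = [
    "Refunds are processed within 5 business days.",
    "Our office is closed on public holidays.",
    "Refund requests require an order number.",
]
print(build_prompt("how do refunds work", corpus))
```

In a real system the retriever would be backed by a vector store, and the returned prompt would go to an LLM via a cloud API; the overall shape of the pipeline is the same.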
Posted 3 weeks ago
4.0 - 8.0 years
11 - 13 Lacs
Pune
Work from Office
- Strong hands-on experience in test automation using Python (mandatory). Proficiency in C# is a plus.
- Familiarity with Cloud/IoT technologies and CI/CD concepts, tools, and workflows.
- Experience with cloud connectivity protocols such as MQTT, LwM2M, and WebSocket.
- Knowledge of API testing using Python automation frameworks.
- Exposure to Azure Cloud services is desirable.

Mandatory Skills (as per customer requirements): Python automation and API testing using Python. Azure Cloud knowledge is good to have.
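API test automation of the kind described above typically stubs out the transport layer so tests can run without a live device, broker, or cloud backend. An illustrative unittest sketch (the endpoint path and payload shape are hypothetical, invented for this example):

```python
import json
import unittest

def get_device_status(client, device_id):
    """Call a device-status endpoint and parse the JSON body.
    (The /devices/{id}/status path is a hypothetical API.)"""
    resp = client.get(f"/devices/{device_id}/status")
    if resp.status != 200:
        raise RuntimeError(f"unexpected HTTP status {resp.status}")
    return json.loads(resp.body)

class FakeResponse:
    def __init__(self, status, body):
        self.status, self.body = status, body

class FakeClient:
    """Stand-in for a real HTTP client, so tests need no live backend."""
    def get(self, path):
        return FakeResponse(200, '{"device": "dev-42", "connected": true}')

class DeviceStatusTest(unittest.TestCase):
    def test_parses_connected_device(self):
        data = get_device_status(FakeClient(), "dev-42")
        self.assertTrue(data["connected"])

    def test_raises_on_error_status(self):
        class ErrClient:
            def get(self, path):
                return FakeResponse(503, "")
        with self.assertRaises(RuntimeError):
            get_device_status(ErrClient(), "dev-42")

unittest.main(argv=["x"], exit=False, verbosity=0)
```

The same pattern extends to MQTT or WebSocket connectivity tests: isolate the protocol client behind an interface, fake it in unit tests, and reserve the real broker for integration runs in CI.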
Posted 3 weeks ago
7.0 - 12.0 years
40 - 45 Lacs
Pune
Work from Office
Job Title: Data Platform Engineer - Tech Lead
Location: Pune, India

Role Description
DB Technology is a global team of tech specialists, spread across multiple trading hubs and tech centers. We have a strong focus on promoting technical excellence - our engineers work at the forefront of financial services innovation using cutting-edge technologies. The DB Pune location plays a prominent role in our global network of tech centers; it is well recognized for its engineering culture and strong drive to innovate. We are committed to building a diverse workforce and to creating excellent opportunities for talented engineers and technologists. Our tech teams and business units use agile ways of working to create the best solutions for the financial markets.

CB Data Services and Data Platform
We are seeking an experienced Software Engineer with strong leadership skills to join our dynamic tech team. In this role, you will lead a group of engineers working on cutting-edge technologies in Hadoop, Big Data, GCP, Terraform, BigQuery, Dataproc, and data management. You will be responsible for overseeing the development of robust data pipelines, ensuring data quality, and implementing efficient data management solutions. Your leadership will be critical in driving innovation, ensuring high standards in data infrastructure, and mentoring team members. Your responsibilities will include working closely with data engineers, analysts, cross-functional teams, and other stakeholders to ensure that our data platform meets the needs of our organization and supports our data-driven initiatives. Join us in building and scaling our tech solutions, including a hybrid data platform, to unlock new insights and drive business growth. If you are passionate about data engineering, we want to hear from you!

Deutsche Bank's Corporate Bank division is a leading provider of cash management, trade finance, and securities finance.
We complete green-field projects that deliver the best Corporate Bank - Securities Services products in the world. Our team is diverse, international, and driven by a shared focus on clean code and valued delivery. At every level, agile minds are rewarded with competitive pay, support, and opportunities to excel. You will work as part of a cross-functional agile delivery team. You will bring an innovative approach to software development, focusing on using the latest technologies and practices, as part of a relentless focus on business value. You will be someone who sees engineering as a team activity, with a predisposition to open code, open discussion, and creating a supportive, collaborative environment. You will be ready to contribute to all stages of software delivery, from initial analysis right through to production support.

What we'll offer you
- 100% reimbursement under the childcare assistance benefit (gender neutral)
- Sponsorship for industry-relevant certifications and education
- Accident and term life insurance

Your key responsibilities
Technical Leadership:
- Lead a cross-functional team of engineers in the design, development, and implementation of on-prem and cloud-based data solutions.
- Provide hands-on technical guidance and mentorship to team members, fostering a culture of continuous learning and improvement.
- Collaborate with product management and stakeholders to define technical requirements and establish delivery priorities.
Architectural and Design Capabilities:
- Architect and implement scalable, efficient, and reliable data management solutions to support complex data workflows and analytics.
- Evaluate and recommend tools, technologies, and best practices to enhance the data platform.
- Drive the adoption of microservices, containerization, and serverless architectures within the team.
Quality Assurance:
- Establish and enforce best practices in coding, testing, and deployment to maintain high-quality code standards.
- Oversee code reviews and provide constructive feedback to promote code quality and team growth.

Your skills and experience
Technical Skills:
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
- 7+ years of experience in software engineering, with a focus on Big Data and GCP technologies such as Hadoop, PySpark, Terraform, BigQuery, Dataproc, and data management.
- Proven experience in leading software engineering teams, with a focus on mentorship, guidance, and team growth.
- Strong expertise in designing and implementing data pipelines, including ETL processes and real-time data processing.
- Hands-on experience with Hadoop ecosystem tools such as HDFS, MapReduce, Hive, Pig, and Spark.
- Hands-on experience with a cloud platform, particularly Google Cloud Platform (GCP), and its data management services (e.g., Terraform, BigQuery, Cloud Dataflow, Cloud Dataproc, Cloud Storage).
- Solid understanding of data quality management and best practices for ensuring data integrity.
- Familiarity with containerization and orchestration tools such as Docker and Kubernetes is a plus.
- Excellent problem-solving skills and the ability to troubleshoot complex systems.
- Strong communication skills and the ability to collaborate with both technical and non-technical stakeholders.
Leadership Abilities:
- Proven experience in leading technical teams, with a track record of delivering complex projects on time and within scope.
- Ability to inspire and motivate team members, promoting a collaborative and innovative work environment.
- Strong problem-solving skills and the ability to make data-driven decisions under pressure.
- Excellent communication and collaboration skills.
- Proactive mindset, attention to detail, and a constant desire to improve and innovate.

How we'll support you
About us and our teams: Please visit our company website for further information: https://www.db.com/company/company.htm
We strive for a culture in which we are empowered to excel together every day.
This includes acting responsibly, thinking commercially, taking initiative and working collaboratively. Together we share and celebrate the successes of our people. Together we are Deutsche Bank Group. We welcome applications from all people and promote a positive, fair and inclusive work environment.
Posted 3 weeks ago
1.0 - 5.0 years
1 - 4 Lacs
Hyderabad, Chennai, Bengaluru
Work from Office
Cloud Automation Engineer
Location: Chennai, Hyderabad, Bangalore
Experience: 1-5 years

Job Summary
The Cloud Automation Engineer develops and maintains automation solutions to streamline cloud operations and deployments.

Key Responsibilities
- Automate infrastructure provisioning and configuration.
- Develop CI/CD pipelines for cloud-native applications.
- Create scripts and tools for operational efficiency.
- Integrate monitoring and alerting into automated workflows.
- Ensure automation aligns with security and compliance standards.

Required Skills
- Proficiency in scripting (Python, Bash, PowerShell).
- Experience with automation tools (Terraform, Ansible, Jenkins).
- Familiarity with cloud APIs and SDKs.
- Knowledge of DevOps practices and tools.
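A recurring building block in the scripting work described above is retrying flaky cloud API calls with exponential backoff, since provisioning endpoints routinely return transient errors. A minimal Python sketch (the delays and the simulated endpoint are illustrative, not any particular cloud SDK):

```python
import time

def with_retries(call, attempts=4, base_delay=0.1, sleep=time.sleep):
    """Retry a flaky cloud-API call with exponential backoff - a common
    guard in provisioning scripts (delay values here are illustrative)."""
    for attempt in range(attempts):
        try:
            return call()
        except ConnectionError:
            if attempt == attempts - 1:
                raise  # out of attempts; surface the error
            sleep(base_delay * (2 ** attempt))  # 0.1s, 0.2s, 0.4s, ...

# Simulate a provisioning endpoint that fails twice before succeeding.
state = {"calls": 0}
def flaky_provision():
    state["calls"] += 1
    if state["calls"] < 3:
        raise ConnectionError("transient API error")
    return "instance-ready"

print(with_retries(flaky_provision, sleep=lambda s: None))  # prints: instance-ready
```

Injecting `sleep` as a parameter keeps the helper testable without real waits; production SDKs (e.g., Google and AWS client libraries) ship equivalent retry policies built in.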
Posted 1 month ago
6.0 - 11.0 years
4 - 8 Lacs
Bengaluru
Work from Office
React Developer

Responsibilities
- Developing new user-facing features using React.js
- Building reusable components and front-end libraries for future use
- Translating user stories and wireframes into high-quality code
- Creating applications that provide fantastic UI/UX and responsive design
- Integrating apps with third-party APIs and Cloud APIs
- Applying core Computer Science concepts to improve consumer web apps
- Profiling and improving our frontend performance
- Designing for scalability and adherence to standards

Required Skills:
- Should be excellent in UI development using the React framework
- Should be strong in Redux or Flux
- Should be strong in JavaScript (ES6 and above standards)
Posted 1 month ago
4.0 - 6.0 years
5 - 9 Lacs
Bengaluru
Work from Office
Minimum 3.5 years of experience with end-to-end MuleSoft integration (Anypoint Platform) across various systems/applications: SaaS, legacy systems, databases, and web services (SOAP & REST).
- Knowledge of integration design patterns.
- Hands-on experience using Mule connectors such as Salesforce, FTP, File, SFTP, IMAP, Database, HTTP, etc.
- Experience in developing middle-tier applications using ESB Mule (API and batch processing).
- Experience in RDBMS: SQL queries, functions, and stored procedures.
- Strong knowledge of data transformations using MuleSoft DataWeave and exception handling.
- Hands-on experience with Mule 4, RAML 1.0, Maven, MUnit, and the current version of MuleSoft Anypoint Studio and Anypoint Platform in a cloud implementation, on-prem, or Runtime Fabric.
- Security, logging, auditing, policy management, and performance monitoring and KPIs for end-to-end process execution.
- Experience with MuleSoft and Java integration; basic knowledge of Java.
- Intermediate-level knowledge of web services technologies (XML, SOAP, REST, XSLT) and Cloud APIs.
- Basic knowledge of Salesforce in a cloud implementation.

Other Qualifications
- Familiarity with Agile (Scrum) project management methodology is nice to have.
- Familiarity with Salesforce in a cloud implementation.
- Familiarity with the Microsoft Office suite, including Visio and draw.io.

If you are interested, please share the below details and your updated resume:
- First Name
- Last Name
- Date of Birth
- Passport No. and Expiry Date
- Alternate Contact Number
- Total Experience
- Relevant Experience
- Current CTC
- Expected CTC
- Current Location
- Preferred Location
- Current Organization
- Payroll Company
- Notice Period
- Holding any offer
Posted 1 month ago
5.0 - 10.0 years
9 - 19 Lacs
Pune, Chennai, Bengaluru
Hybrid
Project Role: Cloud Platform Architect
Project Role Description: Oversee application architecture and deployment in cloud platform environments - including public cloud, private cloud, and hybrid cloud. This can include cloud adoption plans, cloud application design, and cloud management and monitoring.
Must-have skills: Google Cloud Platform Architecture

Summary: As a Cloud Platform Architect, you will be responsible for overseeing application architecture and deployment in cloud platform environments, including public cloud, private cloud, and hybrid cloud. Your typical day will involve designing cloud adoption plans, managing and monitoring cloud applications, and ensuring cloud application design meets business requirements.

Roles & Responsibilities:
- Design and implement cloud adoption plans, including public cloud, private cloud, and hybrid cloud environments.
- Oversee cloud application design, ensuring it meets business requirements and aligns with industry best practices.
- Manage and monitor cloud applications, ensuring they are secure, scalable, and highly available.
- Collaborate with cross-functional teams to ensure cloud applications are integrated with other systems and services.
- Stay up to date with the latest advancements in cloud technology, integrating innovative approaches for sustained competitive advantage.

Professional & Technical Skills:
- Must-Have Skills: Strong experience in Google Cloud Platform Architecture.
- Good-to-Have Skills: Experience with other cloud platforms such as AWS or Azure.
- Experience in designing and implementing cloud adoption plans.
- Strong understanding of cloud application design and architecture.
- Experience in managing and monitoring cloud applications.
- Solid grasp of cloud security, scalability, and availability best practices.
Posted 1 month ago
10.0 - 15.0 years
10 - 15 Lacs
Bengaluru / Bangalore, Karnataka, India
On-site
We are seeking a highly experienced and versatile Staff Solutions Test Engineer with deep expertise in cloud-native product testing and experience with on-premise solutions. The ideal candidate will possess strong automation and scripting skills, particularly in Python or Golang, and a proven track record in designing, executing, and troubleshooting complex test scenarios across various cloud and data-centric environments. This role requires a strong technical background and an ability to work independently in a dynamic setting.
Key Responsibilities:
- Test Automation & Scripting: Design, develop, and maintain robust test automation frameworks and scripts using Python or Golang. Implement automation solutions for cloud-native products, cloud APIs, and on-premise systems.
- Cloud-Native Product Testing: Lead and execute comprehensive testing of cloud-native products, ensuring their functionality, performance, scalability, and reliability. Conduct cloud API testing and troubleshooting, leveraging tools like Fiddler/Charles and the ELK stack for in-depth analysis and debugging.
- AWS-Based DevOps & Solution Design: Contribute to AWS-based DevOps, test, and solution design, ensuring testability and quality from the early stages of development. Work within CI/CD pipelines to integrate and automate testing processes.
- Data-Centric Product Testing: Test data analysis products, BI systems, and data mining products, verifying data accuracy, consistency, and reporting integrity. Potentially contribute to building pipelines that ingest terabytes of data spanning billions of rows, and ensure their quality and reliability.
- Troubleshooting & Debugging: Perform advanced troubleshooting and debugging using a range of tools, including the ELK stack, Fiddler, and Charles.
- Collaboration & Mentorship: Collaborate effectively with development, DevOps, and product teams to ensure comprehensive test coverage and high-quality deliverables. Mentor junior engineers and contribute to best practices in testing.
Desired/Added-Advantage Skills & Experience:
- Performance & Network Testing Tools: Experience with Spirent TestCenter and/or Ixia network testers for advanced network and protocol testing.
- IoT & Smart Home: Experience with smart home deployments and use cases, including testing smart home devices and ecosystems.
- Cloud Security Testing: Expertise in cloud security testing, identifying vulnerabilities and ensuring secure cloud deployments.
- Specialized Equipment Testing: Experience with equipment testing in IP/Video/FE/Client Ethernet/IPv6, and xPON product testing.
- Pre-Sales Experience: Prior pre-sales experience in a technical or solutions role, demonstrating the ability to articulate technical solutions and value propositions to clients.
Required Skills & Qualifications:
- Total Years of Experience: 10+ years.
- Relevant Years of Experience: 6+ years.
- Mandatory Skills: Python or Golang (for test automation and scripting), test automation (general), cloud-native product testing, cloud API testing, troubleshooting/debugging (with ELK stack tools/Fiddler/Charles).
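The test-automation work described above often comes down to making checks resilient to transient cloud failures while keeping them unit-testable. A minimal sketch (the names `check_endpoint` and `FlakyClient` are invented for illustration, not part of any real framework): a retrying health check whose HTTP client is injected, so transient-failure handling can be exercised against a fake instead of a live API.

```python
import time

def check_endpoint(client, path, retries=3, backoff=0.0):
    """Poll an API endpoint, retrying transient failures with exponential backoff.

    `client` is any object with a `get(path)` method returning an HTTP status
    code; injecting it lets tests substitute a fake for a live cloud API.
    """
    for attempt in range(retries):
        if client.get(path) == 200:
            return True
        time.sleep(backoff * (2 ** attempt))  # back off before the next attempt
    return False

class FlakyClient:
    """Fake client that fails twice, then succeeds -- simulates a transient fault."""
    def __init__(self):
        self.calls = 0
    def get(self, path):
        self.calls += 1
        return 200 if self.calls >= 3 else 503

client = FlakyClient()
assert check_endpoint(client, "/health", retries=3) is True
assert client.calls == 3  # succeeded on the third attempt
```

In a real framework the fake would typically be replaced by a mocked transport (e.g. `unittest.mock`), but the injection pattern is the same.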
Posted 1 month ago
5.0 - 7.0 years
10 - 15 Lacs
Bengaluru
Work from Office
5+ years in embedded software (AOSP, HMI). Deep knowledge of Agile and experience working in cross-functional teams. Familiarity with DevOps, CI/CD, and cloud APIs. Able to balance feature delivery with regulatory and compliance needs.
Posted 1 month ago
5.0 - 8.0 years
27 - 42 Lacs
Bengaluru
Work from Office
Job Summary
The Engineering Tools and Services organization is responsible for bringing efficiency and consistency to the way we automate, execute, and triage tests and report results. We support the core ONTAP product team of more than 2,500 engineers.
About the Team
We are looking for a Software Tools and Build Systems Engineer experienced in developing and supporting software builds, build operations, or software tools in a UNIX environment. In this position you will work as part of the team developing and enhancing NetApp's best-in-class build system, driving improvements in the development environment and improving development productivity through better tools, build architecture, and processes.
Job Responsibilities and Requirements
• Build and maintain software versions regularly for multiple platforms such as AWS, GCP, Azure, and IBM, ensuring timely updates and releases.
• Create and use tools to automate repetitive tasks, making processes faster and more reliable.
• Monitor systems to ensure they are running smoothly, identifying and fixing issues before they become problems.
• Respond to and resolve technical issues quickly to minimize disruptions.
• Work closely with different teams to ensure projects and product releases are completed on schedule.
Technical Skills
• Strong programming skills in Go, Perl, or Python.
• Familiarity with OO design, web development, and cloud APIs.
• Experience in a Linux environment with containers (Docker and Kubernetes).
• Familiarity with Agile concepts, Continuous Integration, and Continuous Delivery.
• A creative, analytical approach to problem solving.
Education
• A minimum of 4 years of experience is required; 5-8 years of experience is preferred.
• A Bachelor of Science degree in Electrical Engineering or Computer Science, a Master's degree, or equivalent experience is required.
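The "build and maintain software versions for multiple platforms" duty above usually involves small automation tools of exactly this shape. A hedged sketch (the function `latest_builds` is hypothetical, not a NetApp tool): given build records, pick the newest version per platform, comparing dotted versions numerically rather than lexically.

```python
def latest_builds(builds):
    """Given (platform, version) pairs, return the highest version per platform.

    Versions are dotted integer strings (e.g. "1.10.2"); comparing them as
    integer tuples means "1.10" correctly sorts above "1.9".
    """
    def key(version):
        return tuple(int(part) for part in version.split("."))
    latest = {}
    for platform, version in builds:
        if platform not in latest or key(version) > key(latest[platform]):
            latest[platform] = version
    return latest

builds = [("aws", "1.9.0"), ("aws", "1.10.2"), ("gcp", "2.0.1"), ("azure", "0.4.7")]
assert latest_builds(builds) == {"aws": "1.10.2", "gcp": "2.0.1", "azure": "0.4.7"}
```

The tuple-key trick is the standard way to avoid the classic string-comparison bug where "1.9" sorts above "1.10".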
Posted 1 month ago
8.0 - 11.0 years
35 - 37 Lacs
Kolkata, Ahmedabad, Bengaluru
Work from Office
Dear Candidate,
We are hiring a Cloud Operations Engineer to manage and optimize cloud-based environments. Ideal for engineers passionate about automation, monitoring, and cloud-native technologies.
Key Responsibilities:
- Maintain cloud infrastructure (AWS, Azure, GCP)
- Automate deployments and system monitoring
- Ensure availability, performance, and cost optimization
- Troubleshoot incidents and resolve system issues
Required Skills & Qualifications:
- Hands-on experience with cloud platforms and DevOps tools
- Proficiency in scripting (Python, Bash) and IaC (Terraform, CloudFormation)
- Familiarity with logging/monitoring tools (CloudWatch, Datadog, etc.)
- Bonus: Experience with Kubernetes or serverless architectures
Note: If interested, please share your updated resume and a preferred time for a discussion. If shortlisted, our HR team will contact you.
Kandi Srinivasa
Delivery Manager
Integra Technologies
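The monitoring responsibility above boils down to evaluating metric readings against alert thresholds. A minimal sketch of that rule engine, assuming invented names (`evaluate_metrics` is not a CloudWatch or Datadog API; real tools express the same idea as alarm definitions):

```python
def evaluate_metrics(metrics, thresholds):
    """Compare current metric readings against alert thresholds.

    Returns a (metric, value, threshold) triple for every metric that
    breaches its threshold -- the core of a simple alerting rule.
    """
    alerts = []
    for name, value in metrics.items():
        limit = thresholds.get(name)
        if limit is not None and value > limit:
            alerts.append((name, value, limit))
    return alerts

metrics = {"cpu_percent": 91.0, "disk_percent": 62.5, "error_rate": 0.002}
thresholds = {"cpu_percent": 85.0, "disk_percent": 90.0, "error_rate": 0.01}
assert evaluate_metrics(metrics, thresholds) == [("cpu_percent", 91.0, 85.0)]
```

In production the same comparison would run against a time-windowed aggregate (e.g. a 5-minute average) rather than a single sample, to avoid alerting on momentary spikes.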
Posted 2 months ago
6.0 - 8.0 years
40 - 45 Lacs
Pune
Work from Office
Job Title: Data Platform Engineer - Tech Lead
Location: Pune, India
Role Description
DB Technology is a global team of tech specialists, spread across multiple trading hubs and tech centers. We have a strong focus on promoting technical excellence: our engineers work at the forefront of financial services innovation using cutting-edge technologies. The DB Pune location plays a prominent role in our global network of tech centers and is well recognized for its engineering culture and strong drive to innovate. We are committed to building a diverse workforce and to creating excellent opportunities for talented engineers and technologists. Our tech teams and business units use agile ways of working to create the best solutions for the financial markets.
CB Data Services and Data Platform
We are seeking an experienced Software Engineer with strong leadership skills to join our dynamic tech team. In this role, you will lead a group of engineers working on cutting-edge technologies in Hadoop, Big Data, GCP, Terraform, BigQuery, Dataproc, and data management. You will be responsible for overseeing the development of robust data pipelines, ensuring data quality, and implementing efficient data management solutions. Your leadership will be critical in driving innovation, ensuring high standards in data infrastructure, and mentoring team members. Your responsibilities will include working closely with data engineers, analysts, cross-functional teams, and other stakeholders to ensure that our data platform meets the needs of our organization and supports our data-driven initiatives. Join us in building and scaling our tech solutions, including a hybrid data platform, to unlock new insights and drive business growth. If you are passionate about data engineering, we want to hear from you!
Deutsche Bank's Corporate Bank division is a leading provider of cash management, trade finance, and securities finance.
We complete green-field projects that deliver the best Corporate Bank - Securities Services products in the world. Our team is diverse, international, and driven by a shared focus on clean code and valued delivery. At every level, agile minds are rewarded with competitive pay, support, and opportunities to excel. You will work as part of a cross-functional agile delivery team. You will bring an innovative approach to software development, focusing on using the latest technologies and practices, as part of a relentless focus on business value. You will be someone who sees engineering as a team activity, with a predisposition to open code, open discussion, and creating a supportive, collaborative environment. You will be ready to contribute to all stages of software delivery, from initial analysis right through to production support.
What we'll offer you
As part of our flexible scheme, here are just some of the benefits that you'll enjoy:
- Best-in-class leave policy
- Gender-neutral parental leave
- 100% reimbursement under the childcare assistance benefit (gender neutral)
- Sponsorship for industry-relevant certifications and education
- Employee Assistance Program for you and your family members
- Comprehensive hospitalization insurance for you and your dependents
- Accident and term life insurance
- Complimentary health screening for those 35 yrs. and above
Your key responsibilities
Technical Leadership:
- Lead a cross-functional team of engineers in the design, development, and implementation of on-prem and cloud-based data solutions.
- Provide hands-on technical guidance and mentorship to team members, fostering a culture of continuous learning and improvement.
- Collaborate with product management and stakeholders to define technical requirements and establish delivery priorities.
Architectural and Design Capabilities:
- Architect and implement scalable, efficient, and reliable data management solutions to support complex data workflows and analytics.
- Evaluate and recommend tools, technologies, and best practices to enhance the data platform.
- Drive the adoption of microservices, containerization, and serverless architectures within the team.
Quality Assurance:
- Establish and enforce best practices in coding, testing, and deployment to maintain high-quality code standards.
- Oversee code reviews and provide constructive feedback to promote code quality and team growth.
Your skills and experience
Technical Skills:
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
- 7+ years of experience in software engineering, with a focus on Big Data and GCP technologies such as Hadoop, PySpark, Terraform, BigQuery, Dataproc, and data management.
- Proven experience in leading software engineering teams, with a focus on mentorship, guidance, and team growth.
- Strong expertise in designing and implementing data pipelines, including ETL processes and real-time data processing.
- Hands-on experience with Hadoop ecosystem tools such as HDFS, MapReduce, Hive, Pig, and Spark.
- Hands-on experience with a cloud platform, particularly Google Cloud Platform (GCP), and its data management services (e.g., Terraform, BigQuery, Cloud Dataflow, Cloud Dataproc, Cloud Storage).
- Solid understanding of data quality management and best practices for ensuring data integrity.
- Familiarity with containerization and orchestration tools such as Docker and Kubernetes is a plus.
- Excellent problem-solving skills and the ability to troubleshoot complex systems.
- Strong communication skills and the ability to collaborate with both technical and non-technical stakeholders.
Leadership Abilities:
- Proven experience in leading technical teams, with a track record of delivering complex projects on time and within scope.
- Ability to inspire and motivate team members, promoting a collaborative and innovative work environment.
- Strong problem-solving skills and the ability to make data-driven decisions under pressure.
Excellent communication and collaboration skills. Proactive mindset, attention to details, and constant desire to improve and innovate. How we'll support you Training and development to help you excel in your career Coaching and support from experts in your team A culture of continuous learning to aid progression A range of flexible benefits that you can tailor to suit your needs About us and our teams Please visit our company website for further information: https://www.db.com/company/company.htm We strive for a culture in which we are empowered to excel together every day. This includes acting responsibly, thinking commercially, taking initiative and working collaboratively. Together we share and celebrate the successes of our people. Together we are Deutsche Bank Group. We welcome applications from all people and promote a positive, fair and inclusive work environment.
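The pipeline and data-quality responsibilities this posting describes follow the classic extract-transform-load shape. A toy sketch under stated assumptions (in-memory lists stand in for real sources and sinks; the quality rule and field names are invented for illustration):

```python
def extract(rows):
    """Extract stage: yield raw records (a list stands in for a real source)."""
    yield from rows

def transform(record):
    """Transform stage: normalise fields; return None for records failing quality checks."""
    if not record.get("id") or record.get("amount") is None:
        return None  # data-quality rule: reject incomplete records
    return {"id": str(record["id"]).strip(), "amount": round(float(record["amount"]), 2)}

def load(records, sink):
    """Load stage: append cleaned records to the sink (a list stands in for a table)."""
    for r in records:
        sink.append(r)

raw = [{"id": " 7 ", "amount": "19.999"}, {"id": None, "amount": "3"}, {"id": "8", "amount": 5}]
sink = []
load(filter(None, (transform(r) for r in extract(raw))), sink)
assert sink == [{"id": "7", "amount": 20.0}, {"id": "8", "amount": 5.0}]
```

The generator-based composition keeps each stage independently testable, which is the same property the posting asks for when it emphasises data quality and maintainable pipelines; tools like Spark or Dataflow scale this pattern across a cluster.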
Posted 2 months ago
5.0 - 10.0 years
1 - 4 Lacs
Chennai, Tiruchirapalli
Work from Office
Job Title: SFMC Developer
Experience: 5-10 Years
Location: Chennai, Trichy (Hybrid)
We are looking for a Salesforce Marketing Cloud (SFMC) Engagement Developer to join our dynamic team. The ideal candidate will have expertise in configuring, developing, and implementing solutions within SFMC, with a strong focus on Journey Builder, Automation Studio, Email Studio, and AMPscript. The role requires a deep understanding of customer segmentation, personalization, and multi-channel campaign execution to enhance engagement and drive business growth.
Key Responsibilities:
- Develop and implement personalized, automated customer journeys using Journey Builder and Automation Studio.
- Design and optimize email templates using AMPscript, HTML, and CSS for dynamic content personalization.
- Work with Marketing Cloud APIs to integrate external data sources and enable seamless data synchronization.
- Leverage Data Extensions, SQL, and Contact Builder to manage and segment audiences efficiently.
- Collaborate with marketing, data, and business teams to translate requirements into SFMC solutions.
- Implement triggered sends, transactional messaging, and audience targeting to improve engagement.
- Monitor campaign performance, troubleshoot issues, and optimize workflows for efficiency.
- Ensure compliance with email marketing best practices, CAN-SPAM, and GDPR regulations.
Required Skills & Qualifications:
- 5+ years of hands-on experience in Salesforce Marketing Cloud (SFMC) development.
- Proficiency in Journey Builder, Automation Studio, Email Studio, and Content Builder.
- Strong expertise in AMPscript, SQL, and SSJS (Server-Side JavaScript) for dynamic content and automation.
- Experience with REST and SOAP APIs for integrating SFMC with external systems.
- Understanding of Data Extensions, Contact Builder, and audience segmentation.
- Experience in multi-channel campaign execution (Email, SMS, Push, Advertising Studio).
- Ability to analyze marketing data and optimize customer engagement strategies.
Strong problem-solving skills with a data-driven approach to campaign improvements. Familiarity with Salesforce CRM, Data Cloud, and Marketing Cloud Intelligence (Datorama) is a plus. Salesforce Marketing Cloud Developer Certification is highly desirable. If you are passionate about data-driven marketing and automation and have the technical skills to deliver personalized customer experiences, we’d love to hear from you!
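The audience-segmentation work above is, at its core, a date-windowed filter over contact records. A hedged Python sketch of that logic (the data shape and the function `segment_engaged` are invented; in SFMC itself this would be a SQL query activity against a data extension):

```python
from datetime import date, timedelta

def segment_engaged(contacts, days=30, today=date(2024, 6, 30)):
    """Return subscriber keys for contacts who opened an email within `days` days.

    Mirrors the kind of data-extension filter usually expressed in SFMC SQL,
    roughly: SELECT SubscriberKey FROM Opens
             WHERE EventDate >= DATEADD(day, -30, GETDATE()).
    """
    cutoff = today - timedelta(days=days)
    return sorted(c["key"] for c in contacts if c["last_open"] >= cutoff)

contacts = [
    {"key": "A1", "last_open": date(2024, 6, 25)},
    {"key": "B2", "last_open": date(2024, 4, 1)},
    {"key": "C3", "last_open": date(2024, 6, 2)},
]
assert segment_engaged(contacts) == ["A1", "C3"]
```

Fixing `today` as a parameter (rather than calling `date.today()` inside) keeps the segment deterministic and testable, the same discipline the posting's "data-driven approach to campaign improvements" implies.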
Posted 2 months ago
8.0 - 12.0 years
14 - 19 Lacs
Bengaluru
Work from Office
In this Software Architect role, you will:
- Provide day-to-day technical leadership to IC4V development teams; deliver detailed technical designs and lead implementation of technical deliverables to IBM Cloud clients' satisfaction, on time, with quality, and within deployment SLAs.
- Interface with offering managers to understand customer requirements, then assess and design software solution architectures that meet those requirements, win in the market, and beat the competition.
- Provide technical sizing references to support release managers' and development managers' planning and prioritization.
- Work across IBM business units, IBM Cloud IaaS teams, and third-party vendors and suppliers to co-design and co-develop IC4V solutions with value-add services that enrich the client experience.
- Use deep expertise and strategic technical insight on cloud infrastructure, cloud computing, and automation to build IBM IC4V solutions that help expedite clients' path to IBM Cloud.
- Stay abreast of emerging technology trends and solutions in cloud infrastructure, cloud compute, VMware, and the competition.
- Interface with executives to provide technical references, and serve as a liaison among technical resources in IBM, external partners and vendors, and stakeholders.
- Mentor and develop team members and the technical community.
This role requires strong communication across the IC4V team and global IBM teams in different functions, a deep understanding of IBM's cloud technologies and VMware offerings, and outstanding technical leadership to define architecture and design, driving new releases and new solutions that deliver on very aggressive schedules and win in the market. This role also interacts with many internal teams and external partner and vendor companies to optimize integrated design for a best-in-class customer experience that meets business and compliance requirements.
It requires an exceptionally strong development background and rich experience in the IT industry, gained over many years across different business units and roles, different products, architectures, and markets, along with good experience working with clients.
Required education: Bachelor's Degree
Preferred education: Master's Degree
Required technical and professional expertise
- Architect and design end-to-end VMware Cloud Foundation (VCF) solutions, including vSphere, NSX-T, vSAN, vCenter, and VMware Cloud Director (VCD).
- Develop high-level and low-level designs for VMware-based infrastructure.
- Ensure VMware solutions align with business requirements, security policies, and industry best practices.
- Work closely with partners and vendors, communicating the architectural vision and priorities to align technical solutions with business needs.
- Deploy and configure VMware NSX for network virtualization and micro-segmentation.
- Implement and optimize vSAN for hyper-converged infrastructure solutions.
- Manage and administer vCenter Server for virtualization management and automation.
- Oversee the deployment and integration of VMware components in hybrid/multi-cloud environments.
- Monitor, analyze, and fine-tune VMware infrastructure for high performance and availability.
- Optimize vSAN storage policies for efficient data management and disaster recovery.
- Troubleshoot performance bottlenecks and implement corrective measures.
- Provide technical guidance and mentorship to junior engineers and system administrators.
- Lead troubleshooting and root cause analysis for VMware-related issues.
- Implement security best practices in NSX, ensuring micro-segmentation and zero-trust policies.
- Configure role-based access controls (RBAC) and multi-tenancy in VMware environments.
- Leverage PowerCLI, vRealize Automation (vRA), and Terraform for automation and infrastructure as code (IaC).
- Develop scripts and workflows to streamline deployments, backups, and monitoring.
- Work closely with cross-functional teams, including network, storage, and security teams.
Preferred technical and professional experience
- VMware Certified Professional (VCP), VCIX, or VCDX certification.
- Experience with software-defined data centers (SDDC).
- Knowledge of disaster recovery solutions like VMware SRM (Site Recovery Manager).
- Experience in integrating VMware solutions with DevOps pipelines.
- Extensive experience with resiliency and data protection technologies: Veeam, Zerto, or equivalent.
- Extensive experience leveraging and designing solutions with Microsoft Active Directory.
- Experience with IBM Cloud API integration development.
- Experience with Kubernetes, Docker, and Red Hat OpenShift.
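The infrastructure-as-code work the posting names (PowerCLI, vRA, Terraform) shares one core idea: declare a desired end state and let the tool compute the creates, updates, and deletes needed to reach it. A language-agnostic sketch of that reconciliation loop (the function `reconcile` and the VM specs are illustrative, not any vendor's API):

```python
def reconcile(desired, actual):
    """Compute the actions needed to move `actual` state to `desired` state.

    This diff-then-act loop is the idea behind IaC tools like Terraform:
    declare the end state; the tool derives creates, updates, and deletes.
    """
    actions = []
    for name, spec in desired.items():
        if name not in actual:
            actions.append(("create", name, spec))
        elif actual[name] != spec:
            actions.append(("update", name, spec))
    for name in actual:
        if name not in desired:
            actions.append(("delete", name, None))
    return actions

desired = {"web-vm": {"cpus": 4}, "db-vm": {"cpus": 8}}
actual = {"web-vm": {"cpus": 2}, "old-vm": {"cpus": 1}}
assert sorted(reconcile(desired, actual)) == [
    ("create", "db-vm", {"cpus": 8}),
    ("delete", "old-vm", None),
    ("update", "web-vm", {"cpus": 4}),
]
```

Because the plan is computed before anything is applied, a run is idempotent: reconciling a state against itself yields no actions, which is what makes declarative tooling safe to re-run.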
Posted 2 months ago
1 - 3 years
3 - 7 Lacs
Bengaluru
Work from Office
Our organization offers a highly visible, large-scale enterprise platform delivering standardized and automated VMware virtualization solutions on IBM Cloud, including a growing number of high-value add-on services and solutions. We bring together IBM Cloud infrastructure and VMware technologies to simplify configuration and management, with easy movement of VMware workloads from on-premise to IBM Cloud. In our fast-paced and expanding organization, we foster an environment of continuous innovation, working in agile teams to deliver the latest technology and provide excellent support to our clients. We are hiring motivated software engineers to be part of our dynamic development team to build and maintain these automation solutions. Qualified candidates will be responsible for:
- Developing and maintaining automation of software solutions in VMware environments.
- Writing and executing unit, functional, and integration test cases.
- Supporting integration of enterprise-grade, complex VMware cloud environments.
- Following compliant procedures and secure engineering best practices.
This role is central to our ability to deliver valuable automated and integrated solutions on IBM Cloud. Visit our VMware Solutions page (https://www.ibm.com/cloud/vmware) for more details on our offerings on IBM Cloud.
Required education: Bachelor's Degree
Preferred education: Master's Degree
Required technical and professional expertise
- 5-7 years of experience with Python, system automation, scripting, and development; Postgres (or any RDBMS) and Cloudant databases; software engineering and testing.
- 3-5 years of experience with Linux system administration or development.
- Good communication skills in English.
Preferred technical and professional experience
- 5+ years of experience with RESTful APIs.
- Experience with GitHub issue and code management, Ansible, VMware administration, VMware API integration development, and vRealize Operations/Log Insight.
- 3+ years of experience with cloud networking, including VMware NSX, and IBM Cloud API integration development; Windows Active Directory administration or development.
- Windows SQL Server administration or development.
- Jenkins builds and platforms.
- Web UI development.
- Networking and network security components: firewalls, gateways.
- Secure engineering principles and best practices.
- Security standards: authentication, authorization, and encryption protocols.
Posted 2 months ago
5 - 10 years
9 - 19 Lacs
Chennai, Bengaluru, Mumbai (All Areas)
Hybrid
Google BigQuery
Location: Pan India
Project Role Description: Analyze, design, code, and test multiple components of application code across one or more clients. Perform maintenance, enhancements, and/or development work.
Key Responsibilities: Analyze and model client market and key performance data. Use analytical tools and techniques to develop business insights and improve decision-making.
1. Dataproc, Pub/Sub, Dataflow, Kafka streaming, Looker, SQL (no flex).
2. Proven track record of delivering data integration and data warehousing solutions.
3. Strong SQL and hands-on proficiency in the BigQuery SQL language; experience in shell scripting and Python (no flex).
4. Experience with data integration and migration projects; Oracle SQL.
Technical Experience: Google BigQuery
1. Expert in Python (no flex). Strong hands-on knowledge of SQL (no flex). Python programming using Pandas and NumPy; deep understanding of data structures (dictionary, array, list, tree, etc.); experience with pytest and code coverage.
2. Experience building solutions using cloud-native services: bucket storage, BigQuery, Cloud Functions, Pub/Sub, Composer, and Kubernetes (no flex).
3. Proficiency with tools to automate Azure DevOps CI/CD pipelines, such as Control-M, GitHub, JIRA, and Confluence.
Professional Attributes:
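The streaming stack this posting lists (Pub/Sub into Dataflow, results landing in BigQuery) typically reduces to windowed aggregation over timestamped events. A toy sketch of a tumbling-window count in plain Python (the function name and event format are invented for illustration; real pipelines express this with Beam/Dataflow windowing primitives):

```python
def tumbling_window_counts(events, window_seconds=60):
    """Count events per fixed (tumbling) time window, keyed by window start.

    Each event is a timestamp in seconds; integer division assigns it to the
    window [window_start, window_start + window_seconds).
    """
    counts = {}
    for timestamp in events:
        window_start = (timestamp // window_seconds) * window_seconds
        counts[window_start] = counts.get(window_start, 0) + 1
    return counts

events = [5, 59, 60, 61, 125]  # event times in seconds since epoch
assert tumbling_window_counts(events) == {0: 2, 60: 2, 120: 1}
```

The resulting per-window rows are what a streaming job would insert into a BigQuery table for Looker dashboards; production systems add watermarking to handle late-arriving events, which this sketch omits.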
Posted 2 months ago