4.0 - 8.0 years
0 Lacs
Karnataka
On-site
We are seeking an experienced Senior QA Specialist to join a dynamic team for a critical AWS-to-GCP migration project. Your primary responsibility will be rigorous testing of data pipelines and data integrity in GCP to ensure seamless reporting and analytics capabilities.

Key responsibilities include designing and executing test plans to validate data pipelines re-engineered from AWS to GCP, ensuring data integrity and accuracy. You will work closely with data engineering teams to understand AVRO, ORC, and Parquet file structures in AWS S3, and analyze the data in external tables created in Athena for reporting. It is essential to ensure that the schema and data in BigQuery match Athena to support reporting in Power BI. You will also test and validate Spark pipelines and other big data workflows in GCP, document all test results, collaborate with development teams to resolve discrepancies, and support business users during UAT.

To excel in this role, you should have proven QA testing experience within a big data DW/BI ecosystem; strong familiarity with cloud platforms such as AWS, GCP, or Azure, with hands-on experience in at least one; deep knowledge of data warehousing solutions such as BigQuery, Redshift, Synapse, or Snowflake; and expertise in testing data pipelines and understanding file formats like Avro and Parquet. Experience with reporting tools such as Power BI is preferred. Excellent problem-solving skills, the ability to work independently, and strong communication and cross-team collaboration skills will be valuable.
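Schema parity checks of the kind described above (BigQuery vs. Athena) can be automated. Below is a minimal, self-contained Python sketch using hard-coded schema dicts in place of the real Athena/Glue and BigQuery metadata APIs; the type mapping, table columns, and sample values are illustrative assumptions, not the project's actual tables.

```python
# Hypothetical sketch: compare column names and (normalized) types between an
# Athena external table and its migrated BigQuery counterpart. In practice the
# schemas would come from the Glue/Athena catalog and the BigQuery API; here
# they are hard-coded dicts for illustration.

# Rough mapping from Athena/Hive types to BigQuery types (illustrative only).
TYPE_MAP = {"string": "STRING", "bigint": "INT64", "double": "FLOAT64",
            "boolean": "BOOL", "timestamp": "TIMESTAMP"}

def schema_mismatches(athena_schema, bq_schema):
    """Return a list of human-readable mismatch descriptions."""
    issues = []
    for col, a_type in athena_schema.items():
        if col not in bq_schema:
            issues.append(f"missing in BigQuery: {col}")
        elif TYPE_MAP.get(a_type) != bq_schema[col]:
            issues.append(f"type mismatch for {col}: {a_type} vs {bq_schema[col]}")
    for col in bq_schema:
        if col not in athena_schema:
            issues.append(f"extra in BigQuery: {col}")
    return issues

athena = {"order_id": "bigint", "amount": "double", "region": "string"}
bigquery = {"order_id": "INT64", "amount": "STRING", "region": "STRING"}
print(schema_mismatches(athena, bigquery))  # flags the amount type mismatch
```

In a real migration test the same comparison would run over schemas pulled via the cloud SDKs, supplemented by row-count and checksum comparisons for the data itself.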
Posted 3 weeks ago
10.0 - 14.0 years
0 Lacs
Karnataka
On-site
As a Product Owner for the GCP Data Migration Project at Clairvoyant, you will play a crucial role in leading the initiative and ensuring successful delivery of data migration solutions on Google Cloud Platform. With your deep understanding of cloud platforms, data migration processes, and Agile methodologies, you will collaborate with cross-functional teams to define the product vision, gather requirements, and prioritize backlogs to align with business objectives and user needs. Your key responsibilities will include defining and communicating the product vision and strategy, leading requirement gathering sessions with stakeholders, collaborating with business leaders and technical teams to gather and prioritize requirements, creating user stories and acceptance criteria, participating in sprint planning, establishing key performance indicators, identifying and mitigating risks, and fostering a culture of continuous improvement through feedback collection and iteration on product features and processes. To be successful in this role, you should have 10-12 years of experience in product management or product ownership, particularly in data migration or cloud projects. You must possess a strong understanding of Google Cloud Platform (GCP) services such as BigQuery, Cloud Storage, and Data Transfer Services, as well as experience with data migration strategies and tools including ETL processes and data integration methodologies. Proficiency in Agile methodologies, excellent analytical and problem-solving skills, strong communication skills, and a Bachelor's degree in Computer Science, Information Technology, Business, or a related field are essential qualifications. 
Additionally, experience with data governance and compliance in cloud environments, familiarity with project management and collaboration tools like JIRA and Confluence, understanding of data architecture and database management, and Google Cloud certifications such as Professional Cloud Architect and Professional Data Engineer are considered good-to-have qualifications. At Clairvoyant, we provide opportunities for engineers to develop and grow, work with a team of hardworking and dedicated peers, and offer growth and mentorship opportunities. We value diversity and encourage individuals with varying skills and qualities to apply, as we believe there might be a suitable role for you in the future. Join us in driving innovation and growth in the technology consulting and services industry!
Posted 3 weeks ago
12.0 - 17.0 years
27 - 35 Lacs
Madurai, Chennai
Work from Office
Dear Candidate,

Greetings of the day! I am Kantha, and I'm reaching out to you regarding an exciting opportunity with TechMango. You can connect with me on LinkedIn (https://www.linkedin.com/in/kantha-m-ashwin-186ba3244/) or by email: kanthasanmugam.m@techmango.net

Techmango Technology Services is a full-scale software development services company founded in 2014 with a strong focus on emerging technologies. Its primary objective is to deliver strategic technology solutions that serve its business partners' goals. We are a leading full-scale Software and Mobile App Development Company, driven by the mantra "Client's Vision is our Mission," and we hold firmly to that statement. Our aim is to be the technologically advanced and most loved organization, providing high-quality and cost-efficient services built on long-term client relationships. We operate in the USA (Chicago, Atlanta), Dubai (UAE), and India (Bangalore, Chennai, Madurai, Trichy).

Job Title: GCP Data Architect
Location: Madurai/Chennai
Experience: 12+ Years
Notice Period: Immediate

About TechMango
TechMango is a rapidly growing IT Services and SaaS Product company that helps global businesses with digital transformation, modern data platforms, product engineering, and cloud-first initiatives. We are seeking a GCP Data Architect to lead data modernization efforts for our prestigious client, Livingston, in a highly strategic project.

Role Summary
As a GCP Data Architect, you will be responsible for designing and implementing scalable, high-performance data solutions on Google Cloud Platform. You will work closely with stakeholders to define data architecture, implement data pipelines, modernize legacy data systems, and guide data strategy aligned with enterprise goals.
Key Responsibilities:
- Lead end-to-end design and implementation of scalable data architecture on Google Cloud Platform (GCP)
- Define data strategy, standards, and best practices for cloud data engineering and analytics
- Develop data ingestion pipelines using Dataflow, Pub/Sub, Apache Beam, Cloud Composer (Airflow), and BigQuery
- Migrate on-prem or legacy systems to GCP (e.g., from Hadoop, Teradata, or Oracle to BigQuery)
- Architect data lakes, warehouses, and real-time data platforms
- Ensure data governance, security, lineage, and compliance (using tools like Data Catalog, IAM, DLP)
- Guide a team of data engineers and collaborate with business stakeholders, data scientists, and product managers
- Create documentation, high-level design (HLD) and low-level design (LLD), and oversee development standards
- Provide technical leadership in architectural decisions and future-proofing the data ecosystem

Required Skills & Qualifications:
- 10+ years of experience in data architecture, data engineering, or enterprise data platforms
- Minimum 3-5 years of hands-on experience with GCP data services
- Proficient in: BigQuery, Cloud Storage, Dataflow, Pub/Sub, Composer, Cloud SQL/Spanner; Python/Java/SQL; data modeling (OLTP, OLAP, star/snowflake schema)
- Experience with real-time data processing, streaming architectures, and batch ETL pipelines
- Good understanding of IAM, networking, security models, and cost optimization on GCP
- Prior experience leading cloud data transformation projects
- Excellent communication and stakeholder management skills

Preferred Qualifications:
- GCP Professional Data Engineer / Architect Certification
- Experience with Terraform, CI/CD, GitOps, and Looker / Data Studio / Tableau for analytics
- Exposure to AI/ML use cases and MLOps on GCP
- Experience working in agile environments and client-facing roles

What We Offer:
- Opportunity to work on large-scale data modernization projects with global clients
- A fast-growing company with a strong tech and people culture
- Competitive salary, benefits, and flexibility
- Collaborative environment that values innovation and leadership
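As a rough illustration of the batch pipelines the architect role describes, here is a pure-Python sketch of the extract/transform/load stages expressed as plain functions. On GCP these stages would typically map to Cloud Storage reads, Dataflow/Beam transforms, and a BigQuery load job; all names, the CSV-ish format, and the dead-letter note are invented for the example.

```python
# Illustrative sketch only: the extract/transform/load stages of a batch
# pipeline as plain functions. All names here are hypothetical.

def extract(raw_lines):
    """Parse CSV-ish lines into dicts (stand-in for a Cloud Storage read)."""
    return [dict(zip(("user", "amount"), line.split(","))) for line in raw_lines]

def transform(rows):
    """Cast types and drop bad rows (stand-in for a Dataflow transform)."""
    out = []
    for r in rows:
        try:
            out.append({"user": r["user"], "amount": float(r["amount"])})
        except (KeyError, ValueError):
            pass  # a real pipeline would route these to a dead-letter sink
    return out

def load(rows, sink):
    """Append rows to a destination table (stand-in for a BigQuery load)."""
    sink.extend(rows)
    return len(rows)

warehouse = []
loaded = load(transform(extract(["alice,10.5", "bob,oops", "cara,3"])), warehouse)
print(loaded, warehouse)
```

The same three-stage shape is what a Beam pipeline expresses with PTransforms; keeping the stages as separate, testable functions is the design point, not the toy data.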
Posted 3 weeks ago
5.0 - 8.0 years
3 - 7 Lacs
Hyderabad
Work from Office
Role Purpose
The purpose of this role is to interpret data and turn it into information (reports, dashboards, interactive visualizations, etc.) that can offer ways to improve a business, thus affecting business decisions.

Do
1. Manage the technical scope of the project in line with the requirements at all stages:
a. Gather information from various sources (data warehouses, databases, data integration and modelling) and interpret patterns and trends
b. Develop record management processes and policies
c. Build and maintain relationships at all levels within the client base and understand their requirements
d. Provide sales data, proposals, data insights and account reviews to the client base
e. Identify areas to increase efficiency and automation of processes
f. Set up and maintain automated data processes
g. Identify, evaluate and implement external services and tools to support data validation and cleansing
h. Produce and track key performance indicators
2. Analyze data sets and provide adequate information:
a. Liaise with internal and external clients to fully understand data content
b. Design and carry out surveys and analyze survey data as per customer requirements
c. Analyze and interpret complex data sets relating to the customer's business and prepare reports for internal and external audiences using business analytics reporting tools
d. Create data dashboards, graphs and visualizations to showcase business performance and provide sector and competitor benchmarking
e. Mine and analyze large datasets, draw valid inferences and present them to management using a reporting tool
f. Develop predictive models and share insights with clients as per their requirements

Mandatory Skills: Google BigQuery. Experience: 5-8 Years.
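The KPI-tracking responsibility above might look like the following hedged Python sketch. The order fields and metric definitions are invented for illustration; in practice the input would come from a warehouse query (e.g. BigQuery) feeding a dashboard.

```python
# Hedged sketch: computing a couple of example KPIs from raw order events.
# Field names and metrics are assumptions, not a real client's definitions.

def kpis(orders):
    """Return conversion rate and average order value from raw order events."""
    completed = [o for o in orders if o["status"] == "completed"]
    total = len(orders)
    return {
        "conversion_rate": round(len(completed) / total, 3) if total else 0.0,
        "avg_order_value": round(sum(o["value"] for o in completed) / len(completed), 2)
        if completed else 0.0,
    }

sample = [
    {"status": "completed", "value": 120.0},
    {"status": "abandoned", "value": 0.0},
    {"status": "completed", "value": 80.0},
]
print(kpis(sample))  # {'conversion_rate': 0.667, 'avg_order_value': 100.0}
```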
Posted 3 weeks ago
1.0 - 4.0 years
2 - 6 Lacs
Gurugram
Work from Office
Desired technical and interpersonal skills include, but are not limited to:
1. BE with hands-on experience in Cisco technologies
2. CCNA and/or CCNP Routing & Switching certifications (preferred)
3. Strong communication skills
4. Very good understanding of Cisco Architectures (EN/Sec/SP) and Solutions
5. Desire and ability to learn new technology and solutions

Specialized experience requirements:
- 6+ years of experience in any one EN/Sec/SP Architecture
- Understanding and hands-on experience (preferred) in the detailed sub-technologies of that Architecture
- Ability to understand and capture technical as well as business requirements
- Self-starter with excellent presentation and consultative skills
- Strong analytical and communication skills, both written and verbal
Posted 3 weeks ago
4.0 - 5.0 years
10 - 12 Lacs
Bengaluru
Work from Office
Position Overview
We are seeking an experienced Full Stack Developer to join our dynamic development team. The ideal candidate will have strong expertise in modern web technologies, cloud platforms, and data analytics tools to build scalable, high-performance applications.

Key Responsibilities

Frontend Development
- Develop responsive and interactive user interfaces using React.js and modern JavaScript (ES6+)
- Implement state management solutions using Redux, Context API, or similar frameworks
- Ensure cross-browser compatibility and optimize applications for performance
- Collaborate with UX/UI designers to translate mockups into functional components
- Write clean, maintainable, and well-documented frontend code

Backend Development
- Design and develop RESTful APIs and microservices using Node.js and Express.js
- Implement authentication and authorization mechanisms (JWT, OAuth)
- Build real-time applications using WebSocket or Socket.io
- Optimize server-side performance and handle database operations efficiently
- Develop and maintain middleware for logging, error handling, and security

Database Management
- Design and optimize SQL database schemas (PostgreSQL, MySQL, or SQL Server)
- Write complex queries, stored procedures, and database functions
- Implement data migration scripts and maintain database integrity
- Monitor database performance and implement optimization strategies

Big Data & Analytics
- Develop data pipelines and ETL processes using Google BigQuery
- Create and optimize complex SQL queries for large datasets
- Implement data visualization solutions and reporting dashboards
- Work with streaming data and batch processing workflows

Cloud Infrastructure
- Deploy and manage applications on Azure or Google Cloud Platform (GCP)
- Implement CI/CD pipelines using cloud-native tools
- Configure and manage cloud databases, storage solutions, and networking
- Monitor application performance and implement auto-scaling solutions
- Ensure security best practices and compliance requirements

Required Qualifications

Technical Skills
- 4-5 years of professional experience in full stack development
- Proficiency in Node.js, Express.js, and JavaScript/TypeScript
- Strong experience with React.js, HTML5, CSS3, and modern frontend frameworks
- Solid understanding of SQL databases and query optimization
- Hands-on experience with Google BigQuery for data analytics
- Experience with cloud platforms (Azure or GCP) including deployment and management
- Knowledge of version control systems (Git) and collaborative development workflows

Additional Requirements
- Experience with containerization technologies (Docker, Kubernetes)
- Understanding of microservices architecture and API design principles
- Familiarity with testing frameworks (Jest, Mocha, Cypress)
- Knowledge of security best practices and data protection regulations
- Experience with monitoring and logging tools (Application Insights, Stackdriver)
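Since the posting calls out JWT-based authentication, here is an illustrative standard-library Python sketch of how HS256 signing and verification work under the hood. This is for understanding only: a Node.js service as described above would use a maintained library (e.g. jsonwebtoken, or PyJWT in Python) rather than hand-rolled crypto, and the payload and secret here are invented.

```python
# Illustrative sketch of the JWT HS256 mechanism: base64url(header).base64url(
# payload).base64url(HMAC-SHA256 signature). Not production code.
import base64, hashlib, hmac, json

def b64url(data: bytes) -> str:
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

def sign_jwt(payload: dict, secret: str) -> str:
    header = b64url(json.dumps({"alg": "HS256", "typ": "JWT"}).encode())
    body = b64url(json.dumps(payload).encode())
    sig = hmac.new(secret.encode(), f"{header}.{body}".encode(), hashlib.sha256)
    return f"{header}.{body}.{b64url(sig.digest())}"

def verify_jwt(token: str, secret: str) -> bool:
    header, body, sig = token.split(".")
    expected = hmac.new(secret.encode(), f"{header}.{body}".encode(), hashlib.sha256)
    # Constant-time comparison to avoid timing side channels.
    return hmac.compare_digest(sig, b64url(expected.digest()))

token = sign_jwt({"sub": "user-42"}, "demo-secret")
print(verify_jwt(token, "demo-secret"), verify_jwt(token, "wrong"))  # True False
```

A real implementation would also validate the header's `alg` field and registered claims such as `exp` before trusting the payload.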
Posted 3 weeks ago
6.0 - 10.0 years
14 - 19 Lacs
Pune
Work from Office
Project description
The Mapping Developer will be responsible for implementing business requests in the financial messaging domain, with a strong focus on ISO 20022, SWIFT FIN, and other market formats. The role involves data conversion using a low-code workbench, collaborating with stakeholders, and ensuring high-quality software solutions through best engineering practices. The position requires close coordination with other teams to support messaging data flows across the bank.

Responsibilities
- Take ownership of business requirements from analysis through to implementation
- Implement data conversion logic using a low-code workbench applied across multiple applications and services
- Collaborate with stakeholders to gather requirements and deliver precise, well-tested solutions
- Apply modern software engineering principles including Git version control, unit testing, and CI/CD deployments
- Define and execute unit tests to maintain high code quality
- Analyze and resolve issues in test and production environments on a priority basis
- Coordinate with other teams to enable smooth and accurate messaging data flows within the bank

Skills

Must have
- 6-10 years of experience in IT as a Technical Analyst, Data Modeler, or similar role
- Hands-on experience with Core Java development and software engineering practices
- Proficiency in analyzing and modeling complex data structures and requirements
- Understanding of basic programming concepts: variables, conditions, operators, loops, etc.
- Familiarity with XML data and XSD schemas
- Knowledge of Git and CI/CD tools for code versioning and deployment
- Strong attention to detail and ability to deliver high-quality, testable code

Nice to have
- Experience working with financial messaging standards such as SWIFT FIN and ISO 20022
- Exposure to low-code platforms used for data conversion
- Ability to work in cross-functional teams and contribute to collaborative environments
- Strong problem-solving skills and ability to perform ad hoc issue analysis
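To give a flavor of the mapping work described above, the sketch below extracts a few fields from an ISO 20022-style payment fragment using Python's standard XML parser. The element layout loosely echoes pacs.008 naming but is invented and heavily simplified for illustration; real mappings would be built in the low-code workbench against the full XSD schemas.

```python
# Minimal, hypothetical message-mapping sketch: pull a few fields out of an
# ISO 20022-style XML fragment and emit a flat record.
import xml.etree.ElementTree as ET

SAMPLE = """<Doc><CdtTrfTxInf>
  <Amt Ccy="EUR">150.00</Amt>
  <Cdtr><Nm>ACME GmbH</Nm></Cdtr>
</CdtTrfTxInf></Doc>"""

def map_payment(xml_text):
    """Map a (simplified) credit-transfer fragment to a flat dict."""
    root = ET.fromstring(xml_text)
    amt = root.find("./CdtTrfTxInf/Amt")
    name = root.find("./CdtTrfTxInf/Cdtr/Nm")
    return {"currency": amt.get("Ccy"), "amount": float(amt.text),
            "creditor": name.text}

print(map_payment(SAMPLE))
```

In real ISO 20022 processing the documents are namespaced and validated against XSDs before mapping; this sketch skips both to keep the core idea visible.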
Posted 3 weeks ago
5.0 - 10.0 years
6 - 16 Lacs
Kolkata, Chennai, Bengaluru
Work from Office
Responsibilities
A day in the life of an Infoscion:
• As part of the Infosys consulting team, your primary role would be to get to the heart of customer issues, diagnose problem areas, design innovative solutions, and facilitate deployment resulting in client delight.
• You will develop a proposal by owning parts of the proposal document and by giving inputs in solution design based on areas of expertise. You will plan the activities of configuration, configure the product as per the design, conduct conference room pilots, and assist in resolving any queries related to requirements and solution design.
• You will conduct solution/product demonstrations and POC/Proof of Technology workshops, and prepare effort estimates that suit the customer's budgetary requirements and are in line with the organization's financial guidelines.
• Actively lead small projects and contribute to unit-level and organizational initiatives with the objective of providing high-quality, value-adding solutions to customers.
If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!

Additional Responsibilities:
• Ability to develop value-creating strategies and models that enable clients to innovate, drive growth, and increase their business profitability
• Good knowledge of software configuration management systems
• Awareness of latest technologies and industry trends
• Logical thinking and problem-solving skills, along with an ability to collaborate
• Understanding of the financial processes for various types of projects and the various pricing models available
• Ability to assess current processes, identify improvement areas, and suggest technology solutions
• Knowledge of one or two industry domains
• Client interfacing skills
• Project and team management

Technical and Professional Requirements: Technology->Cloud Platform->GCP Data Analytics->Looker, Technology->Cloud Platform->GCP Database->Google BigQuery
Preferred Skills: Technology->Cloud Platform->Google Big Data, Technology->Cloud Platform->GCP Data Analytics
Posted 3 weeks ago
3.0 - 8.0 years
5 - 9 Lacs
Kolkata, Mumbai, New Delhi
Work from Office
We're looking for a Senior Data Analyst to join our data-driven team at an ad-tech company that thrives on turning complexity into clarity. Our analysts play a critical role in transforming raw, noisy data into accurate, actionable signals that drive real-time decision-making and long-term strategy. You'll work closely with product, engineering, and business teams to uncover insights, shape KPIs, and guide performance optimization.

Responsibilities:
- Analyze large-scale datasets from multiple sources to uncover actionable insights and drive business impact
- Design, monitor, and maintain key performance indicators (KPIs) across ad delivery, bidding, and monetization systems
- Partner with product, engineering, and operations teams to define metrics, run deep-dive analyses, and influence strategic decisions
- Develop and maintain dashboards, automated reports, and data pipelines to ensure data accessibility and accuracy
- Lead investigative analysis of anomalies or unexpected trends in campaign performance, traffic quality, or platform behavior

Requirements
- BA/BSc in Industrial Engineering and Management, Information Systems Engineering, Economics, Statistics, Mathematics, or a similar background
- 3+ years of experience in data analysis and interpretation (marketing/business/product)
- High proficiency in SQL
- Experience with data visualization of large data sets using BI systems (Qlik Sense, Sisense, Tableau, Looker, etc.)
- Experience working with data warehouse/data lake tools like Athena, Redshift, Snowflake, or BigQuery
- Knowledge of Python - an advantage
- Experience building ETL processes - an advantage
- Fluent in English, both written and spoken - must
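The anomaly-investigation responsibility above can be illustrated with a simple z-score scan over a daily metric. This is a deliberately naive sketch: real ad-tech monitoring would account for seasonality and use more robust estimators, and the threshold and data here are arbitrary.

```python
# Hedged sketch: flag days whose metric deviates sharply from the mean.
# Threshold and sample data are invented for illustration.
from statistics import mean, stdev

def anomalous_days(daily_values, threshold=2.0):
    """Return indices of days whose value deviates > threshold std devs."""
    mu, sigma = mean(daily_values), stdev(daily_values)
    return [i for i, v in enumerate(daily_values)
            if sigma and abs(v - mu) / sigma > threshold]

clicks = [100, 98, 103, 101, 97, 250, 99]  # day 5 is a suspicious spike
print(anomalous_days(clicks))
```

In practice the same check would run per-campaign or per-traffic-source, and a flagged day would kick off the deep-dive analysis the posting describes rather than an automatic action.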
Posted 3 weeks ago
5.0 - 8.0 years
9 - 14 Lacs
Hyderabad
Work from Office
Role Purpose
The purpose of the role is to support process delivery by ensuring daily performance of the Production Specialists, resolving technical escalations, and developing technical capability within the Production Specialists.

Do
- Oversee and support the process by reviewing daily transactions on performance parameters
- Review the performance dashboard and the scores for the team
- Support the team in improving performance parameters by providing technical support and process guidance
- Record, track, and document all queries received, problem-solving steps taken, and total successful and unsuccessful resolutions
- Ensure standard processes and procedures are followed to resolve all client queries
- Resolve client queries as per the SLAs defined in the contract
- Develop understanding of the process/product for the team members to facilitate better client interaction and troubleshooting
- Document and analyze call logs to spot the most frequent trends and prevent future problems
- Identify red flags and escalate serious client issues to the Team Leader in cases of untimely resolution
- Ensure all product information and disclosures are given to clients before and after the call/email requests
- Avoid legal challenges by monitoring compliance with service agreements
- Handle technical escalations through effective diagnosis and troubleshooting of client queries
- Manage and resolve technical roadblocks/escalations as per SLA and quality requirements; if unable to resolve, escalate the issues to TA & SES in a timely manner
- Provide product support and resolution to clients by performing question diagnosis while guiding users through step-by-step solutions
- Troubleshoot all client queries in a user-friendly, courteous and professional manner
- Offer alternative solutions to clients (where appropriate) with the objective of retaining the customer's and client's business
- Organize ideas and effectively communicate oral messages appropriate to listeners and situations
- Follow up and make scheduled call-backs to customers to record feedback and ensure compliance with contract SLAs
- Build people capability to ensure operational excellence and maintain superior customer service levels for the existing account/client
- Mentor and guide Production Specialists on improving technical knowledge
- Collate trainings to be conducted as triage to bridge the skill gaps identified through interviews with the Production Specialists
- Develop and conduct trainings (triages) within products for Production Specialists as per target
- Inform the client about the triages being conducted
- Undertake product trainings to stay current with product features, changes and updates
- Enroll in product-specific and any other trainings per client requirements/recommendations
- Identify and document the most common problems and recommend appropriate resolutions to the team
- Update job knowledge by participating in self-learning opportunities and maintaining personal networks

Mandatory Skills: Google BigQuery. Experience: 5-8 Years.
Posted 3 weeks ago
3.0 - 7.0 years
6 - 16 Lacs
Chennai
Hybrid
Greetings from Getronics! We have permanent opportunities for GCP Data Engineers in Chennai. Hope you are doing well! This is Jogeshwari from the Getronics Talent Acquisition team. Please find below the company profile and job description. If interested, please share your updated resume, a recent professional photograph, and Aadhaar proof at the earliest to jogeshwari.k@getronics.com.

Company: Getronics (permanent role)
Client: Automobile industry
Experience Required: 3+ years in IT and a minimum of 2+ years in GCP Data Engineering
Location: Chennai

Skills Required:
- GCP Data Engineer, Hadoop, Spark/PySpark, and Google Cloud Platform (GCP) services: BigQuery, Dataflow, Pub/Sub, Bigtable, Data Fusion, Dataproc, Cloud Composer, Cloud SQL, Compute Engine, Cloud Functions, and App Engine
- 6+ years of professional experience in data engineering, data product development, and software product launches
- 4+ years of cloud data engineering experience building scalable, reliable, and cost-effective production batch and streaming data pipelines using: data warehouses like Google BigQuery; workflow orchestration tools like Airflow; relational database management systems like MySQL, PostgreSQL, and SQL Server; and real-time data streaming platforms like Apache Kafka and GCP Pub/Sub

LOOKING FOR IMMEDIATE TO 30 DAYS NOTICE CANDIDATES ONLY.

Regards,
Jogeshwari
Senior Specialist
Posted 3 weeks ago
3.0 - 5.0 years
5 - 9 Lacs
Hyderabad
Work from Office
The purpose of this role is to interpret data and turn it into information (reports, dashboards, interactive visualizations, etc.) that can offer ways to improve a business, thus affecting business decisions.

Do
1. Manage the technical scope of the project in line with the requirements at all stages:
a. Gather information from various sources (data warehouses, databases, data integration and modelling) and interpret patterns and trends
b. Develop record management processes and policies
c. Build and maintain relationships at all levels within the client base and understand their requirements
d. Provide sales data, proposals, data insights and account reviews to the client base
e. Identify areas to increase efficiency and automation of processes
f. Set up and maintain automated data processes
g. Identify, evaluate and implement external services and tools to support data validation and cleansing
h. Produce and track key performance indicators
2. Analyze data sets and provide adequate information:
a. Liaise with internal and external clients to fully understand data content
b. Design and carry out surveys and analyze survey data as per customer requirements
c. Analyze and interpret complex data sets relating to the customer's business and prepare reports for internal and external audiences using business analytics reporting tools
d. Create data dashboards, graphs and visualizations to showcase business performance and provide sector and competitor benchmarking
e. Mine and analyze large datasets, draw valid inferences and present them to management using a reporting tool
f. Develop predictive models and share insights with clients as per their requirements

Mandatory Skills: Google BigQuery. Experience: 3-5 Years.
Posted 3 weeks ago
3.0 - 5.0 years
13 - 17 Lacs
Gurugram
Work from Office
Senior Analyst - GCP Data Engineer: Elevate Your Impact Through Innovation and Learning

Evalueserve is a global leader in delivering innovative and sustainable solutions to a diverse range of clients, including over 30% of Fortune 500 companies. With a presence in more than 45 countries across five continents, we excel in leveraging state-of-the-art technology, artificial intelligence, and unparalleled subject matter expertise to elevate our clients' business impact and strategic decision-making. Our team of over 4,500 talented professionals operates in countries such as India, China, Chile, Romania, the US, and Canada. Our global network also extends to emerging markets like Colombia, the Middle East, and the rest of Asia-Pacific. Recognized by Great Place to Work in India, Chile, Romania, the US, and the UK in 2022, we offer a dynamic, growth-oriented, meritocracy-based culture that prioritizes continuous learning, skill development, and work-life balance.

About Data Analytics (DA)
Data Analytics is one of the highest-growth practices within Evalueserve, providing rewarding career opportunities. Established in 2014, the global DA team extends beyond 1,000 (and growing) data science professionals across data engineering, business intelligence, digital marketing, advanced analytics, technology, and product engineering. Our more tenured teammates, some of whom have been with Evalueserve since it started more than 20 years ago, have enjoyed leadership opportunities in different regions of the world across our seven business lines.

What you will be doing at Evalueserve
- Data Pipeline Development: Design and implement scalable ETL (Extract, Transform, Load) pipelines using tools like Cloud Dataflow, Apache Beam or Spark, and BigQuery
- Data Integration: Integrate various data sources into unified data warehouses or lakes, ensuring seamless data flow
- Data Transformation: Transform raw data into analyzable formats using tools like dbt (data build tool) and Dataflow
- Performance Optimization: Continuously monitor and optimize data pipelines for speed, scalability, and cost-efficiency
- Data Governance: Implement data quality standards, validation checks, and anomaly detection mechanisms
- Collaboration: Work closely with data scientists, analysts, and business stakeholders to align data solutions with organizational goals
- Documentation: Maintain detailed documentation of workflows and adhere to coding standards

What we're looking for
- Proficiency in Python/PySpark and SQL for data processing and querying
- Expertise in GCP services like BigQuery, Cloud Storage, Pub/Sub, Cloud Composer, and Dataflow
- Familiarity with data warehouse and lakehouse principles and distributed data architectures
- Strong problem-solving skills and the ability to handle complex projects under tight deadlines
- Knowledge of data security and compliance best practices
- Certification: GCP Professional Data Engineer

Follow us on https://www.linkedin.com/company/evalueserve/

Learn more about what our leaders are talking about: our AI-powered supply chain optimization solution built on Google Cloud; how Evalueserve is now leveraging NVIDIA NIM to enhance our AI and digital transformation solutions and accelerate AI capabilities; and how Evalueserve has climbed 16 places on the 50 Best Firms for Data Scientists in 2024! Want to learn more about our culture and what it's like to work with us? Write to us at: careers@evalueserve.com

Disclaimer: The following job description serves as an informative reference for the tasks you may be required to perform. However, it does not constitute an integral component of your employment agreement and is subject to periodic modifications to align with evolving circumstances.
Please Note: We appreciate the accuracy and authenticity of the information you provide, as it plays a key role in your candidacy. As part of the Background Verification Process, we verify your employment, education, and personal details. Please ensure all information is factual and submitted on time. For any assistance, your TA SPOC is available to support you.
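As a concrete gloss on the validation checks mentioned in the data-governance responsibilities above, here is a hedged Python sketch of declarative per-column rules applied to incoming rows, with failures collected rather than raised. The rule set, field names, and sample rows are all invented for illustration.

```python
# Hypothetical data-quality sketch: per-column validation rules, with failing
# rows collected for triage instead of silently dropped.

RULES = {
    "user_id": lambda v: isinstance(v, str) and v != "",
    "amount": lambda v: isinstance(v, (int, float)) and v >= 0,
    "country": lambda v: v in {"IN", "US", "DE"},
}

def validate(rows):
    """Return (clean_rows, failures); failures pair row index with bad columns."""
    clean, failures = [], []
    for i, row in enumerate(rows):
        bad = [col for col, rule in RULES.items() if not rule(row.get(col))]
        if bad:
            failures.append((i, bad))
        else:
            clean.append(row)
    return clean, failures

rows = [{"user_id": "u1", "amount": 9.5, "country": "IN"},
        {"user_id": "", "amount": -3, "country": "FR"}]
clean, failures = validate(rows)
print(len(clean), failures)  # 1 [(1, ['user_id', 'amount', 'country'])]
```

The same pattern scales up to frameworks like dbt tests or Great Expectations, where the rules live alongside the pipeline definitions rather than in ad hoc code.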
Posted 3 weeks ago
5.0 - 9.0 years
9 - 18 Lacs
Bengaluru
Hybrid
Job Description
- 5+ years of IT experience
- Good understanding of analytics tools for effective analysis of data
- Should be able to lead teams
- Should have been part of a production deployment team and production support team
- Experience with big data tools: Hadoop, Spark, Apache Beam, Kafka, etc.
- Experience with object-oriented/functional scripting languages: Python, Java, C++, Scala, etc.
- Experience with DW tools like BigQuery, Redshift, Synapse, or Snowflake
- Experience in ETL and data warehousing
- Experience and firm understanding of relational and non-relational databases like MySQL, MS SQL Server, Postgres, MongoDB, Cassandra, etc.
- Experience with cloud platforms like AWS, GCP, and Azure
- Experience with workflow management using tools like Apache Airflow

Roles & Responsibilities
- Develop high-performance and scalable solutions using GCP that extract, transform, and load big data
- Design and build production-grade data solutions from ingestion to consumption using Java/Python
- Design and optimize data models on GCP using data stores such as BigQuery
- Handle the deployment process
- Optimize data pipelines for performance and cost for large-scale data lakes
- Write complex, highly optimized queries across large data sets and create data processing layers
- Closely interact with data engineers to identify the right tools to deliver product features by performing POCs
- Be a collaborative team player who interacts with business, BAs, and other data/ML engineers
- Research new use cases for existing data

Preferred:
- Awareness of design best practices for OLTP and OLAP systems
- Experience as part of a team designing the DB and pipeline
- Exposure to load testing methodologies, debugging pipelines, and delta load handling
- Experience on heterogeneous migration projects
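The delta-load handling mentioned in the preferred skills can be sketched in pure Python as a keyed upsert: update changed rows, insert new ones. In BigQuery the equivalent would be a MERGE statement; the version below shows only the logic, with invented row shapes.

```python
# Illustrative delta-load sketch: merge an incremental extract into a target
# table keyed by id. Row shapes are hypothetical.

def apply_delta(target, delta, key="id"):
    """Upsert delta rows into target in place; return (inserted, updated)."""
    index = {row[key]: row for row in target}
    inserted = updated = 0
    for row in delta:
        if row[key] in index:
            if index[row[key]] != row:
                index[row[key]].update(row)  # mutate the existing target row
                updated += 1
        else:
            target.append(row)
            inserted += 1
    return inserted, updated

target = [{"id": 1, "v": "a"}, {"id": 2, "v": "b"}]
delta = [{"id": 2, "v": "b2"}, {"id": 3, "v": "c"}]
counts = apply_delta(target, delta)
print(counts, target)
```

Counting inserts and updates separately is useful in practice: reconciliation checks after each incremental load compare those counts against the source extract.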
Posted 3 weeks ago
2.0 - 5.0 years
6 - 16 Lacs
Pune
Work from Office
Join Samruddh Bharat as a Jr. PHP Laravel Developer to build scalable apps with Laravel, Node.js, Vue.js & GCP. Full-time, in-office role.
Posted 3 weeks ago
5.0 - 10.0 years
25 - 30 Lacs
Chennai
Work from Office
Job Summary: We are seeking a highly skilled Data Engineer to design, develop, and maintain robust data pipelines and architectures. The ideal candidate will transform raw, complex datasets into clean, structured, and scalable formats that enable analytics, reporting, and business intelligence across the organization. This role requires strong collaboration with data scientists, analysts, and cross-functional teams to ensure timely and accurate data availability and system performance.

Key Responsibilities
- Design and implement scalable data pipelines to support real-time and batch processing.
- Develop and maintain ETL/ELT processes that move, clean, and organize data from multiple sources.
- Build and manage modern data architectures that support efficient storage, processing, and access.
- Collaborate with stakeholders to understand data needs and deliver reliable solutions.
- Perform data transformation, enrichment, validation, and normalization for analysis and reporting.
- Monitor and ensure the quality, integrity, and consistency of data across systems.
- Optimize workflows for performance, scalability, and cost-efficiency.
- Support cloud and on-premise data integrations, migrations, and automation initiatives.
- Document data flows, schemas, and infrastructure for operational and development purposes.
- Apply best practices in data governance, security, and compliance.

Required Qualifications & Skills:
- Bachelor's or Master's degree in Computer Science, Data Engineering, or a related field.
- 6+ years of proven experience in data engineering, ETL development, or data pipeline management.
- Proficiency with tools and technologies such as: SQL, Python, Spark, Scala; ETL tools (e.g., Apache Airflow, Talend); cloud platforms (e.g., AWS, GCP, Azure); Big Data tools (e.g., Hadoop, Hive, Kafka); data warehouses (e.g., Snowflake, Redshift, BigQuery).
- Strong understanding of data modeling, data architecture, and data lakes.
- Experience with CI/CD, version control, and working in Agile environments.

Preferred Qualifications:
- Experience with data observability and monitoring tools.
- Knowledge of data cataloging and governance frameworks.
- AWS/GCP/Azure data certification is a plus.
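The "transformation, validation, and normalization" work this posting describes can be sketched in a few lines of stdlib Python. The record shape and the rules below are purely illustrative assumptions, not part of the posting.

```python
# Illustrative normalize-then-validate step of an ETL pipeline.
# Field names ("name", "email", "amount") and rules are hypothetical.
def normalize(record):
    """Trim strings, lowercase the email, coerce amount to float."""
    return {
        "name": record["name"].strip(),
        "email": record["email"].strip().lower(),
        "amount": float(record["amount"]),
    }

def validate(record):
    """Return a list of rule violations; an empty list means valid."""
    errors = []
    if not record["name"]:
        errors.append("name missing")
    if "@" not in record["email"]:
        errors.append("email invalid")
    if record["amount"] < 0:
        errors.append("amount negative")
    return errors

raw = {"name": "  Asha  ", "email": " ASHA@EXAMPLE.COM ", "amount": "42.50"}
clean = normalize(raw)
problems = validate(clean)  # [] for this record
```

Production pipelines usually push these checks into a framework (dbt tests, Great Expectations, or warehouse constraints), but the shape of the logic is the same.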
Posted 3 weeks ago
2.0 - 6.0 years
0 Lacs
hyderabad, telangana
On-site
The Specialist of Digital Experience Performance will play a crucial role in the Inspire Brands Digital Experience Performance (DXP) team by analyzing insights on digital user experience from various sources. Focusing on Inspire Brands ecommerce websites/apps, this position involves reporting findings, visualizing trends, and assessing feature performance using quantitative and qualitative tools. Working closely with the DXP team, you will contribute significantly to enhancing and expanding Inspire's digital solutions. Collaborating with your Manager and other team members, you will create insightful reports to aid in storytelling and optimization development, supporting the achievement of our long-term roadmap and priorities.

Key responsibilities include:
- Monitoring core performance metrics and adjusting reports to effectively communicate performance changes
- Developing and maintaining a suite of reports for the DXP team and stakeholders using data visualization tools and query writing
- Providing feature performance insights to brand-focused DXP team members to explain the impact on the ecommerce experience and business
- Building a comprehensive understanding of user experiences on Inspire Brands' digital platforms
- Assisting business stakeholders with ad hoc reporting to guide product enhancements and business objectives
- Cultivating strong relationships with key stakeholders and partners within the DXP team and across Inspire

Qualifications:
- Bachelor's degree and a minimum of 2 years of experience in analytics reporting and/or data visualization
- Proficiency in data visualization tools like Tableau, Power BI, etc.
- Ability to write queries in SQL, BigQuery, or similar tools to address business requirements
- Familiarity with Digital Analytics tools such as Google Analytics (knowledge of Adobe Analytics or Mixpanel is a plus)
- Strong project and program management skills to handle multiple projects concurrently with tight deadlines
- Excellent communication skills and conflict resolution abilities
- Capacity to thrive in a fast-paced, collaborative environment, managing complex assignments across various stakeholders
- Capability to work independently and as part of a team, demonstrating exceptional communication and collaboration skills.
Posted 3 weeks ago
7.0 - 12.0 years
18 - 30 Lacs
Bengaluru
Remote
Job Overview: We are seeking a highly skilled Java Engineer with hands-on experience in developing, understanding, and troubleshooting Java code. The ideal candidate will be a strong team player with the ability to work cross-functionally in a fast-paced, collaborative environment. Experience with modern Java frameworks, containerized environments, and AI-enhanced protocols is a plus.

Roles and Responsibilities:
- Read, analyze, and develop clean, efficient Java code.
- Work with the Spring Framework and implement solutions following the MVC model.
- Build and manage projects using Maven.
- Deploy and maintain applications on Tomcat Application Server.
- Develop applications using Java/JDK 17.
- Build and run containerized applications using Docker.
- Collaborate with multiple internal teams to address tickets, troubleshoot issues, and ensure smooth delivery of software solutions.
- Participate in code reviews, daily standups, and agile ceremonies.
- Troubleshoot production and application issues in a timely manner.

Must-Have Skills:
- Strong proficiency in Java (including reading and writing complex code).
- Experience with the Spring Framework and MVC architecture.
- Experience with BigQuery and GCP.
- Proficiency in Maven, Tomcat, and Java 17.
- Hands-on experience with Docker.
- Excellent problem-solving and troubleshooting skills.
- Ability to communicate effectively and collaborate with cross-functional teams.

Preferred/Bonus:
- Understanding of Model Context Protocol or experience with AI-enabled platforms.
- Familiarity with microservices architecture and REST APIs.

Soft Skills:
- Self-starter with strong ownership and accountability.
- Excellent communication and interpersonal skills.
- Strong attention to detail and a collaborative mindset.
Posted 3 weeks ago
5.0 - 10.0 years
10 - 12 Lacs
Navi Mumbai
Work from Office
Hello Candidates, We are Hiring!!
Job Position: Data Engineer
Experience: 5+ years
Location: Navi Mumbai (Juinagar)
Work Mode: WFO

Job Description
We are looking for an experienced and results-driven Senior Data Engineer to join our Data Engineering team. In this role, you will design, develop, and maintain robust data pipelines and infrastructure that enable efficient data flow across our systems. As a senior contributor, you will also help define best practices, mentor junior team members, and contribute to the long-term vision of our data platform. You will work closely with cross-functional teams to deliver reliable, scalable, and high-performance data systems that support critical business intelligence and analytics initiatives.

Responsibilities
- Design, build, and maintain scalable ETL/ELT pipelines to support analytics, data warehousing, and business operations.
- Collaborate with cross-functional teams to gather requirements and deliver high-quality data solutions.
- Develop and manage data models, data lakes, and data warehouse solutions in cloud environments (e.g., AWS, Azure, GCP).
- Monitor and optimize the performance of data pipelines and storage systems.
- Ensure data quality, integrity, and security across all platforms.
- Optimize and tune SQL queries and ETL jobs for performance and scalability.
- Collaborate with business analysts, data scientists, and stakeholders to understand requirements and deliver data solutions.
- Contribute to architectural decisions and development standards across the data engineering team.
- Leverage tools such as Airflow, Spark, Kafka, dbt, or Snowflake to build modern data infrastructure.
- Implement best practices in data governance, security, and compliance (e.g., GDPR, HIPAA).
- Mentor junior developers and participate in peer code reviews.
- Create and maintain detailed technical documentation.

Required Qualifications
- Bachelor's degree in Computer Science, Information Systems, or a related field; Master's degree is a plus.
- 5+ years of experience in data warehousing, ETL development, and data modeling.
- Strong hands-on experience with one or more databases: Snowflake, Redshift, SQL Server, Oracle, Postgres, Teradata, BigQuery.
- Proficiency in SQL and scripting languages (e.g., Python, Shell).
- Deep knowledge of data modeling techniques and ETL frameworks.
- Excellent communication, analytical thinking, and troubleshooting skills.

Preferred Qualifications
- Experience with modern data stack tools like dbt, Fivetran, Stitch, Looker, Tableau, or Power BI.
- Knowledge of data lakes, lakehouses, and real-time data streaming (e.g., Kafka).
- Agile/Scrum project experience and version control using Git.

NOTE: Candidates can share their resume at shruti.a@talentsketchers.com
Posted 3 weeks ago
4.0 - 7.0 years
0 Lacs
Pune
Hybrid
Job Title: GCP Data Engineer
Location: Pune, India
Experience: 4 to 7 Years
Job Type: Full-Time

Job Summary: We are looking for a highly skilled GCP Data Engineer with 4 to 7 years of experience to join our data engineering team in Pune. The ideal candidate should have strong experience working with Google Cloud Platform (GCP), including Dataproc and Cloud Composer (Apache Airflow), and must be proficient in Python, SQL, and Apache Spark. The role involves designing, building, and optimizing data pipelines and workflows to support enterprise-grade analytics and data science initiatives.

Key Responsibilities:
- Design and implement scalable and efficient data pipelines on GCP, leveraging Dataproc, BigQuery, Cloud Storage, and Pub/Sub.
- Develop and manage ETL/ELT workflows using Apache Spark, SQL, and Python.
- Orchestrate and automate data workflows using Cloud Composer (Apache Airflow).
- Build batch and streaming data processing jobs that integrate data from various structured and unstructured sources.
- Optimize pipeline performance and ensure cost-effective data processing.
- Collaborate with data analysts, scientists, and business teams to understand data requirements and deliver high-quality solutions.
- Implement and monitor data quality checks, validation, and transformation logic.

Required Skills:
- Strong hands-on experience with Google Cloud Platform (GCP)
- Proficiency with Dataproc for big data processing and Apache Spark
- Expertise in Python and SQL for data manipulation and scripting
- Experience with Cloud Composer / Apache Airflow for workflow orchestration
- Knowledge of data modeling, warehousing, and pipeline best practices
- Solid understanding of ETL/ELT architecture and implementation
- Strong troubleshooting and problem-solving skills

Preferred Qualifications:
- GCP Data Engineer or Cloud Architect certification.
- Familiarity with BigQuery, Dataflow, and Pub/Sub.
- Experience with CI/CD and DevOps tools in data engineering workflows.
- Exposure to Agile methodologies and team collaboration tools.
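The workflow orchestration this role centers on (Cloud Composer / Airflow) treats a pipeline as a DAG and runs each task only after its upstream tasks finish. The core idea can be shown with Python's stdlib `graphlib` alone; the task names below are hypothetical, and a real Composer DAG would use Airflow operators instead.

```python
# Dependency-ordered execution, the idea behind an Airflow DAG,
# using only the standard library (Python 3.9+).
from graphlib import TopologicalSorter

# task -> set of upstream tasks it depends on (names are illustrative)
dag = {
    "extract": set(),
    "transform": {"extract"},
    "load_bigquery": {"transform"},
    "quality_check": {"load_bigquery"},
}

# static_order() yields tasks so every task appears after its dependencies
order = list(TopologicalSorter(dag).static_order())
```

Airflow performs this same ordering (plus scheduling, retries, and parallelism) for tasks wired together with `>>` in a DAG file.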
Posted 3 weeks ago
10.0 - 14.0 years
12 - 17 Lacs
Bengaluru
Work from Office
About The Role
Skill required: Tech for Operations - Artificial Intelligence (AI)
Designation: AI/ML Computational Science Assoc Mgr
Qualifications: Any Graduation/Post Graduate Diploma in Management
Years of Experience: 10 to 14 years
Language Ability: English (Domestic) - Advanced

About Accenture: Combining unmatched experience and specialized skills across more than 40 industries, we offer Strategy and Consulting, Technology and Operations services, and Accenture Song, all powered by the world's largest network of Advanced Technology and Intelligent Operations centers. Our 699,000 people deliver on the promise of technology and human ingenuity every day, serving clients in more than 120 countries. Visit us at www.accenture.com

What would you do: You will be part of the Technology for Operations (TFO) team, which acts as a trusted advisor and partner to Accenture Operations. The team provides innovative and secure technologies to help clients build an intelligent operating model, driving exceptional results, and works closely with the sales, offering, and delivery teams to identify and build innovative solutions. Major sub-deals include AHO (Application Hosting Operations), ISMT (Infrastructure Management), and Intelligent Automation. The role requires an understanding of the foundational principles of Artificial Intelligence (AI), including its concepts, techniques, and tools, in order to use AI effectively.

What are we looking for:
- Python (Programming Language)
- Python Software Development
- PySpark
- Microsoft SQL Server
- Microsoft SQL Server Integration Services (SSIS)
- Ability to work well in a team
- Written and verbal communication
- Numerical ability
- Results orientation
- 3 CL8 Prompt Engineers
- 2 CL8 Data Engineers

Roles and Responsibilities:
- Analyze and solve moderately complex problems.
- Typically create new solutions, leveraging and, where needed, adapting existing methods and procedures.
- Understand the strategic direction set by senior management as it relates to team goals.
- Primary upward interaction is with the direct supervisor or team leads; generally interacts with peers and/or management levels at a client and/or within Accenture.
- Require minimal guidance when determining methods and procedures on new assignments.
- Decisions often impact the team in which they reside and occasionally impact other teams.
- Manage medium-to-small sized teams and/or work efforts (if in an individual contributor role) at a client or within Accenture.
- Please note that this role may require you to work in rotational shifts.

Qualification: Any Graduation, Post Graduate Diploma in Management
Posted 3 weeks ago
3.0 - 8.0 years
5 - 9 Lacs
Bengaluru
Work from Office
About The Role
Project Role: Application Designer
Project Role Description: Assist in defining requirements and designing applications to meet business process and application requirements.
Must have skills: Google BigQuery
Good to have skills: NA
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As an Application Designer, you will assist in defining requirements and designing applications to meet business process and application requirements. Your typical day will involve collaborating with stakeholders to understand business needs and translating them into functional design solutions.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation/contribution in team discussions is required.
- Contribute to providing solutions to work-related problems.
- Collaborate with stakeholders to gather and analyze requirements.
- Design and develop applications that align with business processes.
- Implement best practices for application design and development.
- Conduct code reviews and provide technical guidance to team members.
- Stay updated on industry trends and technologies to enhance application design.

Professional & Technical Skills:
- Must Have Skills: Proficiency in Google BigQuery.
- Strong understanding of data modeling and database design.
- Experience with cloud-based data warehousing solutions.
- Hands-on experience in ETL processes and data integration.
- Knowledge of SQL and query optimization techniques.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Google BigQuery.
- This position is based at our Bengaluru office.
- A 15 years full time education is required.
Posted 3 weeks ago
15.0 - 20.0 years
5 - 9 Lacs
Navi Mumbai
Work from Office
About The Role
Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must have skills: Google BigQuery
Good to have skills: NA
Minimum 7.5 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with various teams to understand their needs, developing solutions that align with business objectives, and ensuring that applications are optimized for performance and usability. You will also engage in problem-solving activities, providing support and enhancements to existing applications while ensuring that all development aligns with best practices and organizational standards.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Facilitate knowledge-sharing sessions to enhance team capabilities.
- Monitor project progress and ensure timely delivery of application features.

Professional & Technical Skills:
- Must Have Skills: Proficiency in Google BigQuery.
- Strong understanding of data warehousing concepts and ETL processes.
- Experience with SQL and database management.
- Familiarity with cloud computing platforms and services.
- Ability to troubleshoot and optimize application performance.

Additional Information:
- The candidate should have a minimum of 7.5 years of experience in Google BigQuery.
- This position is based in Mumbai.
- A 15 years full time education is required.
Posted 3 weeks ago
12.0 - 15.0 years
5 - 9 Lacs
Bengaluru
Work from Office
About The Role
Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must have skills: Apache Spark
Good to have skills: MySQL, Python (Programming Language), Google BigQuery
Minimum 12 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As an Application Developer, you will engage in the design, construction, and configuration of applications tailored to fulfill specific business processes and application requirements. Your typical day will involve collaborating with various stakeholders to gather requirements, developing application features, and ensuring that the applications are optimized for performance and usability. You will also participate in testing and debugging processes to ensure the applications function as intended, while continuously seeking opportunities for improvement and innovation in application design and functionality.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Expected to provide solutions to problems that apply across multiple teams.
- Facilitate knowledge-sharing sessions to enhance team capabilities.
- Monitor project progress and ensure alignment with business objectives.

Professional & Technical Skills:
- Must Have Skills: Proficiency in Apache Spark.
- Good To Have Skills: Experience with MySQL, Python (Programming Language), Google BigQuery.
- Strong understanding of distributed computing principles.
- Experience with data processing frameworks and tools.
- Familiarity with cloud platforms and services.

Additional Information:
- The candidate should have a minimum of 12 years of experience in Apache Spark.
- This position is based at our Bengaluru office.
- A 15 years full time education is required.
Posted 3 weeks ago
7.0 - 12.0 years
5 - 9 Lacs
Bengaluru
Work from Office
About The Role
Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must have skills: Google BigQuery
Good to have skills: Oracle Procedural Language Extensions to SQL (PL/SQL)
Minimum 7.5 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As an Application Developer, you will be involved in designing, building, and configuring applications to meet business process and application requirements. Your typical day will revolve around creating innovative solutions to address various business needs and ensuring seamless application functionality.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Lead the team in implementing new technologies.
- Conduct regular code reviews to ensure quality standards are met.

Professional & Technical Skills:
- Must Have Skills: Proficiency in Google BigQuery.
- Good To Have Skills: Experience with Oracle Procedural Language Extensions to SQL (PL/SQL).
- Strong understanding of data warehousing concepts.
- Experience in optimizing query performance in large datasets.
- Knowledge of ETL processes and data modeling.

Additional Information:
- The candidate should have a minimum of 7.5 years of experience in Google BigQuery.
- This position is based at our Mumbai office.
- A 15 years full-time education is required.
Posted 3 weeks ago