12.0 - 18.0 years
0 Lacs
pune, maharashtra
On-site
As a Data Engineer (Architect) with 12 to 18 years of experience, you will work remotely on a 3-month extendable project focused on Data Warehousing (DWH), ETL, Google Cloud Platform (GCP), and a Customer Data Platform (CDP). The role requires a deep understanding of customer data models, behavioral analytics, segmentation, and machine learning models, along with expertise in API integration, real-time event processing, and data pipelines. The ideal candidate has prior ETL and DWH experience and a strong background in designing and implementing solutions in cloud environments such as GCP and on cloud data platforms such as Snowflake and BigQuery. Experience developing customer-facing user interfaces with BI tools such as Google Looker, Power BI, or other open-source tools is essential. You should have a track record of Agile delivery, be self-motivated, and possess strong communication and interpersonal skills. As a motivated self-starter, you should adapt readily to changing priorities and think quickly to design and deliver effective solutions. Ideally, you have experience as a Segment CDP platform developer, a minimum of 15-18 years of relevant experience, and a B.Tech/MCA/M.Tech degree. If you are looking for a challenging opportunity to leverage your expertise in data engineering, analytics, and cloud platforms, this role offers an exciting prospect to contribute to a dynamic project.
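For context on the real-time event processing and CDP data pipelines this posting mentions, here is a minimal, purely illustrative Python sketch of publishing a behavioural event to Google Cloud Pub/Sub. The project, topic, and field names are hypothetical placeholders, not details from the listing.

```python
# Hypothetical example: publishing customer behavioural events to a Pub/Sub
# topic that downstream CDP/segmentation pipelines could consume.
import json
from google.cloud import pubsub_v1

PROJECT_ID = "my-cdp-project"   # assumption: replace with a real project
TOPIC_ID = "customer-events"    # assumption: replace with a real topic

publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path(PROJECT_ID, TOPIC_ID)

def publish_event(customer_id: str, event_type: str, properties: dict) -> str:
    """Serialise a behavioural event and publish it; returns the message ID."""
    payload = {"customer_id": customer_id, "event": event_type, "properties": properties}
    future = publisher.publish(topic_path, json.dumps(payload).encode("utf-8"))
    return future.result()  # blocks until the broker acknowledges the message

if __name__ == "__main__":
    msg_id = publish_event("cust-123", "page_view", {"url": "/pricing"})
    print(f"Published message {msg_id}")
```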
Posted 3 days ago
5.0 - 9.0 years
0 Lacs
chennai, tamil nadu
On-site
Explore your next opportunity at a Fortune Global 500 organization. Envision innovative possibilities, experience our rewarding culture, and work with talented teams that help you become better every day. We know what it takes to lead UPS into tomorrow - people with a unique combination of skill + passion. If you have the qualities and drive to lead yourself or teams, there are roles ready to cultivate your skills and take you to the next level. This position provides leadership in full systems life cycle management to ensure delivery is on time and within budget. You will be responsible for directing component and data architecture design, technology planning, and testing for Applications Development (AD) initiatives to meet business requirements and ensure compliance. Additionally, you will develop and lead AD project activities and integrations, guide teams to ensure effective communication and achievement of objectives, research and support the integration of emerging technologies, and provide knowledge and support for applications development, integration, and maintenance. Leading junior team members in project-related activities and tasks, guiding and influencing department and project teams, and facilitating collaboration with stakeholders are also key aspects of this role.
Primary Skills:
- C#, .NET
- Web development (Angular)
- Database (SQL/PLSQL)
- REST API (micro/web services)
- Cloud apps experience & DevOps (CI/CD)
Secondary Skills:
- Google Cloud Platform - GKE, Apigee, BigQuery, Spanner, etc.
- Agile experience
- Power BI
- Experience in Java apps
Responsibilities:
- Leads systems analysis and design.
- Leads design and development of applications.
- Develops and ensures creation of application documents.
- Defines and produces integration builds.
- Monitors emerging technology trends.
- Leads maintenance and support.
Qualifications:
- Bachelor's degree or international equivalent
- Bachelor's degree or international equivalent in Computer Science, Information Systems, Mathematics, Statistics, or a related field - Preferred
- Employee Type: Permanent
UPS is committed to providing a workplace free of discrimination, harassment, and retaliation.
Posted 3 days ago
5.0 - 9.0 years
0 Lacs
maharashtra
On-site
As a Data Warehouse (DWH) professional with relevant experience in Google Cloud Platform (GCP), you will be responsible for developing and implementing robust data architectures. This includes designing data lakes, data warehouses, and data marts using GCP services such as BigQuery, Dataflow, DataProc, and Cloud Storage. Your role will involve designing and implementing data models that meet business requirements while ensuring data integrity, consistency, and accessibility. Your deep understanding of GCP services and best practices for data warehousing, data analytics, and machine learning will be crucial in this role. You will also be tasked with planning and executing data migration strategies from on-premises or other cloud environments to GCP. Optimizing data pipelines and query performance to facilitate efficient data processing and analysis will be a key focus area. Additionally, your proven experience in managing teams and project delivery will be essential for success in this position. Collaborating closely with stakeholders to understand their requirements and deliver effective solutions will be a significant part of your responsibilities. Any experience with Looker will be considered advantageous for this role.
Posted 3 days ago
4.0 - 8.0 years
0 Lacs
karnataka
On-site
Sabre is a technology company that powers the global travel industry. By leveraging next-generation technology, we create global technology solutions that take on the biggest opportunities and solve the most complex challenges in travel. Positioned at the center of the travel industry, we shape the future by offering innovative advancements that pave the way for a more connected and seamless ecosystem. Our solutions power mobile apps, online travel sites, airline and hotel reservation networks, travel agent terminals, and many other platforms, connecting people with moments that matter. Sabre is seeking a talented full-stack senior software engineer for the Senior Data Science Engineer role on the SabreMosaic team. In this role, you will plan, design, develop, and test data science and data engineering software systems or applications for software enhancements and new products based on cloud-based solutions.
Role and Responsibilities:
- Develop, code, test, and debug new complex data-driven software solutions or enhancements to existing products.
- Design, plan, develop, and improve applications using advanced cloud-native technology.
- Work on issues requiring in-depth knowledge of organizational objectives and implement strategic policies in selecting methods and techniques.
- Encourage high coding standards, best practices, and high-quality output.
- Interact regularly with subordinate supervisors, architects, product managers, HR, and others on project or team performance matters.
- Provide technical mentorship and cultural/competency-based guidance to teams.
- Offer larger business/product context and mentor on specific tech stacks/technologies.
Qualifications and Education Requirements:
- Minimum 4-6 years of related experience as a full-stack developer.
- Expertise in Data Engineering/DW projects with Google Cloud-based solutions.
- Designing and developing enterprise data solutions on the GCP cloud platform.
- Experience with relational databases and NoSQL databases such as Oracle, Spanner, BigQuery, etc.
- Expert-level SQL skills for data manipulation and validation.
- Experience in designing data models, data warehouses, data lakes, and analytics platforms on GCP.
- Expertise in designing ETL data pipelines and data processing architectures for data warehouses.
- Strong experience in designing Star & Snowflake schemas and knowledge of dimensional data modeling.
- Collaboration with data scientists, data teams, and engineering teams using the Google Cloud platform for data analysis and data modeling.
- Familiarity with integrating datasets from multiple sources for data modeling for analytical and AI/ML models.
- Understanding of and experience with Pub/Sub, Kafka, Kubernetes, GCP, AWS, Hive, Docker.
- Expertise in Java Spring Boot / Python or other programming languages used for data engineering and integration projects.
- Strong problem-solving and analytical skills.
- Exposure to AI/ML, MLOps, and Vertex AI is an advantage.
- Familiarity with DevOps practices such as CI/CD pipelines.
- Airline domain experience is a plus.
- Excellent spoken and written communication skills.
- GCP Cloud Data Engineer Professional certification is a plus.
We will carefully consider your application and review your details against the position criteria. Only candidates who meet the minimum criteria for the role will proceed in the selection process.
Posted 3 days ago
3.0 - 8.0 years
10 - 15 Lacs
Hyderabad
Work from Office
Greetings from TechnoGen! Thank you for taking the time to share your competencies and skills, and for giving us the opportunity to tell you about TechnoGen; we understand that your experience and expertise are relevant to the current opening with our clients. About TechnoGen: https://technogenindia.com/ TechnoGen India Pvt. Ltd. is a boutique Talent & IT Solutions company, founded in 2008, that has been serving global customers for over two decades. Talent Solutions: We assist several GCCs, global MNCs, and IT majors with their critical and unique IT talent needs through our services around Recruitment Process Outsourcing (RPO), contract staffing, permanent hiring, Hire-Train-Deploy (HTD), Build-Operate-Transfer (BOT), and offshore staffing. Please share the below details for further processing of your profile: total years of experience, relevant years of experience, CTC (including variable), ECTC, notice period, reason for change, current location.
Job Title: Data Tester || ETL Tester
Experience: 3+ years
Work Mode: WFO - 4 days from office
Shift Time: UK shift, 12:00 PM IST to 09:00 PM IST
Location: Hyderabad
Job Summary - What We're Looking For:
- Bachelor's or Master's degree in Computer Science, Data Engineering, Information Systems, or a related technical field.
- Minimum of 3 years of experience in data testing, ETL/ELT testing, or data quality roles.
- Hands-on experience with Google BigQuery or other cloud-based data platforms.
- Proficiency in writing complex SQL queries for data validation and testing.
- Strong understanding of data quality, integrity, and validation techniques.
- Experience handling large datasets and complex data structures.
- Familiarity with test automation tools such as Selenium or TestNG is a plus.
- Working knowledge of ETL/ELT pipeline development and cloud-native data architecture.
- Solid grasp of data warehousing concepts, methodologies, and data governance frameworks.
- Familiarity with scripting languages such as Python or Shell for data validation or automation tasks.
- Excellent communication skills, both written and verbal.
- Ability to thrive in a fast-paced, collaborative, and Agile environment.
- Experience with Agile methodologies (Scrum, Kanban) and managing IT backlogs.
- A proactive problem solver and a go-to expert for data quality and testing.
What Your Impact Will Be:
- Test and validate ETL/ELT processes to ensure data quality, completeness, and accuracy.
- Perform data reconciliation and verification between source systems and GBQ (Data Lake) or other target systems.
- Execute manual and automated test cases based on ETL specifications and business requirements to validate data accuracy and integrity.
- Identify, log, and track defects, working closely with the development team to resolve issues.
- Document test results and report defects, working closely with the development team to facilitate timely resolution.
- Collaborate with data engineers and business analysts to understand requirements and business rules.
- Ensure data transformation rules are correctly applied.
- Perform data profiling and integrity checks to ensure compliance with business requirements.
- Support the creation of test strategies and plans for data-related projects.
- Verify BigQuery best practices, including partitioning, clustering, and query optimization, to ensure high performance and scalability.
As part of the Global Technology Organization, the Enterprise Data and Analytics (ED&A) Delivery Team enables the business to become a data-driven organization by building a centralized, shared-services platform that supports all business units. The team is responsible for:
- Integrating data from diverse enterprise systems including ERP, E-commerce, Order Management, CRM, and operational platforms into a centralized, cloud-based data warehouse.
- Building robust ETL/ELT pipelines to automate data ingestion, transformation, and delivery using modern tools and frameworks.
- Delivering curated, business-ready datasets to support self-service analytics and strategic decision-making.
- Enforcing enterprise-wide data quality, testing, and governance standards.
- Leveraging orchestration tools such as Airflow/Cloud Composer for reliable pipeline scheduling and execution.
- Collaborating with data analysts, product owners, and BI developers to align data solutions with business needs.
Best Regards, Syam.M | Sr. IT Recruiter syambabu.m@technogenindia.com www.technogenindia.com | Follow us on LinkedIn
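As a rough illustration of the data reconciliation work described in the testing responsibilities above, here is a minimal Python sketch of a source-vs-target row-count check in BigQuery; it is not part of the listing, and the dataset and table names are hypothetical.

```python
# Illustrative only: a simple source-vs-target row-count reconciliation in
# BigQuery, of the kind an ETL tester might automate.
from google.cloud import bigquery

client = bigquery.Client()  # assumes application-default credentials

RECONCILIATION_SQL = """
SELECT
  (SELECT COUNT(*) FROM `my_project.staging.orders`)   AS source_rows,
  (SELECT COUNT(*) FROM `my_project.warehouse.orders`) AS target_rows
"""

def reconcile_row_counts() -> None:
    row = next(iter(client.query(RECONCILIATION_SQL).result()))
    if row.source_rows != row.target_rows:
        raise AssertionError(
            f"Row count mismatch: source={row.source_rows}, target={row.target_rows}"
        )
    print(f"Row counts match: {row.source_rows}")

if __name__ == "__main__":
    reconcile_row_counts()
```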
Posted 3 days ago
4.0 - 9.0 years
10 - 20 Lacs
Hyderabad
Work from Office
Greetings from TechnoGen! Thank you for taking the time to share your competencies and skills, and for giving us the opportunity to tell you about TechnoGen; we understand that your experience and expertise are relevant to the current opening with our clients. About TechnoGen: TechnoGen, Inc. is an ISO 9001:2015, ISO 20000-1:2011, ISO 27001:2013, and CMMI Level 3 global IT services company headquartered in Chantilly, Virginia. TechnoGen, Inc. (TGI) is a Minority & Women-Owned Small Business with over 20 years of experience providing end-to-end IT services and solutions to the public and private sectors. TGI provides highly skilled and certified professionals and has successfully executed more than 345 projects. TechnoGen is committed to helping our clients solve complex problems and achieve their goals, on time and under budget. LinkedIn: https://www.linkedin.com/company/technogeninc/about/
Job Title: Data Engineer IT Quality
Required Experience: 4+ years
Location: Hyderabad
Job Summary: We are looking for a proactive and technically skilled Data Engineer to lead data initiatives and provide application support for the Quality, Consumer Services, and Sustainability domains. The Data Engineer in the Quality area is responsible for designing, developing, and maintaining data integration solutions to support quality processes. This role focuses on leveraging ETL tools such as Informatica Cloud, Ascend, Google Cloud Dataflow, and Composer, along with Python and Spark programming, to ensure seamless data flow, transformation, and integration across quality systems. The position requires collaboration with business partners and IT teams to deliver end-to-end data solutions that meet regulatory and business requirements. The candidate must be willing to work on site 4 days a week in Hyderabad, during the US EST time zone.
Key Responsibilities:
Data Integration and ETL Development:
- Design and implement robust ETL pipelines using tools like Informatica Cloud, Ascend, Google BigQuery, Google Cloud Dataflow, and Composer to integrate data from quality systems (e.g., Veeva Vault, QMS, GBQ, PLM, Order Management systems).
- Develop and optimize data transformation workflows to ensure accurate, timely, and secure data processing.
- Use Python for custom scripting, data manipulation, and automation of ETL processes.
Data Pipeline Support and Maintenance:
- Monitor, troubleshoot, and resolve issues in data pipelines, ensuring high availability and performance.
- Implement hotfixes, enhancements, and minor changes to existing ETL workflows to address defects or evolving business needs.
- Ensure data integrity, consistency, and compliance with regulatory standards.
Collaboration and Stakeholder Engagement:
- Work closely with quality teams, business analysts, and IT stakeholders to gather requirements and translate them into technical data solutions.
- Collaborate with cross-functional teams to integrate quality data with other enterprise systems, such as PLM, QMS, ERP, or LIMS.
- Communicate effectively with remote teams to provide updates, resolve issues, and align project deliverables.
Technical Expertise:
- Maintain proficiency in ETL tools (Informatica Cloud, Ascend, Dataflow, Composer, GBQ) and Python for data engineering tasks.
- Design scalable and efficient data models to support quality reporting, analytics, and compliance requirements.
- Implement best practices for data security, version control, and pipeline orchestration.
Documentation and Training:
- Create and maintain detailed documentation for ETL processes, data flows, and system integrations.
- Provide guidance and training to junior team members or end-users on data integration processes and tools.
Qualifications:
Education: Bachelor's degree in Computer Science, Data Engineering, Information Systems, or a related field.
Experience:
- 4+ years of experience as a Data Engineer, with a focus on data integration in quality or regulated environments.
- Hands-on experience with ETL tools such as Informatica Cloud, Ascend, Google Cloud Dataflow, and Composer.
- Proficiency in Python for data processing, scripting, and automation.
- Experience working with Veeva applications is a plus.
Technical Skills:
- Expertise in designing and optimizing ETL pipelines using Informatica Cloud, Ascend, Dataflow, or Composer.
- Strong Python programming skills for data manipulation, automation, and integration.
- Familiarity with cloud platforms (e.g., Google Cloud, AWS, Azure) and data integration patterns (e.g., APIs, REST, SQL).
- Knowledge of database systems (e.g., SQL Server, Oracle, BigQuery) and data warehousing concepts.
- Experience with Agile methodologies and tools like JIRA or Azure DevOps.
Soft Skills:
- Excellent communication and collaboration skills to work effectively with remote teams and business partners.
- Strong problem-solving and analytical skills to address complex data integration challenges.
- Ability to manage multiple priorities and deliver high-quality solutions in a fast-paced environment.
- Ability to work effectively in a multicultural environment and manage teams across different time zones.
Preferred Qualifications:
- Experience working in regulated environments.
- Advanced degrees or certifications (e.g., Informatica Cloud, Google Cloud Professional Data Engineer) are a plus.
- Experience with Agile or hybrid delivery models.
About Us: We are a leading organization committed to leveraging technology to drive business success. Our team is dedicated to innovation, collaboration, and delivering exceptional results. Join us and be a part of a dynamic and forward-thinking company.
How to Apply: Interested candidates are invited to submit their resume and cover letter detailing their relevant experience and qualifications.
Best Regards, Syam.M | Sr. IT Recruiter syambabu.m@technogenindia.com www.technogenindia.com | Follow us on LinkedIn
Posted 3 days ago
4.0 - 8.0 years
0 Lacs
noida, uttar pradesh
On-site
At TELUS Digital, you will play a crucial role in enabling customer experience innovation by fostering spirited teamwork, embracing agile thinking, and embodying a caring culture that prioritizes customers. As the global arm of TELUS Corporation, a leading telecommunications service provider in Canada, we specialize in delivering contact center and business process outsourcing solutions to major corporations across various sectors such as consumer electronics, finance, telecommunications, and utilities. With our extensive global call center capabilities, we offer secure infrastructure, competitive pricing, skilled resources, and exceptional customer service, all supported by TELUS, our multi-billion dollar parent company. In this role, you will leverage your expertise in Data Engineering, backed by a minimum of 4 years of industry experience, to drive the success of our projects. Proficiency in Google Cloud Platform (GCP) services including Dataflow, BigQuery, Cloud Storage, and Pub/Sub is essential for effectively managing data pipelines and ETL processes. Your strong command of the Python programming language will be instrumental in performing data processing tasks efficiently. You will be responsible for optimizing data pipeline architectures, enhancing performance, and ensuring reliability through your software engineering skills. Your ability to troubleshoot and resolve complex pipeline issues, automate repetitive tasks, and monitor data pipelines for efficiency and reliability will be critical in maintaining operational excellence. Additionally, your familiarity with SQL, relational databases, and version control systems like Git will be beneficial in streamlining data management processes. As part of the team, you will collaborate closely with stakeholders to analyze, test, and enhance the reliability of GCP data pipelines, Informatica ETL workflows, MDM, and Control-M jobs. Your commitment to continuous improvement, SLA adherence, and post-incident reviews will drive the evolution of our data pipeline systems. Excellent communication, problem-solving, and analytical skills are essential for effectively documenting processes, providing insights, and ensuring seamless operations. This role offers a dynamic environment where you will work in a 24x7 shift, contributing to the success of our global operations and making a meaningful impact on customer experience.
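To illustrate the Dataflow/Pub/Sub/BigQuery pipeline work this posting describes, here is a minimal Apache Beam sketch in Python. It is a hedged example only: the subscription, table, and schema are hypothetical, and a real Dataflow deployment would need runner, project, and temp-location options.

```python
# Sketch only: a minimal Apache Beam streaming pipeline that reads JSON events
# from Pub/Sub and appends them to a BigQuery table.
import json
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

def run():
    options = PipelineOptions(streaming=True)  # plus --runner=DataflowRunner etc. in practice
    with beam.Pipeline(options=options) as p:
        (
            p
            | "ReadEvents" >> beam.io.ReadFromPubSub(
                subscription="projects/my-project/subscriptions/events-sub")
            | "ParseJson" >> beam.Map(lambda msg: json.loads(msg.decode("utf-8")))
            | "WriteToBQ" >> beam.io.WriteToBigQuery(
                "my-project:analytics.events",
                schema="customer_id:STRING,event:STRING,ts:TIMESTAMP",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
            )
        )

if __name__ == "__main__":
    run()
```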
Posted 3 days ago
1.0 - 5.0 years
0 Lacs
ahmedabad, gujarat
On-site
As a Data Engineer at Synoptek, you will be responsible for designing, developing, and maintaining robust and scalable data pipelines on the Google Cloud Platform (GCP). You will leverage your hands-on experience with GCP services such as BigQuery, Jitterbit, Cloud Dataflow, Cloud Pub/Sub, and Cloud Storage to build efficient data processing solutions. Collaborating with cross-functional teams, you will translate their data needs into technical requirements, ensuring data quality, integrity, and security throughout the data lifecycle. Your role will involve developing and optimizing ETL/ELT processes to extract, transform, and load data from various sources into data warehouses and data lakes. Additionally, you will build and maintain data models and schemas to support business intelligence and analytics, while troubleshooting data quality issues and performance bottlenecks. To excel in this position, you should have a Bachelor's degree in Computer Science, Engineering, or a related field, along with 3 to 4 years of experience as a Data Engineer focusing on GCP. Proficiency in Python, SQL, and BigQuery is essential, as well as hands-on experience with data ingestion, transformation, and loading tools like Jitterbit and Apache Beam. A strong understanding of data warehousing and data lake concepts, coupled with experience in data modeling and schema design, will be beneficial. The ideal candidate will exhibit excellent problem-solving and analytical skills, working both independently and collaboratively with internal and external teams. Familiarity with acquiring and managing data from various sources, as well as the ability to identify trends in complex datasets and propose business solutions, are key attributes for success in this role. At Synoptek, we value employees who embody our core DNA behaviors, including clarity, integrity, innovation, accountability, and a results-focused mindset. We encourage continuous learning, adaptation, and growth in a fast-paced environment, promoting a culture of teamwork, flexibility, respect, and collaboration. If you have a passion for data engineering, a drive for excellence, and a commitment to delivering impactful results, we invite you to join our dynamic team at Synoptek. Work hard, play hard, and let's achieve superior outcomes together.
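As a small, hedged illustration of the ELT ingestion step this posting describes, the sketch below loads newline-delimited JSON from Cloud Storage into a BigQuery staging table using the Python client. The bucket, dataset, and table names are placeholders, not details from the listing.

```python
# Hypothetical sketch: loading files from Cloud Storage into a BigQuery
# staging table, a typical first step in an ELT pipeline.
from google.cloud import bigquery

client = bigquery.Client()

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.NEWLINE_DELIMITED_JSON,
    autodetect=True,                      # assumption: schema inferred from the files
    write_disposition=bigquery.WriteDisposition.WRITE_TRUNCATE,
)

load_job = client.load_table_from_uri(
    "gs://my-bucket/exports/customers/*.json",
    "my-project.staging.customers",
    job_config=job_config,
)
load_job.result()  # wait for completion
print(f"Loaded {client.get_table('my-project.staging.customers').num_rows} rows")
```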
Posted 4 days ago
10.0 - 15.0 years
10 - 15 Lacs
Hyderabad
Work from Office
Greetings from TechnoGen! Thank you for taking the time to share your competencies and skills, and for giving us the opportunity to tell you about TechnoGen; we understand that your experience and expertise are relevant to the current opening with our clients. About TechnoGen: TechnoGen, Inc. is an ISO 9001:2015, ISO 20000-1:2011, ISO 27001:2013, and CMMI Level 3 global IT services company headquartered in Chantilly, Virginia. TechnoGen, Inc. (TGI) is a Minority & Women-Owned Small Business with over 20 years of experience providing end-to-end IT services and solutions to the public and private sectors. TGI provides highly skilled and certified professionals and has successfully executed more than 345 projects. TechnoGen is committed to helping our clients solve complex problems and achieve their goals, on time and under budget. LinkedIn: https://www.linkedin.com/company/technogeninc/about/
Job Title: Mar Tech Lead Software Engineer
Required Experience: 10+ years
Location: Hyderabad
Job Summary:
- Strong knowledge of cloud computing platforms - Google Cloud
- Expertise in MySQL & SQL/PL
- Good experience in IICS
- Experience in ETL; Ascend IO is an added advantage
- GCP & BigQuery knowledge is a must; GCP certification is an added advantage
- Good experience in Google Cloud Storage (GCS), Cloud Composer, DAGs, Airflow
- REST API development experience
- Strong analytical and problem-solving skills and efficient communication
- Experience in designing, implementing, and managing various ETL job execution flows
- Utilize Git for source version control
- Set up and maintain CI/CD pipelines
- Troubleshoot, debug, and upgrade existing applications and ETL job chains
- Comprehensive data analysis across complex data sets
- Ability to collaborate effectively across technical development teams and business departments
Qualifications:
- Bachelor's or Master's degree in Computer Science, Information Technology, or a related field
- 5+ years of experience in data engineering or related roles
- Strong understanding of Google Cloud Platform and associated tools
- Proven experience in delivering consumer marketing data and analytics solutions for enterprise clients
- Strong knowledge of data management, ETL processes, data warehousing, and analytics platforms
- Experience with SQL and NoSQL databases
- Proficiency in the Python programming language
- Hands-on experience with data warehousing solutions
- Knowledge of marketing analytics tools and technologies, including but not limited to Google Analytics, Blueconic, Klaviyo, etc.
- Knowledge of performance marketing concepts such as targeting & segmentation, real-time optimization, A/B testing, attribution modeling, etc.
- Excellent communication skills with a track record of collaboration across multiple teams
- Strong collaboration skills and a team-oriented mindset
- Strong problem-solving skills, adaptability, and the ability to thrive in a dynamic and rapidly changing environment
Best Regards, Syam.M | Sr. IT Recruiter syambabu.m@technogenindia.com www.technogenindia.com | Follow us on LinkedIn
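Since this posting calls out Cloud Composer, DAGs, and Airflow for orchestrating ETL job flows, here is an illustrative Airflow DAG skeleton. The DAG id, schedule, and task logic are hypothetical placeholders, not requirements from the listing.

```python
# Illustrative Cloud Composer / Airflow DAG skeleton for chaining ETL steps.
from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

def extract(**context):
    print("pull data from the source system")

def transform(**context):
    print("apply business rules / cleansing")

def load(**context):
    print("write curated data to BigQuery")

with DAG(
    dag_id="martech_daily_etl",
    start_date=datetime(2024, 1, 1),
    schedule_interval="0 3 * * *",   # assumption: daily at 03:00
    catchup=False,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_transform = PythonOperator(task_id="transform", python_callable=transform)
    t_load = PythonOperator(task_id="load", python_callable=load)

    t_extract >> t_transform >> t_load
```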
Posted 4 days ago
5.0 - 10.0 years
10 - 20 Lacs
Hyderabad
Work from Office
Greetings from TechnoGen! Thank you for taking the time to share your competencies and skills, and for giving us the opportunity to tell you about TechnoGen; we understand that your experience and expertise are relevant to the current opening with our clients. About TechnoGen: TechnoGen, Inc. is an ISO 9001:2015, ISO 20000-1:2011, ISO 27001:2013, and CMMI Level 3 global IT services company headquartered in Chantilly, Virginia. TechnoGen, Inc. (TGI) is a Minority & Women-Owned Small Business with over 20 years of experience providing end-to-end IT services and solutions to the public and private sectors. TGI provides highly skilled and certified professionals and has successfully executed more than 345 projects. TechnoGen is committed to helping our clients solve complex problems and achieve their goals, on time and under budget. LinkedIn: https://www.linkedin.com/company/technogeninc/about/
Job Title: Martech Developer
Required Experience: 5+ years
Location: Hyderabad
Job Summary:
- Strong knowledge of cloud computing platforms - Google Cloud
- Expertise in MySQL & SQL/PL
- Good experience in IICS
- Experience in ETL; Ascend IO is an added advantage
- GCP & BigQuery knowledge is a must; GCP certification is an added advantage
- Good experience in Google Cloud Storage (GCS), Cloud Composer, DAGs, Airflow
- REST API development experience
- Strong analytical and problem-solving skills and efficient communication
- Experience in designing, implementing, and managing various ETL job execution flows
- Utilize Git for source version control
- Set up and maintain CI/CD pipelines
- Troubleshoot, debug, and upgrade existing applications and ETL job chains
- Comprehensive data analysis across complex data sets
- Ability to collaborate effectively across technical development teams and business departments
Qualifications:
- Bachelor's or Master's degree in Computer Science, Information Technology, or a related field
- 5+ years of experience in data engineering or related roles
- Strong understanding of Google Cloud Platform and associated tools
- Proven experience in delivering consumer marketing data and analytics solutions for enterprise clients
- Strong knowledge of data management, ETL processes, data warehousing, and analytics platforms
- Experience with SQL and NoSQL databases
- Proficiency in the Python programming language
- Hands-on experience with data warehousing solutions
- Knowledge of marketing analytics tools and technologies, including but not limited to Google Analytics, Blueconic, Klaviyo, etc.
- Knowledge of performance marketing concepts such as targeting & segmentation, real-time optimization, A/B testing, attribution modeling, etc.
- Excellent communication skills with a track record of collaboration across multiple teams
- Strong collaboration skills and a team-oriented mindset
- Strong problem-solving skills, adaptability, and the ability to thrive in a dynamic and rapidly changing environment
Best Regards, Syam.M | Sr. IT Recruiter syambabu.m@technogenindia.com www.technogenindia.com | Follow us on LinkedIn
Posted 4 days ago
2.0 - 4.0 years
9 - 18 Lacs
Chennai
Remote
Role & responsibilities
- Bachelor's degree in Computer Science, Engineering, Information Technology, or a related field (or equivalent practical experience).
- Proven experience as a Data Engineer, Data Architect, or similar role in a data-driven environment.
- Proficiency in programming languages and frameworks such as Node.js, Python, or Spring Boot.
- Strong SQL skills, with experience in database management (e.g., MS SQL Server, PostgreSQL, Redshift, BigQuery, etc.).
- Experience with Azure cloud platforms, particularly in data storage and processing services.
- Hands-on experience with ETL tools and frameworks (e.g., Apache Kafka, Apache Airflow, Talend, etc.).
- Familiarity with data warehousing solutions and data modeling techniques.
- Knowledge of big data technologies (e.g., Hadoop, Spark, etc.) and Machine Learning is a plus.
- Strong understanding of data security principles and best practices.
- Ability to optimize query performance for millions of rows of data.
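On the last point, query performance over millions of rows, one common technique is partitioning and clustering the warehouse table. The sketch below shows this in BigQuery (one of the engines the posting lists) via the Python client; the project, dataset, and columns are hypothetical, and other warehouses would use their own equivalents (indexes, distribution keys, and so on).

```python
# Rough sketch: creating a date-partitioned, clustered BigQuery table so that
# queries filtered by date and customer scan far less data.
from google.cloud import bigquery

client = bigquery.Client()

DDL = """
CREATE TABLE IF NOT EXISTS `my-project.analytics.orders`
(
  order_id    STRING,
  customer_id STRING,
  order_ts    TIMESTAMP,
  amount      NUMERIC
)
PARTITION BY DATE(order_ts)
CLUSTER BY customer_id
"""

client.query(DDL).result()
print("Partitioned and clustered table is in place")
```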
Posted 4 days ago
6.0 - 11.0 years
10 - 14 Lacs
Chennai
Remote
What You'll Need
- BS or MS degree in Computer Science, Engineering, or a related technical field
- Strong SQL skills
- 6+ years of experience working with event instrumentation, data pipelines, and data warehouses, preferably acting as a data architect in a previous role
- Proficiency with systems design and data modeling
- Fluency with workflow management tools, like Airflow or dbt
- Experience with modern data warehouses, like Snowflake or BigQuery
- Expertise breaking down complex problems, documenting solutions, and sequencing work to make iterative improvements
- Familiarity with data visualization tools such as Mode, Tableau, and Looker
- Programming skills, preferably in Python
- Familiarity with software design principles, including test-driven development
About the Role
Analytics Platform is on a mission to democratize learning by building systems that enable company-wide analytics and experimentation. By implementing sufficient instrumentation, designing intuitive data models, and building batch/streaming pipelines, we will allow for deep and scalable investigation and optimization of the business. By developing self-serve tools, we will empower executives, PMs, Marketing leadership, and marketing managers to understand company performance at a glance and uncover insights to support decision making. Finally, by building capabilities such as forecasting, alerting, and experimentation, we will enable more, better, and faster decisions.
What You'll Do
- Drive direct business impact with executive-level visibility
- Design technical architecture and implement components from the ground up as we transition to event-based analytics
- Work on the unique challenge of joining a variety of online and offline data sets, not just big data
- Learn and grow Data Science and Data Analytics skills (we sit in the same org!)
- Opportunity to grow into a Tech Lead/Manager, and mentor junior team members as we quickly grow the team
- Partner with infrastructure and product engineers to instrument our backend services and end-to-end user journeys to create visibility for the rest of the business
- Design, develop, and monitor scalable and cost-efficient data pipelines and build out new integrations with third-party tools
- Work with data analysts and data scientists to design our data models as inputs to metrics and machine learning models
- Establish best practices for data engineering
- Assess build vs. buy tradeoffs for components in our company-wide analytics platform, which will inform decision-making for executives, PMs, Ops, etc.
- Opportunity to be a founding member of the Data Engineering team based out of IN. You will have the autonomy to help shape the vision, influence the roadmap, and establish best practices for the team
Posted 4 days ago
12.0 - 22.0 years
25 - 40 Lacs
Gurugram, Bengaluru
Hybrid
We are looking for an experienced Solution Architect in Generative AI to lead the design and delivery of enterprise-grade GenAI solutions across Azure and Google Cloud platforms. This is a full-stack leadership role requiring deep technical expertise and the ability to guide teams through end-to-end project execution.
Key Responsibilities:
- Lead architecture for GenAI projects: LLMs, embeddings, RAG, prompt engineering, agent frameworks.
- Define scalable designs across the full stack: React, Node.js, .NET Core, C#, Cosmos DB, SQL/NoSQL, vector DBs.
- Implement Azure/GCP cloud-native solutions using AKS, GKE, Functions, Pub/Sub.
- Drive CI/CD automation via GitHub Actions, Azure DevOps, Cloud Build.
- Conduct architecture/code reviews and enforce security, DevSecOps, and performance standards.
- Translate business requirements into technical solutions and communicate with senior stakeholders.
- Mentor engineering teams and promote innovation, collaboration, and agile delivery.
Required Skills: Generative AI, LLMs, Prompt Engineering, LangChain, RAG, C#, .NET Core, React, Node.js, Azure, Google Cloud, AKS, GKE, Terraform, CI/CD, Cosmos DB, BigQuery, Microservices, DevSecOps, API Gateway
Qualifications:
- Bachelor's in Computer Science or Engineering (Master's in AI/ML preferred)
- Strong leadership, communication, and stakeholder management skills
Apply Now: shrishtispearhead1@gmail.com Contact: +91 8299010653
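To give a flavour of the RAG-style prompting mentioned in this role, here is a minimal, hedged Python sketch against a Vertex AI generative model. It assumes a recent google-cloud-aiplatform SDK; the project, location, and model name are placeholders that may need updating, and the retrieval step is stubbed rather than backed by a real vector store.

```python
# Sketch under assumptions: a retrieval-augmented prompt to a Vertex AI model.
import vertexai
from vertexai.generative_models import GenerativeModel

vertexai.init(project="my-project", location="us-central1")  # assumption
model = GenerativeModel("gemini-1.5-flash")                  # assumption: any available Gemini model

def retrieve_context(question: str) -> str:
    # In a real RAG system this would run an embeddings search over a vector
    # store; it is stubbed here so the example stays self-contained.
    return "Policy doc excerpt: refunds are processed within 5 business days."

def answer(question: str) -> str:
    context = retrieve_context(question)
    prompt = (
        "Answer the question using only the context below.\n\n"
        f"Context:\n{context}\n\nQuestion: {question}"
    )
    return model.generate_content(prompt).text

if __name__ == "__main__":
    print(answer("How long do refunds take?"))
```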
Posted 4 days ago
6.0 - 8.0 years
10 - 15 Lacs
Bengaluru
Work from Office
Job Summary
Synechron is seeking a highly skilled and proactive Data Engineer to join our dynamic data analytics team. In this role, you will be instrumental in designing, developing, and maintaining scalable data pipelines and solutions on the Google Cloud Platform (GCP). With your expertise, you'll enable data-driven decision-making, contribute to strategic business initiatives, and ensure robust data infrastructure. This position offers an opportunity to work in a collaborative environment with a focus on innovative technologies and continuous growth.
Software Requirements
Required:
- Proficiency in data engineering tools and frameworks such as Hive, Apache Spark, and Python (version 3.x)
- Extensive experience working with Google Cloud Platform (GCP) offerings including Dataflow, BigQuery, Cloud Storage, and Pub/Sub
- Familiarity with Git, Jira, and Confluence for version control and collaboration
Preferred:
- Experience with additional GCP services like DataProc, Data Studio, or Cloud Composer
- Exposure to other programming languages such as Java or Scala
- Knowledge of data security best practices and tools
Overall Responsibilities
- Design, develop, and optimize scalable data pipelines on GCP to support analytics and reporting needs
- Collaborate with cross-functional teams to translate business requirements into technical solutions
- Build and maintain data models, ensuring data quality, integrity, and security
- Participate actively in code reviews, adhering to best practices and standards
- Develop automated and efficient data workflows to improve system performance
- Stay updated with emerging data engineering trends and continuously improve technical skills
- Provide technical guidance and support to team members, fostering a collaborative environment
- Ensure timely delivery of deliverables aligned with project milestones
Technical Skills (By Category)
- Programming Languages: Essential - Python (required); Preferred - Java, Scala
- Data Management & Databases: Experience with Hive, BigQuery, and relational databases; knowledge of data warehousing concepts and SQL proficiency
- Cloud Technologies: Extensive hands-on experience with GCP services including Dataflow, BigQuery, Cloud Storage, Pub/Sub, and Composer; ability to build and optimize data pipelines leveraging GCP offerings
- Frameworks & Libraries: Spark (PySpark preferred); Hadoop ecosystem experience is advantageous
- Development Tools & Methodologies: Agile/Scrum methodologies, version control with Git, project tracking via JIRA, documentation on Confluence
- Security Protocols: Understanding of data security, privacy, and compliance standards
Experience Requirements
- Minimum of 6-8 years in data or software engineering roles with a focus on data pipeline development
- Proven experience in designing and implementing data solutions on cloud platforms, particularly GCP
- Prior experience working in agile teams, participating in code reviews, and delivering end-to-end data projects
- Experience working with cross-disciplinary teams and understanding varied stakeholder requirements
- Exposure to industry best practices for data security, governance, and quality assurance is desired
Day-to-Day Activities
- Attend daily stand-up meetings and contribute to project planning sessions
- Collaborate with business analysts, data scientists, and other stakeholders to understand data needs
- Develop, test, and deploy scalable data pipelines, ensuring efficiency and reliability
- Perform regular code reviews, provide constructive feedback, and uphold coding standards
- Document technical solutions and maintain clear records of data workflows
- Troubleshoot and resolve technical issues in data processing environments
- Participate in continuous learning initiatives to stay abreast of technological developments
- Support team members by sharing knowledge and resolving technical challenges
Qualifications
- Bachelor's or Master's degree in Computer Science, Information Technology, or a related field
- Relevant professional certifications in GCP (such as Google Cloud Professional Data Engineer) are preferred but not mandatory
- Demonstrable experience in data engineering and cloud technologies
Professional Competencies
- Strong analytical and problem-solving skills, with a focus on outcome-driven solutions
- Excellent communication and interpersonal skills to effectively collaborate within teams and with stakeholders
- Ability to work independently with minimal supervision and manage multiple priorities effectively
- Adaptability to evolving technologies and project requirements
- Demonstrated initiative in driving tasks forward and a continuous improvement mindset
- Strong organizational skills with a focus on quality and attention to detail
SYNECHRON'S DIVERSITY & INCLUSION STATEMENT
Diversity & Inclusion are fundamental to our culture, and Synechron is proud to be an equal opportunity workplace and an affirmative action employer. Our Diversity, Equity, and Inclusion (DEI) initiative "Same Difference" is committed to fostering an inclusive culture promoting equality, diversity, and an environment that is respectful to all. We strongly believe that a diverse workforce helps build stronger, successful businesses as a global company. We encourage applicants from across diverse backgrounds, race, ethnicities, religion, age, marital status, gender, sexual orientations, or disabilities to apply. We empower our global workforce by offering flexible workplace arrangements, mentoring, internal mobility, learning and development programs, and more. All employment decisions at Synechron are based on business needs, job requirements, and individual qualifications, without regard to the applicant's gender, gender identity, sexual orientation, race, ethnicity, disabled or veteran status, or any other characteristic protected by law. Candidate Application Notice
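As a small illustration of the Spark-on-GCP pipeline work this posting centres on, here is a hypothetical PySpark sketch that reads raw parquet from Cloud Storage, aggregates it, and writes a curated output. The paths and column names are placeholders, and it assumes the GCS connector is configured on the cluster.

```python
# Hypothetical PySpark sketch: raw parquet in, daily aggregate out.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("daily_sales_rollup").getOrCreate()

raw = spark.read.parquet("gs://my-bucket/raw/sales/")  # assumption: GCS connector available

daily = (
    raw.withColumn("sale_date", F.to_date("sale_ts"))
       .groupBy("sale_date", "store_id")
       .agg(F.sum("amount").alias("total_amount"),
            F.countDistinct("order_id").alias("orders"))
)

daily.write.mode("overwrite").parquet("gs://my-bucket/curated/daily_sales/")
spark.stop()
```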
Posted 4 days ago
2.0 - 4.0 years
6 - 10 Lacs
Bengaluru
Work from Office
Overview
We are an integral part of Annalect Global and Omnicom Group, one of the largest media and advertising agency holding companies in the world. Omnicom's branded networks and numerous specialty firms provide advertising, strategic media planning and buying, digital and interactive marketing, direct and promotional marketing, public relations, and other specialty communications services. Our agency brands are consistently recognized as being among the world's creative best. Annalect India plays a key role for our group companies and global agencies by providing stellar products and services in the areas of Creative Services, Technology, Marketing Science (data & analytics), Market Research, Business Support Services, Media Services, and Consulting & Advisory Services. We currently have 2500+ awesome colleagues (in Annalect India) who are committed to solving our clients' pressing business issues. We are growing rapidly and looking for talented professionals like you to be part of this journey.
Responsibilities
- Gather and evaluate data requirements by understanding client and project needs to implement scalable and efficient data engineering solutions.
- Design, build, and manage robust ETL (Extract, Transform, Load) pipelines using Python to support business data workflows and reporting needs.
- Serve as the primary point of contact for all data engineering-related tasks within the project.
- Own the data QA process: create and maintain validation checks to ensure data integrity, monitor pipeline health, and resolve data quality issues proactively.
- Prioritize and manage engineering tasks to ensure timely delivery of clean, reliable, and production-ready data.
- Contribute to project planning by providing input on data pipeline design, database performance, and infrastructure needs.
- Create and maintain technical documentation including pipeline logic, data flow diagrams, and troubleshooting guides.
- Drive conversations with the team, clients, and business stakeholders.
Qualifications
- 3-5 years of hands-on experience in data engineering or data management, preferably within data-driven domains, with a strong focus on problem-solving and automation.
- Ability to design, develop, and maintain scalable Python- and SQL-based ETL pipelines to process, transform, and integrate large volumes of structured and semi-structured data.
- Proven experience working with cloud data warehouses, especially Google BigQuery and SQL, managing datasets, optimizing queries, and ensuring data accuracy and availability.
- Ability to collaborate with analysts and business teams to identify key performance indicators and ensure they are accurately captured and transformed.
- Excellent written and verbal communication skills to document processes and engage with stakeholders.
- Comfortable working with large datasets and translating raw data into clean, analysis-ready outputs.
- Ability to manage multiple data workflows and projects simultaneously, ensuring timely and reliable delivery.
- Able to work successfully with teams, handling multiple projects and meeting timelines, while maintaining positive client and vendor relationships.
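To illustrate the "own the data QA process" responsibility above, here is a minimal Python sketch of automated validation checks (nulls and duplicate keys) run against a BigQuery table. The table, columns, and check names are hypothetical placeholders.

```python
# Illustration only: simple automated data-quality checks against BigQuery.
from google.cloud import bigquery

client = bigquery.Client()
TABLE = "my-project.curated.campaign_performance"  # assumption: placeholder table

CHECKS = {
    "null_campaign_ids": f"SELECT COUNT(*) AS bad FROM `{TABLE}` WHERE campaign_id IS NULL",
    "duplicate_keys": f"""
        SELECT COUNT(*) AS bad FROM (
          SELECT campaign_id, report_date, COUNT(*) AS c
          FROM `{TABLE}` GROUP BY 1, 2 HAVING c > 1
        )""",
}

def run_checks() -> None:
    failures = []
    for name, sql in CHECKS.items():
        bad = next(iter(client.query(sql).result())).bad
        if bad:
            failures.append(f"{name}: {bad} offending rows")
    if failures:
        raise AssertionError("Data quality checks failed: " + "; ".join(failures))
    print("All data quality checks passed")

if __name__ == "__main__":
    run_checks()
```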
Posted 4 days ago
5.0 - 10.0 years
18 - 22 Lacs
Pune
Work from Office
About The Role: DB Global Technology is Deutsche Bank's technology center in Central and Eastern Europe. Opened in January 2014, the Bucharest office is constantly expanding and now hosts over 1,600 employees. It is designed as an agile working environment, custom-made to encourage innovation, collaboration, and productivity in a modern, high-tech atmosphere. Enterprise Architecture is one of the key pillars of Deutsche Bank's IT Strategy and plays a key part in all aspects of defining, managing, governing, and realizing our technology strategy for our clients. The Group Architecture Data team works collaboratively with federated business and technology departments to ensure that Deutsche Bank has a clear target-state data architecture whose delivery is managed and supported by appropriate data architecture principles, standards, frameworks, tools, and governance processes. Our ultimate goal is to accelerate the delivery of a bank-wide simplified target architecture, improve technology agility, increase speed-to-market, and reduce cost across our technology landscape. The role will work closely with the data architects within Group Architecture and the federated data teams across all locations and functions within Deutsche Bank. The Data Architecture Project/Process Manager role is instrumental in driving the execution and continuous improvement of data architecture processes. This role ensures the effective implementation of data architecture standards, governance, and project delivery methodologies to enhance data quality, integration, and accessibility across the organization.
Responsibilities:
- Lead the design and review of data architecture defined processes and identify opportunities for efficiencies and automation.
- Track architecture milestones, ensure they are timely and correctly mapped to Clarity, and provide end-to-end oversight of the process.
- Oversee the execution and implementation of data architecture processes, ensuring adherence to standards and best practices.
- Collaborate with domain architects and data stakeholders to ensure consistent application of data architecture frameworks.
- Conduct data quality assessments and implement improvements to ensure data integrity and completeness.
- Provide guidance and support to project and program managers on data architecture methodologies and delivery processes.
- Develop and deliver training materials to upskill teams on data architecture governance and implementation practices.
- Produce insightful metrics and reports to inform decision-making and support continuous process improvement.
- Prepare communication materials for senior management and regulatory bodies regarding data architecture initiatives.
Skills
- Proven experience in managing technology projects and driving process improvements in large, complex organizations.
- Understanding of data architecture principles, standards, and implementation approaches.
- Project management skills with a track record of delivering technology initiatives on time and within scope.
- Proficient in data analysis, visualization, and reporting to support architecture governance and decision-making.
- Effective communicator with the ability to convey complex data architecture concepts to diverse stakeholders.
- Experience with tools such as JIRA, Confluence, MS Office, and data visualization tools (Looker or Tableau).
- Strong stakeholder management skills, with the ability to influence and collaborate across global teams.
Well-being & Benefits
Emotionally and mentally balanced: we support you in dealing with life crises, maintaining stability through illness, and maintaining good mental health.
- Empowering managers who value your ideas and decisions. Show your positive attitude, determination, and open-mindedness.
- A professional, passionate, and fun workplace with flexible Work from Home options.
- A modern office with fun and relaxing areas to boost creativity.
- Continuous learning culture with coaching and support from team experts.
Physically thriving: we support you managing your physical health by taking appropriate preventive measures and providing a workplace that helps you thrive.
- Private healthcare and life insurance with premium benefits for you and discounts for your loved ones.
Socially connected: we strongly believe in collaboration, inclusion and feeling connected to open up new perspectives and strengthen our self-confidence and wellbeing.
- Kids@TheOffice - support for unexpected events requiring you to care for your kids during work hours.
- Enjoy retailer discounts, cultural and CSR activities, employee sport clubs, workshops, and more.
Financially secure: we support you to meet personal financial goals during your active career and for the future.
- Competitive income, performance-based promotions, and a sense of purpose.
- 24 days holiday, loyalty days, and bank holidays (including weekdays for weekend bank holidays).
We strive for a culture in which we are empowered to excel together every day. This includes acting responsibly, thinking commercially, taking initiative and working collaboratively. Together we share and celebrate the successes of our people. Together we are Deutsche Bank Group. We welcome applications from all people and promote a positive, fair and inclusive work environment.
Posted 4 days ago
7.0 - 12.0 years
32 - 37 Lacs
Pune
Work from Office
About The Role
Job Title: Java Engineer - UX, MicroServices
Location: Pune, India
Corporate Title: AVP
Role Description: We are looking for a talented and experienced software developer with strong technical expertise in Java, Microservices, and Google Kubernetes Engine (GKE). The ideal candidate will have a deep understanding of software development principles and demonstrate excellent problem-solving abilities. This role requires both technical proficiency and strong communication skills to collaborate effectively within a dynamic environment.
What we'll offer you
- 100% reimbursement under childcare assistance benefit (gender neutral)
- Sponsorship for industry-relevant certifications and education
- Accident and term life insurance
Your key responsibilities
- Maintain and develop our Price Valuation Platform, ensuring its efficiency and reliability.
- Write clean, maintainable, and efficient code following industry best practices.
- Adhere to software development standards, ensuring modular, reusable, and well-documented solutions.
- Implement rigorous testing strategies, including unit tests, integration tests, and performance optimizations.
- Collaborate closely with the engineering team and stakeholders to ensure seamless integration of new features and solutions.
Your skills and experience
- Several years of experience in programming.
- Strong proficiency in Java development.
- Solid experience in Microservices architecture and Google Kubernetes Engine (GKE).
- Experience with Oracle and/or PostgreSQL would be helpful.
- Deep understanding of clean code principles and design patterns.
- A great team player with excellent collaboration skills.
- Familiarity with SDLC tools, including Git and Jira.
- Optional but highly beneficial: experience in financial business and asset management.
- Fluency in written and spoken English.
- Minimum X years of proven expertise in software engineering [for AVP minimum]
How we'll support you
About us and our teams: Please visit our company website for further information: https://www.db.com/company/company.htm We at DWS are committed to creating a diverse and inclusive workplace, one that embraces dialogue and diverse views, and treats everyone fairly to drive a high-performance culture. The value we create for our clients and investors is based on our ability to bring together various perspectives from all over the world and from different backgrounds. It is our experience that teams perform better and deliver improved outcomes when they are able to incorporate a wide range of perspectives. We call this #ConnectingTheDots.
Posted 4 days ago
3.0 - 7.0 years
13 - 18 Lacs
Pune
Work from Office
About The Role
Job Title: Technical Specialist - Big Data (PySpark) Developer
Location: Pune, India
Role Description: This role is for an Engineer responsible for the design, development, and unit testing of software applications. The candidate is expected to ensure that good quality, maintainable, scalable, and high-performing software applications are delivered to users in an Agile development environment. The candidate should come from a strong technological background and have good working experience in Python and Spark technology. They should be hands-on, able to work independently with minimal technical/tool guidance, and able to technically guide and mentor junior resources in the team. As a developer, you will bring extensive design and development skills to reinforce the group of developers within the team. The candidate will extensively use and apply Continuous Integration tools and practices in the context of Deutsche Bank's digitalization journey.
What we'll offer you
- 100% reimbursement under childcare assistance benefit (gender neutral)
- Sponsorship for industry-relevant certifications and education
- Accident and term life insurance
Your key responsibilities
- Design and discuss your own solution for addressing user stories and tasks.
- Develop, unit-test, integrate, deploy, maintain, and improve software.
- Perform peer code reviews.
- Actively participate in sprint activities and ceremonies, e.g., daily stand-up/scrum meetings, sprint planning, retrospectives, etc.
- Apply continuous integration best practices in general (SCM, build automation, unit testing, dependency management).
- Collaborate with other team members to achieve the sprint objectives.
- Report progress and update Agile team management tools (JIRA/Confluence).
- Manage individual task priorities and deliverables.
- Take responsibility for the quality of the solutions you provide.
- Contribute to planning and continuous improvement activities and support the PO, ITAO, developers, and Scrum Master.
Your skills and experience
- Engineer with good development experience on a Big Data platform for at least 5 years.
- Hands-on experience in Spark (Hive, Impala).
- Hands-on experience in the Python programming language.
- Preferably, experience in BigQuery, Dataproc, Composer, Terraform, GKE, Cloud SQL, and Cloud Functions.
- Experience in the set-up, maintenance, and ongoing development of continuous build/integration infrastructure as part of DevOps.
- Create and maintain fully automated CI build processes and write build and deployment scripts.
- Experience with development platforms: OpenShift/Kubernetes/Docker configuration and deployment with DevOps tools, e.g., GIT, TeamCity, Maven, SONAR.
- Good knowledge of the core SDLC processes and tools such as HP ALM, Jira, Service Now.
- Strong analytical skills and proficient communication skills; fluent in English (written/verbal).
- Ability to work in virtual teams and in matrixed organizations; an excellent team player.
- Open-minded and willing to learn business and technology; keeps pace with technical innovation; understands the relevant business area.
- Ability to share information and transfer knowledge and expertise to team members.
How we'll support you
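Since this role stresses developing and unit-testing PySpark code within a CI pipeline, here is a small, hedged pytest sketch for a Spark transformation. The transformation itself is invented purely for illustration; it is not taken from the posting.

```python
# Hedged example: unit-testing a toy PySpark transformation with pytest.
import pytest
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

def add_full_name(df):
    """Toy transformation: derive full_name from first/last name columns."""
    return df.withColumn("full_name", F.concat_ws(" ", "first_name", "last_name"))

@pytest.fixture(scope="module")
def spark():
    session = SparkSession.builder.master("local[1]").appName("unit-tests").getOrCreate()
    yield session
    session.stop()

def test_add_full_name(spark):
    df = spark.createDataFrame(
        [("Ada", "Lovelace"), ("Alan", "Turing")], ["first_name", "last_name"]
    )
    result = {r.full_name for r in add_full_name(df).collect()}
    assert result == {"Ada Lovelace", "Alan Turing"}
```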
Posted 4 days ago
1.0 - 5.0 years
5 - 9 Lacs
Bengaluru
Work from Office
About The Role
Job Title: SAS CitDev Associate Engineer
Location: Bangalore, India
Corporate Title: Analyst
Role Description: We are seeking a skilled AI engineer to design, develop, and maintain AI-powered chatbots and conversational systems using Dialogflow CX, Vertex AI, and Terraform. The candidate will possess strong programming skills in Python and expertise in deploying scalable AI models and infrastructure through Terraform and Google Cloud Platform (GCP). This role involves collaborating with cross-functional teams to deliver intelligent and automated customer service solutions.
What we'll offer you
- 100% reimbursement under childcare assistance benefit (gender neutral)
- Sponsorship for industry-relevant certifications and education
- Accident and term life insurance
Your key responsibilities
- Design and implement AI-driven chatbots using Dialogflow CX and Vertex AI.
- Develop, test, and deploy conversational flows, intents, entities, and integrations.
- Use Terraform to manage and provision cloud infrastructure for AI models.
- Utilize GCP services to deploy and manage AI models and infrastructure.
- Develop, maintain, and optimize Python scripts for data processing, model training, prompting, and deployment.
- Collaborate with data scientists to integrate ML models within Dialogflow CX.
- Ensure data security, scalability, and reliability of AI systems.
- Monitor and debug issues in chatbot performance and provide timely resolutions.
- Create and maintain technical documentation for AI systems and infrastructure.
Your skills and experience
- Proven experience with Google Cloud Platform (GCP) services, including but not limited to Compute Engine, Cloud Storage, BigQuery, and AI Platform.
- Strong programming skills in Python.
- Experience with Terraform for infrastructure as code.
- Hands-on experience with Dialogflow CX and Vertex AI.
- Familiarity with deploying and managing scalable AI models and infrastructure.
- Excellent problem-solving skills and attention to detail.
- Ability to collaborate effectively with cross-functional teams.
How we'll support you
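As a rough illustration of the Dialogflow CX integration work this role involves, here is a minimal Python sketch that sends a user utterance to an agent and collects the text replies. It assumes the google-cloud-dialogflow-cx client library and an agent in the "global" location; all project, agent, and session IDs are placeholders.

```python
# Minimal sketch, assuming a Dialogflow CX agent already exists.
import uuid
from typing import Optional
from google.cloud import dialogflowcx_v3

PROJECT, LOCATION, AGENT = "my-project", "global", "my-agent-id"  # assumptions

def detect_intent(text: str, session_id: Optional[str] = None) -> str:
    client = dialogflowcx_v3.SessionsClient()  # non-global locations need a regional api_endpoint
    session = client.session_path(PROJECT, LOCATION, AGENT, session_id or uuid.uuid4().hex)
    query_input = dialogflowcx_v3.QueryInput(
        text=dialogflowcx_v3.TextInput(text=text),
        language_code="en",
    )
    response = client.detect_intent(
        request=dialogflowcx_v3.DetectIntentRequest(session=session, query_input=query_input)
    )
    # Concatenate the agent's text responses
    return " ".join(
        " ".join(msg.text.text)
        for msg in response.query_result.response_messages
        if msg.text.text
    )

if __name__ == "__main__":
    print(detect_intent("What are your opening hours?"))
```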
Posted 4 days ago
8.0 - 12.0 years
30 - 35 Lacs
Pune
Work from Office
About The Role
Job Title: Senior Engineer PD, AVP
Location: Pune, India
Role Description
Our team is part of the area Technology, Data, and Innovation (TDI) Private Bank. Within TDI, Partner data is the central client reference data system in Germany. As a core banking system, many banking processes and applications are integrated with it and communicate via more than 2,000 interfaces. From a technical perspective, we focus on the mainframe but also build solutions on-premises and in the cloud, RESTful services, and an Angular frontend. Next to maintenance and the implementation of new CTB requirements, the content focus also lies on the regulatory and tax topics surrounding a partner/client. We are looking for a highly motivated candidate for the Cloud Data Engineer area.
What we'll offer you
100% reimbursement under childcare assistance benefit (gender neutral)
Sponsorship for industry-relevant certifications and education
Accident and term life insurance
Your key responsibilities
- You are responsible for the implementation of the new project on GCP (Spark, Dataproc, Dataflow, BigQuery, Terraform, etc.) across the whole SDLC chain.
- You support the migration of current functionalities to Google Cloud.
- You are responsible for the stability of the application landscape and support software releases.
- You also support L3 topics and application governance.
- You are responsible in the CTM area for coding as part of an agile team (Java, Scala, Spring Boot).
Your skills and experience
- You have experience with databases (BigQuery, Cloud SQL, NoSQL, Hive, etc.) and development, preferably for Big Data and GCP technologies.
- Strong understanding of the Data Mesh approach and integration patterns.
- Understanding of Party data and integration with Product data.
- Your architectural skills for Big Data solutions, especially interface architecture, allow a fast start.
- You have experience in at least Spark, Java, Scala and Python, Maven, Artifactory, the Hadoop ecosystem, GitHub Actions, GitHub, and Terraform scripting.
- You have knowledge of customer reference data, customer opening processes, and preferably regulatory topics around know-your-customer processes.
- You work very well in teams but also independently, and are constructive and target-oriented.
- Your English skills are good and you can communicate both professionally and informally in small talk with the team.
How we'll support you
About us and our teams
Please visit our company website for further information: https://www.db.com/company/company.htm
We strive for a culture in which we are empowered to excel together every day. This includes acting responsibly, thinking commercially, taking initiative and working collaboratively. Together we share and celebrate the successes of our people. Together we are Deutsche Bank Group. We welcome applications from all people and promote a positive, fair and inclusive work environment.
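Purely as an illustration of the GCP stack named above (not part of the posting), a minimal PySpark-on-Dataproc sketch that reads from and writes to BigQuery via the spark-bigquery connector; the project, dataset, and bucket names are placeholders:

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# Assumes the spark-bigquery connector is available on the cluster
# (Dataproc images ship with it); identifiers below are placeholders.
spark = SparkSession.builder.appName("partner-data-migration").getOrCreate()

partners = (
    spark.read.format("bigquery")
    .option("table", "my-project.partner_raw.partners")
    .load()
)

active = (
    partners
    .filter(F.col("status") == "ACTIVE")
    .withColumn("load_ts", F.current_timestamp())
)

(active.write.format("bigquery")
    .option("table", "my-project.partner_curated.partners_active")
    .option("temporaryGcsBucket", "my-temp-bucket")
    .mode("overwrite")
    .save())
```

The surrounding infrastructure (cluster, datasets, buckets) would typically be declared in Terraform and promoted through the SDLC chain mentioned above.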
Posted 4 days ago
8.0 - 13.0 years
32 - 37 Lacs
Bengaluru
Work from Office
About The Role
Job Title: Data Modeler, VP
Location: Bangalore, India
Corporate Title: VP
Role Description
A Passion to Perform. It's what drives us. More than a claim, this describes the way we do business. We're committed to being the best financial services provider in the world, balancing passion with precision to deliver superior solutions for our clients. This is made possible by our people: agile minds, able to see beyond the obvious and act effectively in an ever-changing global business landscape. As you'll discover, our culture supports this. Diverse, international and shaped by a variety of different perspectives, we're driven by a shared sense of purpose. At every level agile thinking is nurtured. And at every level agile minds are rewarded with competitive pay, support and opportunities to excel.
The Office of the CSO - Data Enablement Tribe brings together the Business, Technology and Operational pillars of the Bank to provide information security services to Deutsche Bank. We are responsible for developing, implementing, maintaining and protecting the entire IT and operations infrastructure required to support all of the Bank's businesses.
Overview
Data Enablement is responsible for delivering a comprehensive near-real-time reporting platform covering all CSO controls. The reporting will provide business intelligence on the security posture of all banking applications and portfolios, enabling improved risk management practices.
What we'll offer you
100% reimbursement under childcare assistance benefit (gender neutral)
Sponsorship for industry-relevant certifications and education
Accident and term life insurance
Your key responsibilities
- Produce and maintain a large and complex cloud data warehouse data model according to recognised best practice and standards.
- Capture complex requirements from stakeholders and undertake detailed analysis.
- Design solutions in terms of physical and logical data models.
- Communicate data model designs to the ETL and BI development teams and respond to feedback.
You will have
- Thorough knowledge of BI and cloud data warehouse data modelling best practice.
- Strong knowledge of relational and dimensional data models and experience of data modelling, ideally in a cloud environment.
- Ability to solve complex problems.
- Experience of working in agile delivery environments.
- Awareness of data warehouse architectures and data management best practices.
- Awareness of ETL, database, Big Data and BI presentation layer technologies.
- Experience in using BigQuery and SAP PowerDesigner or Sparx EA.
- Experience with requirements gathering and documentation using a structured approach.
- Ability to write SQL and undertake detailed analysis.
- Experience of working with globally distributed teams.
- Excellent communication skills.
- Some understanding of information security and risk is desirable.
You will be
- Able to work in a fast-paced environment
- Able to deal with sudden changes in priorities
- Open-minded and able to share information
- Able to prioritise effectively
- Able to work with minimal supervision
Your skills and experience
Seniority: Senior (5+ years)
Competencies
- Must have: SQL, Data Warehouse, Data Modelling
- Nice to have: Cloud (especially Google Cloud), Data Analysis, Information Security, Financial Services / Cyber Security, SAP BusinessObjects, and SAP PowerDesigner or Sparx EA
How we'll support you
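By way of illustration only (not part of the posting), a minimal sketch of the kind of analysis query a modeller might run against BigQuery from Python, assuming the google-cloud-bigquery client; the project, dataset, and table names are hypothetical:

```python
from google.cloud import bigquery

client = bigquery.Client()  # uses application-default credentials

# Hypothetical star-schema query: a fact table joined to a dimension for profiling.
sql = """
    SELECT d.control_domain,
           COUNT(*)                     AS findings,
           COUNTIF(f.severity = 'HIGH') AS high_findings
    FROM `my-project.cso_dwh.fact_control_findings` AS f
    JOIN `my-project.cso_dwh.dim_control`           AS d
      ON f.control_key = d.control_key
    GROUP BY d.control_domain
    ORDER BY high_findings DESC
"""

for row in client.query(sql).result():
    print(row.control_domain, row.findings, row.high_findings)
```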
Posted 4 days ago
3.0 - 8.0 years
3 - 7 Lacs
Pune
Work from Office
Project Role: Application Support Engineer
Project Role Description: Act as software detectives, providing a dynamic service that identifies and solves issues within multiple components of critical business systems.
Must have skills: Electronic Medical Records (EMR)
Good to have skills: NA
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years full time education
Summary: As an Application Support Engineer, you will act as a software detective, providing a dynamic service that identifies and resolves issues within various components of critical business systems. Your typical day will involve collaborating with team members to troubleshoot software problems, analyzing system performance, and ensuring that applications run smoothly to support business operations effectively. You will engage with users to understand their challenges and work towards implementing solutions that enhance system functionality and user experience.
Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation/contribution in team discussions is required.
- Contribute to providing solutions to work-related problems.
- Assist in the documentation of processes and procedures to enhance team knowledge.
- Engage with stakeholders to gather requirements and provide feedback on system performance.
Professional & Technical Skills:
- Must-have skills: Proficiency in Electronic Medical Records (EMR).
- Strong analytical skills to diagnose and resolve software issues.
- Experience with troubleshooting and debugging software applications.
- Familiarity with system integration and data flow management.
- Ability to communicate technical information effectively to non-technical users.
Additional Information:
- The candidate should have a minimum of 3 years of experience in Electronic Medical Records (EMR).
- This position is based at our Pune office.
- A 15 years full time education is required.
Qualification: 15 years full time education
Posted 4 days ago
3.0 - 8.0 years
9 - 13 Lacs
Bengaluru
Work from Office
Project Role: Data Platform Engineer
Project Role Description: Assists with the data platform blueprint and design, encompassing the relevant data platform components. Collaborates with the Integration Architects and Data Architects to ensure cohesive integration between systems and data models.
Must have skills: Data Modeling Techniques and Methodologies
Good to have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full time education
Summary: As a Data Platform Engineer, you will assist with the data platform blueprint and design, encompassing the relevant data platform components. Your typical day will involve collaborating with Integration Architects and Data Architects to ensure cohesive integration between systems and data models, while also engaging in discussions to refine and enhance the overall data architecture. You will be involved in various stages of the data platform lifecycle, ensuring that all components work harmoniously to support the organization's data needs and objectives.
Experience:
- Overall IT experience: 7+ years
- Data Modeling experience: 3+ years
- Data Vault Modeling experience: 2+ years
Key Responsibilities:
- Drive discussions with clients' deal teams to understand business requirements and how the data model fits into implementation and solutioning.
- Develop the solution blueprint, scoping, and estimation in delivery projects and solutioning.
- Drive discovery activities and design workshops with the client, and lead strategic road-mapping and operating model design discussions.
- Design and develop Data Vault 2.0-compliant models, including Hubs, Links, and Satellites.
- Design and develop the Raw Data Vault and Business Data Vault.
- Translate business requirements into conceptual, logical, and physical data models.
- Work with source system analysts to understand data structures and lineage.
- Ensure conformance to data modeling standards and best practices.
- Collaborate with ETL/ELT developers to implement data models in a modern data warehouse environment (e.g., Snowflake, Databricks, Redshift, BigQuery).
- Document models, data definitions, and metadata.
Technical Experience:
- 7+ years overall IT experience, 3+ years in Data Modeling, and 2+ years in Data Vault Modeling.
- Design and development of the Raw Data Vault and Business Data Vault.
- Strong understanding of Data Vault 2.0 methodology, including business keys, record tracking, and historical tracking.
- Data modeling experience in Dimensional Modeling/3NF modeling.
- Hands-on experience with data modeling tools (e.g., ER/Studio, ERwin, or similar).
- Solid understanding of ETL/ELT processes, data integration, and warehousing concepts.
- Experience with a modern cloud data platform (e.g., Snowflake, Databricks, Azure Synapse, AWS Redshift, or Google BigQuery).
- Excellent SQL skills.
Good to Have Skills:
- Any one of these add-on skills: Graph Database Modelling, RDF, Document DB Modeling, Ontology, Semantic Data Modeling.
- Hands-on experience in a Data Vault automation tool (e.g., VaultSpeed, WhereScape, biGENIUS-X, dbt, or similar).
- Preferred: understanding of the Data Analytics on Cloud landscape and Data Lake design knowledge.
- Cloud Data Engineering, Cloud Data Integration.
Professional Experience:
- Strong requirement analysis and technical solutioning skills in Data and Analytics.
- Excellent writing, communication and presentation skills.
- Eagerness to learn new skills and develop on an ongoing basis.
- Good client-facing and interpersonal skills.
Educational Qualification:
- B.E. or B.Tech is a must.
Qualification: 15 years full time education
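For illustration only (not part of the posting), a minimal PySpark sketch of loading a Data Vault 2.0 Hub from a source table, with a hashed business key, load timestamp, and record source; all table and column names are hypothetical:

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("hub-customer-load").getOrCreate()

# Hypothetical source of customer business keys.
src = spark.table("staging.customers").select("customer_number").dropDuplicates()

hub_customer = src.select(
    # Hash of the cleansed business key, per common Data Vault practice.
    F.sha2(F.upper(F.trim(F.col("customer_number"))), 256).alias("hub_customer_hk"),
    F.col("customer_number"),
    F.current_timestamp().alias("load_dts"),
    F.lit("CRM_SYSTEM").alias("record_source"),
)

# Append only business keys not already present (simplified insert-if-not-exists).
existing = spark.table("raw_vault.hub_customer").select("hub_customer_hk")
new_keys = hub_customer.join(existing, on="hub_customer_hk", how="left_anti")
new_keys.write.mode("append").saveAsTable("raw_vault.hub_customer")
```

Links and Satellites follow the same pattern, adding relationship hash keys and descriptive attributes with hash diffs respectively.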
Posted 4 days ago
3.0 - 8.0 years
5 - 9 Lacs
Hyderabad
Work from Office
Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must have skills: Python (Programming Language)
Good to have skills: Oracle Procedural Language Extensions to SQL (PLSQL), Google BigQuery, Google Cloud Platform Architecture
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years full time education
Summary: As an Application Developer, you will engage in the design, construction, and configuration of applications tailored to fulfill specific business processes and application requirements. Your typical day will involve collaborating with team members to understand project needs, developing innovative solutions, and ensuring that applications are optimized for performance and usability. You will also participate in testing and debugging processes to guarantee the quality and functionality of the applications you create, while continuously seeking ways to enhance existing systems and processes.
Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation/contribution in team discussions is required.
- Contribute to providing solutions to work-related problems.
- Assist in the documentation of application specifications and user guides.
- Engage in code reviews to ensure adherence to best practices and standards.
Professional & Technical Skills:
- Must-have skills: Proficiency in Python (Programming Language).
- Good-to-have skills: Experience with Oracle Procedural Language Extensions to SQL (PLSQL), Google BigQuery, Google Cloud Platform Architecture.
- Strong understanding of application development methodologies.
- Experience with version control systems such as Git.
- Familiarity with RESTful APIs and web services.
Additional Information:
- The candidate should have a minimum of 3 years of experience in Python (Programming Language).
- This position is based at our Hyderabad office.
- A 15 years full time education is required.
Qualification: 15 years full time education
Posted 4 days ago
2.0 - 5.0 years
5 - 9 Lacs
Hyderabad
Work from Office
Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must have skills: Python (Programming Language)
Good to have skills: Oracle Procedural Language Extensions to SQL (PLSQL), Google BigQuery, Google Cloud Platform Architecture
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full time education
Summary: As an Application Developer, you will engage in the design, construction, and configuration of applications tailored to fulfill specific business processes and application requirements. Your typical day will involve collaborating with various stakeholders to gather requirements, developing application features, and ensuring that the applications function seamlessly within the existing infrastructure. You will also participate in testing and debugging processes to enhance application performance and user experience, while continuously seeking opportunities for improvement and innovation in application development.
Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Mentor junior team members to enhance their skills and knowledge in application development.
Professional & Technical Skills:
- Must-have skills: Proficiency in Python (Programming Language).
- Good-to-have skills: Experience with Oracle Procedural Language Extensions to SQL (PLSQL), Google BigQuery, Google Cloud Platform Architecture.
- Strong understanding of application development methodologies and best practices.
- Experience with version control systems such as Git.
- Familiarity with software testing frameworks and debugging tools.
Additional Information:
- The candidate should have a minimum of 5 years of experience in Python (Programming Language).
- This position is based at our Hyderabad office.
- A 15 years full time education is required.
Qualification: 15 years full time education
Posted 4 days ago