12.0 - 17.0 years
12 - 17 Lacs
Hyderabad
Work from Office
Responsibilities:
- Design and develop conceptual, logical, and physical data models for enterprise and application-level databases.
- Translate business requirements into well-structured data models that support analytics, reporting, and operational systems.
- Define and maintain data standards, naming conventions, and metadata for consistency across systems.
- Collaborate with data architects, engineers, and analysts to implement models in databases and data warehouses/lakes.
- Analyze existing data systems and recommend optimizations, refactoring, and improvements.
- Create entity-relationship diagrams (ERDs) and data flow diagrams to document data structures and relationships.
- Support data governance initiatives, including data lineage, quality, and cataloging.
- Review and validate data models with business and technical stakeholders.
- Provide guidance on normalization, denormalization, and performance tuning of database designs.
- Ensure models comply with organizational data policies, security, and regulatory requirements.
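For illustration, here is a minimal sketch of the kind of work the role describes: translating a logical model (Customer 1-to-many Order) into a physical schema with snake_case naming conventions, explicit keys, and an index to support joins. Table and column names are invented for this example, not taken from any specific system.

```python
import sqlite3

# A logical Customer 1..N Order model rendered as a physical schema.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE customer (
        customer_id   INTEGER PRIMARY KEY,
        customer_name TEXT NOT NULL,
        created_at    TEXT NOT NULL DEFAULT CURRENT_TIMESTAMP
    );
    CREATE TABLE customer_order (
        order_id    INTEGER PRIMARY KEY,
        customer_id INTEGER NOT NULL REFERENCES customer(customer_id),
        order_total REAL NOT NULL,
        ordered_at  TEXT NOT NULL
    );
    -- Index the foreign key to support the joins reporting queries rely on.
    CREATE INDEX ix_customer_order_customer_id ON customer_order(customer_id);
""")
```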
Posted 2 months ago
12.0 - 17.0 years
4 - 8 Lacs
Hyderabad
Work from Office
Responsibilities:
- Design and develop conceptual, logical, and physical data models for enterprise and application-level databases.
- Translate business requirements into well-structured data models that support analytics, reporting, and operational systems.
- Define and maintain data standards, naming conventions, and metadata for consistency across systems.
- Collaborate with data architects, engineers, and analysts to implement models in databases and data warehouses/lakes.
- Analyze existing data systems and recommend optimizations, refactoring, and improvements.
- Create entity-relationship diagrams (ERDs) and data flow diagrams to document data structures and relationships.
- Support data governance initiatives, including data lineage, quality, and cataloging.
- Review and validate data models with business and technical stakeholders.
- Provide guidance on normalization, denormalization, and performance tuning of database designs.
- Ensure models comply with organizational data policies, security, and regulatory requirements.
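Complementing the schema sketch above, here is a minimal, self-contained example of the normalization/denormalization trade-off this role advises on: building a denormalized reporting table from a normalized schema, trading redundancy for cheaper reads. All names and values are illustrative.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE customer (customer_id INTEGER PRIMARY KEY, customer_name TEXT);
    CREATE TABLE customer_order (
        order_id INTEGER PRIMARY KEY,
        customer_id INTEGER REFERENCES customer(customer_id),
        order_total REAL
    );
""")
conn.executemany("INSERT INTO customer VALUES (?, ?)", [(1, "Acme"), (2, "Globex")])
conn.executemany("INSERT INTO customer_order VALUES (?, ?, ?)",
                 [(10, 1, 250.0), (11, 1, 100.0), (12, 2, 75.0)])

# Denormalized rollup: one row per customer, pre-joined and pre-aggregated,
# so reporting queries avoid the join entirely.
conn.executescript("""
    CREATE TABLE customer_order_summary AS
    SELECT c.customer_id, c.customer_name,
           COUNT(o.order_id)  AS order_count,
           SUM(o.order_total) AS lifetime_value
    FROM customer c LEFT JOIN customer_order o USING (customer_id)
    GROUP BY c.customer_id, c.customer_name;
""")
print(conn.execute("SELECT * FROM customer_order_summary").fetchall())
```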
Posted 2 months ago
5.0 - 7.0 years
13 - 17 Lacs
Hyderabad
Work from Office
Skilled in multiple GCP services: GCS, BigQuery, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflows, Composer, Error Reporting, Logs Explorer, etc. Must have Python and SQL work experience; proactive, collaborative, and able to respond to critical situations. Able to analyze data against functional business requirements and interface directly with customers.
Required education: Bachelor's Degree
Preferred education: Master's Degree
Required technical and professional expertise:
- 5 to 7 years of relevant experience as a technical analyst with BigQuery on the GCP platform.
- Skilled in multiple GCP services: GCS, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflows, Composer, Error Reporting, Logs Explorer.
- Enjoys collaborative environments that use agile methodologies to encourage creative design thinking and find innovative ways to develop with cutting-edge technologies.
- Ambitious individual who can work under their own direction towards agreed targets/goals, with a creative approach to work.
Preferred technical and professional experience:
- Intuitive individual with an ability to manage change and proven time management.
- Proven interpersonal skills while contributing to team effort by accomplishing related results as needed.
- Keeps technical knowledge up to date by attending educational workshops and reviewing publications.
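As a flavor of the day-to-day work described, here is a minimal sketch of querying BigQuery from Python. It assumes application default credentials are configured in the environment; the project and table names are placeholders.

```python
from google.cloud import bigquery  # pip install google-cloud-bigquery

client = bigquery.Client(project="my-gcp-project")

query = """
    SELECT order_date, SUM(order_total) AS daily_revenue
    FROM `my-gcp-project.sales.orders`
    GROUP BY order_date
    ORDER BY order_date DESC
    LIMIT 30
"""
# result() blocks until the query job finishes, then yields rows.
for row in client.query(query).result():
    print(row.order_date, row.daily_revenue)
```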
Posted 2 months ago
5.0 - 7.0 years
13 - 17 Lacs
Bengaluru
Work from Office
Skilled in multiple GCP services: GCS, BigQuery, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflows, Composer, Error Reporting, Logs Explorer, etc. Must have Python and SQL work experience; proactive, collaborative, and able to respond to critical situations. Able to analyze data against functional business requirements and interface directly with customers.
Required education: Bachelor's Degree
Preferred education: Master's Degree
Required technical and professional expertise:
- 5 to 7 years of relevant experience as a technical analyst with BigQuery on the GCP platform.
- Skilled in multiple GCP services: GCS, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflows, Composer, Error Reporting, Logs Explorer.
- Enjoys collaborative environments that use agile methodologies to encourage creative design thinking and find innovative ways to develop with cutting-edge technologies.
- Ambitious individual who can work under their own direction towards agreed targets/goals, with a creative approach to work.
Preferred technical and professional experience:
- Intuitive individual with an ability to manage change and proven time management.
- Proven interpersonal skills while contributing to team effort by accomplishing related results as needed.
- Keeps technical knowledge up to date by attending educational workshops and reviewing publications.
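To complement the BigQuery sketch under the sister posting above, here is a minimal sketch of publishing an event to Pub/Sub, another of the services this role lists. Project and topic IDs are placeholders; credentials are assumed to come from the environment.

```python
from google.cloud import pubsub_v1  # pip install google-cloud-pubsub

publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path("my-gcp-project", "order-events")

# Message payload is bytes; extra keyword args become string attributes.
future = publisher.publish(topic_path, b'{"order_id": 42}', source="checkout")
print("published message id:", future.result())  # blocks until the server acks
```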
Posted 2 months ago
15.0 - 20.0 years
4 - 8 Lacs
Navi Mumbai
Work from Office
Project Role: Data Engineer
Project Role Description: Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems.
Must have skills: Data Modeling Techniques and Methodologies
Good to have skills: NA
Minimum 7.5 year(s) of experience is required
Educational Qualification: 15 years full time education
Summary: As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to effectively migrate and deploy data across various systems. You will collaborate with cross-functional teams to understand data requirements and deliver solutions that meet business needs, while also troubleshooting any issues that arise in the data flow.
Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Mentor junior team members to enhance their skills and knowledge in data engineering.
- Continuously evaluate and improve data processes to enhance efficiency and effectiveness.
Professional & Technical Skills:
- Must-have skills: proficiency in Data Modeling Techniques and Methodologies.
- Good-to-have skills: experience with data warehousing solutions.
- Strong understanding of ETL processes and tools.
- Familiarity with data governance and data quality frameworks.
- Experience in programming languages such as Python or SQL for data manipulation.
Additional Information:
- The candidate should have a minimum of 7.5 years of experience in Data Modeling Techniques and Methodologies.
- This position is based in Mumbai.
- 15 years of full-time education is required.
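For illustration of the ETL work this role centers on, here is a minimal, self-contained extract-transform-load sketch using only the Python standard library. The file layout and column names are assumptions for the example.

```python
import csv
import sqlite3

def run_etl(csv_path: str, conn: sqlite3.Connection) -> int:
    """Extract rows from a CSV, transform/validate them, load into a table."""
    conn.execute("CREATE TABLE IF NOT EXISTS sales (region TEXT, amount REAL)")
    loaded = 0
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            # Transform: normalize casing, skip rows that fail validation.
            try:
                amount = float(row["amount"])
            except (KeyError, ValueError):
                continue  # a real pipeline would route these to a dead-letter file
            conn.execute("INSERT INTO sales VALUES (?, ?)",
                         (row["region"].strip().upper(), amount))
            loaded += 1
    conn.commit()
    return loaded
```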
Posted 2 months ago
5.0 - 10.0 years
7 - 12 Lacs
Mumbai
Work from Office
The candidate must possess knowledge relevant to the functional area, act as a subject matter expert in providing advice in the area of expertise, and focus on continuous improvement for maximum efficiency. It is vital to maintain a high standard of delivery excellence, provide top-notch service quality, and develop successful long-term business partnerships with internal/external customers by identifying and fulfilling customer needs. He/she should be able to break down complex problems into logical and manageable parts in a systematic way, generate and compare multiple options, and set priorities to resolve problems. The ideal candidate must be proactive and go beyond expectations to achieve job results and create new opportunities. He/she must positively influence the team, motivate high performance, promote a friendly climate, give constructive feedback, provide development opportunities, and manage the career aspirations of direct reports. Communication skills are key here: to explain organizational objectives, assignments, and the big picture to the team, and to articulate the team vision and clear objectives.
Senior Process Manager roles and responsibilities:
- Collaborate with stakeholders to gather and analyze business requirements.
- Utilize data skills to extract, transform, and analyze data from various sources.
- Interpret data to identify trends, patterns, and insights.
- Generate comprehensive reports to present findings to stakeholders.
- Document business processes, data flows, and requirements.
- Assist in the development and implementation of data-driven solutions.
- Conduct ad-hoc analysis as required to support business initiatives.
Technical and functional skills:
- Bachelor's degree with 5+ years of experience, including 3+ years of hands-on experience as a Business Analyst or in a similar role.
- Strong data skills with the ability to manipulate and analyze complex datasets.
- Proficiency in interpreting data and translating findings into actionable insights.
- Experience with report generation and data visualization tools.
- Solid understanding of business processes and data flows.
- Excellent communication and presentation skills.
- Ability to work independently and collaboratively in a team environment.
- Basic understanding of Google Cloud Platform (GCP), Tableau, SQL, and Python is a plus.
- Certification in Business Analysis or a related field.
- Familiarity with Google Cloud Platform (GCP) services and tools.
- Experience with Tableau for data visualization.
- Proficiency in SQL for data querying and manipulation.
- Basic knowledge of Python for data analysis and automation.
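As a flavor of the extract-analyze-report loop this role describes, here is a minimal pandas sketch. The source file and column names are assumptions for illustration.

```python
import pandas as pd  # pip install pandas

# Extract: load transactional data.
df = pd.read_csv("transactions.csv", parse_dates=["txn_date"])

# Analyze: identify a monthly revenue trend per product line.
monthly = (df.assign(month=df["txn_date"].dt.to_period("M"))
             .groupby(["month", "product_line"], as_index=False)["amount"]
             .sum())

# Report: export findings for stakeholders.
monthly.to_csv("monthly_revenue_report.csv", index=False)
print(monthly.tail())
```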
Posted 2 months ago
4.0 - 9.0 years
0 - 2 Lacs
Hyderabad, Chennai, Bengaluru
Work from Office
At least 3 years of experience in SAC and Datasphere development and security implementation. Experience in developing and maintaining SAP Datasphere tables, views, Intelligent Lookup, Data Flow, Task Chain, etc. Experience in building SAP SAC data-driven reports. Hands-on experience in creating data analytics reports using SAP Analytics Cloud. Experience in SAP BTP for BW: CDS Views, routines, function modules, user exits, BAdI implementation, etc. Experience in SAP Analytics Cloud (SAC): SAC Story, Model, Teams, Roles, etc. Good exposure to SAP BW/4HANA and BI/SAC reporting: ADSO, Composite Provider, ODP BW extractors, Open ODS View, BW Query, WebI, etc.
Posted 2 months ago
6.0 - 10.0 years
15 - 20 Lacs
Hyderabad
Work from Office
Develop, optimize, and maintain scalable data pipelines using Python and PySpark. Design and implement data processing workflows leveraging GCP services such as BigQuery, Dataflow, Cloud Functions, and Cloud Storage.
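For illustration, here is a minimal PySpark pipeline sketch of the read-transform-aggregate-write shape this role describes. The gs:// paths are placeholders and assume a cluster configured with the GCS connector.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders-pipeline").getOrCreate()

# Read raw CSVs from a landing bucket.
orders = (spark.read.option("header", True)
          .csv("gs://my-bucket/raw/orders/*.csv"))

# Transform and aggregate: daily revenue.
daily = (orders
         .withColumn("order_total", F.col("order_total").cast("double"))
         .groupBy("order_date")
         .agg(F.sum("order_total").alias("daily_revenue")))

# Write the curated output as Parquet.
daily.write.mode("overwrite").parquet("gs://my-bucket/curated/daily_revenue/")
```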
Posted 2 months ago
5.0 - 6.0 years
7 - 8 Lacs
Kolkata
Work from Office
Design, build, and maintain data pipelines on Google Cloud Platform, using tools like BigQuery and Dataflow. Focus on optimizing data storage, processing, and analytics to support business intelligence initiatives.
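As a sketch of the storage-to-warehouse optimization work described, here is a minimal batch load from Cloud Storage into BigQuery. All resource names are placeholders; credentials are assumed to come from the environment.

```python
from google.cloud import bigquery  # pip install google-cloud-bigquery

client = bigquery.Client(project="my-gcp-project")

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    skip_leading_rows=1,
    autodetect=True,  # let BigQuery infer the schema for this sketch
)
load_job = client.load_table_from_uri(
    "gs://my-bucket/raw/orders/*.csv",
    "my-gcp-project.sales.orders",
    job_config=job_config,
)
load_job.result()  # wait for the load job to complete
print("loaded rows:", client.get_table("my-gcp-project.sales.orders").num_rows)
```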
Posted 2 months ago
4.0 - 5.0 years
6 - 7 Lacs
Hyderabad
Work from Office
Analyze and manage technical aspects of data systems, ensuring accurate data flow and integrity. Work closely with data engineers to support data-driven initiatives.
Posted 2 months ago
6.0 - 10.0 years
14 - 19 Lacs
Coimbatore
Work from Office
We are seeking a Senior Data & AI/ML Engineer with deep expertise in GCP, who will not only build intelligent and scalable data solutions but also champion our internal capability building and partner-level excellence. This is a high-impact role for a seasoned engineer who thrives in designing GCP-native AI/ML-enabled data platforms. You'll play a dual role as a hands-on technical lead and a strategic enabler, helping drive our Google Cloud Data & AI/ML specialization track forward through successful implementations, reusable assets, and internal skill development.
Preferred Qualifications:
- GCP Professional Certifications: Data Engineer or Machine Learning Engineer.
- Experience contributing to a GCP Partner specialization journey.
- Familiarity with Looker, Data Catalog, Dataform, or other GCP data ecosystem tools.
- Knowledge of data privacy, model explainability, and AI governance is a plus.
Work Location: Remote
Key Responsibilities:
Data & AI/ML Architecture
- Design and implement data architectures for real-time and batch pipelines, leveraging GCP services such as BigQuery, Dataflow, Dataproc, Pub/Sub, Vertex AI, and Cloud Storage.
- Lead the development of ML pipelines, from feature engineering to model training and deployment, using Vertex AI, AI Platform, and Kubeflow Pipelines.
- Collaborate with data scientists to operationalize ML models and support MLOps practices using Cloud Functions, CI/CD, and Model Registry.
- Define and implement data governance, lineage, monitoring, and quality frameworks.
Google Cloud Partner Enablement
- Build and document GCP-native solutions and architectures that can be used for case studies and specialization submissions.
- Lead client-facing PoCs or MVPs to showcase AI/ML capabilities using GCP.
- Contribute to building repeatable solution accelerators in Data & AI/ML.
- Work with the leadership team to align with Google Cloud Partner Program metrics.
Team Development
- Mentor engineers and data scientists toward achieving GCP certifications, especially in Data Engineering and Machine Learning.
- Organize and lead internal GCP AI/ML enablement sessions.
- Represent the company in Google partner ecosystem events, tech talks, and joint GTM engagements.
What We Offer:
- Best-in-class packages.
- Paid holidays and flexible time-off policies.
- Casual dress code and a flexible working environment.
- Opportunities for professional development in an engaging, fast-paced environment.
- Medical insurance covering self and family up to 4 lakhs per person.
- Diverse and multicultural work environment.
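As a heavily hedged sketch of the Vertex AI training work this role covers: launching a custom training job from Python. The project, bucket, training script, and prebuilt container URI are all assumptions; the container URI in particular should be checked against current Vertex AI documentation.

```python
from google.cloud import aiplatform  # pip install google-cloud-aiplatform

aiplatform.init(
    project="my-gcp-project",           # placeholder project
    location="us-central1",
    staging_bucket="gs://my-staging-bucket",
)

job = aiplatform.CustomTrainingJob(
    display_name="churn-model-training",
    script_path="train.py",             # your training entrypoint (assumed)
    # Prebuilt sklearn CPU training image; verify the exact tag before use.
    container_uri="us-docker.pkg.dev/vertex-ai/training/sklearn-cpu.1-0:latest",
)
# Registering a servable Model additionally requires a serving container URI;
# this minimal run just executes the training script on managed hardware.
job.run(machine_type="n1-standard-4", replica_count=1)
```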
Posted 2 months ago
15.0 - 20.0 years
4 - 8 Lacs
Bengaluru
Work from Office
Project Role: Data Engineer
Project Role Description: Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems.
Must have skills: Snowflake Data Warehouse
Good to have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full time education
Summary: As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to migrate and deploy data across various systems. You will collaborate with cross-functional teams to understand data requirements and deliver effective solutions that meet business needs, while also troubleshooting any issues that arise in the data flow and processing stages.
Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Mentor junior team members to enhance their skills and knowledge in data engineering.
- Continuously evaluate and improve data processes to enhance efficiency and effectiveness.
Professional & Technical Skills:
- Must-have skills: proficiency in Snowflake Data Warehouse.
- Good-to-have skills: experience with data modeling and database design.
- Strong understanding of ETL processes and data integration techniques.
- Familiarity with cloud platforms and services related to data storage and processing.
- Experience in performance tuning and optimization of data queries.
Additional Information:
- The candidate should have a minimum of 5 years of experience in Snowflake Data Warehouse.
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.
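For illustration, here is a minimal sketch of connecting to Snowflake from Python. The connection parameters are placeholders; in practice they would come from a secrets manager rather than source code.

```python
import snowflake.connector  # pip install snowflake-connector-python

conn = snowflake.connector.connect(
    account="my_account",      # placeholder account identifier
    user="etl_user",
    password="...",            # never hard-code real credentials
    warehouse="COMPUTE_WH",
    database="ANALYTICS",
    schema="PUBLIC",
)
try:
    cur = conn.cursor()
    cur.execute("SELECT CURRENT_VERSION()")  # simple connectivity check
    print(cur.fetchone())
finally:
    conn.close()
```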
Posted 2 months ago
6.0 - 11.0 years
9 - 13 Lacs
Hyderabad
Work from Office
Experienced GCP data engineer: BigQuery, SQL, Python, Talend ETL programming, on GCP or any cloud technology. Good experience building pipelines of GCP components to load data into BigQuery and into Cloud Storage buckets. Excellent data analysis skills. Good written and oral communication skills. Self-motivated and able to work independently.
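As a sketch of the Cloud Storage side of this pipeline work, here is a minimal upload of a local file into a bucket ahead of a BigQuery load. Bucket, object, and project names are placeholders.

```python
from google.cloud import storage  # pip install google-cloud-storage

client = storage.Client(project="my-gcp-project")
bucket = client.bucket("my-landing-bucket")

# Stage the raw file where a downstream BigQuery load job can pick it up.
blob = bucket.blob("raw/orders/orders_2024-01-01.csv")
blob.upload_from_filename("orders_2024-01-01.csv")
print(f"uploaded to gs://{bucket.name}/{blob.name}")
```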
Posted 2 months ago
6.0 - 11.0 years
9 - 14 Lacs
Hyderabad
Work from Office
1. Oracle Transportation Management (OTM) Cloud techno-functional consultant with 8 years of expert domain knowledge covering planning, execution, and settlement. She/he must have been part of at least 1-3 end-to-end OTM implementations.
2. In-depth understanding of OTM Cloud business processes and their data flow.
3. The candidate should have been in client-facing roles and interacted with customers in requirement-gathering workshops, design, configuration, testing, and go-live.
4. Should have strong written and verbal communication skills, personal drive, flexibility, team play, problem-solving, continuous improvement, and client management.
5. Assist in identifying, assessing, and resolving complex functional issues/problems. Interact with the client frequently around specific work efforts/deliverables.
6. Configuring and managing technical activities.
7. Configuration of agents, SQL, and VPD profiles, and execution as per requirements.
8. Identifying areas of improvement and recommending process modifications to enhance operational efficiencies.
9. Configuration of agents, milestones, rates, locations, and itineraries.
10. Managed integration of OTM with ERPs like SAP/Oracle.
Posted 2 months ago
3.0 - 8.0 years
5 - 10 Lacs
Bengaluru
Work from Office
Capgemini Invent
Capgemini Invent is the digital innovation, consulting and transformation brand of the Capgemini Group, a global business line that combines market-leading expertise in strategy, technology, data science and creative design, to help CxOs envision and build what's next for their businesses.
Your Role:
- Solution Design & Architecture
- Implementation & Deployment
- Technical Leadership & Guidance
- Client Engagement & Collaboration
- Performance Monitoring & Optimization
Your Profile:
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
- 3-8 years of experience in designing, implementing, and managing data solutions.
- 3-8 years of hands-on experience working with Google Cloud Platform (GCP) data services.
- Strong expertise in core GCP data services, including BigQuery (data warehousing), Cloud Storage (data lake), Dataflow (ETL/ELT), Cloud Composer (workflow orchestration - Apache Airflow), Pub/Sub and Dataflow (streaming data), Cloud Data Fusion (graphical data integration), and Dataproc (managed Hadoop and Spark).
- Proficiency in SQL and experience with data modeling techniques.
- Experience with at least one programming language (e.g., Python, Java, Scala).
- Experience with Infrastructure-as-Code (IaC) tools such as Terraform or Cloud Deployment Manager.
- Understanding of data governance, security, and compliance principles in a cloud environment.
- Experience with CI/CD pipelines and DevOps practices.
- Excellent problem-solving, analytical, and communication skills.
- Ability to work independently and as part of a collaborative team.
What you will love about working here: We recognize the significance of flexible work arrangements to provide support. Be it remote work or flexible work hours, you will get an environment to maintain a healthy work-life balance. At the heart of our mission is your career growth. Our array of career growth programs and diverse professions are crafted to support you in exploring a world of opportunities. Equip yourself with valuable certifications in the latest technologies such as Generative AI.
About Capgemini: Capgemini is a global business and technology transformation partner, helping organizations to accelerate their dual transition to a digital and sustainable world, while creating tangible impact for enterprises and society. It is a responsible and diverse group of 340,000 team members in more than 50 countries. With its strong over 55-year heritage, Capgemini is trusted by its clients to unlock the value of technology to address the entire breadth of their business needs. It delivers end-to-end services and solutions leveraging strengths from strategy and design to engineering, all fueled by its market-leading capabilities in AI, cloud and data, combined with its deep industry expertise and partner ecosystem. The Group reported 2023 global revenues of €22.5 billion.
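For illustration of the Cloud Composer orchestration work named in the profile, here is a minimal Apache Airflow DAG sketch wiring an extract step before a load step. The task logic and identifiers are illustrative only.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("pull from source")  # placeholder for real extraction logic

def load():
    print("write to BigQuery")  # placeholder for real load logic

with DAG(
    dag_id="daily_orders_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)
    extract_task >> load_task  # ordering: extract runs before load
```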
Posted 2 months ago
8.0 - 13.0 years
6 - 10 Lacs
Hyderabad
Work from Office
Skills: Extensive experience with Google Data Products (Cloud Data Fusion, BigQuery, Dataflow, Dataproc, AI Building Blocks, Looker, Dataprep, etc.). Expertise in Cloud Data Fusion, BigQuery & Dataproc. Experience in MDM, metadata management, data quality, and data lineage tools. E2E data engineering and lifecycle (including non-functional requirements and operations) management. Experience with SQL and NoSQL modern data stores. E2E solution design skills: prototyping, usability testing, and data visualization literacy. Excellent knowledge of the software development life cycle.
Posted 2 months ago
0.0 years
0 - 0 Lacs
Hyderabad
Work from Office
Fresher Opportunities (0-1 Year) at Seanergy.ai
Location: Hyderabad (On-site) | Experience: 0-1 Year
We're hiring fresh minds with strong foundational skills and a passion for learning! If you're a recent graduate looking to kickstart your career in a tech-driven environment, check out our openings below:
1. React Developer (Fresher) - Build dynamic web applications using React and modern UI tools.
Must-Have: React.js basics (components, state, props, hooks); JavaScript, HTML, CSS; Git version control; understanding of responsive design.
Bonus: Next.js, Tailwind CSS, Figma, API integration, GitHub portfolio.
2. AI/ML Engineer (Fresher) - Work on ML models, data preprocessing, and API integrations.
Must-Have: Strong Python fundamentals; libraries: scikit-learn, pandas, NumPy; basic understanding of ML algorithms and model evaluation; exposure to OOP concepts and REST APIs.
Bonus: FastAPI, HuggingFace, LangChain, cloud deployment (Azure/GCP), GitHub projects.
3. Data Analyst (Fresher) - Analyze data, build reports, and provide business insights.
Must-Have: SQL (joins, aggregations, filtering); Excel/Google Sheets proficiency; basic Power BI or Tableau knowledge; exposure to Microsoft Fabric components (Dataflows, Lakehouse, Datasets); data cleaning using Python or Excel.
Bonus: DAX basics, pandas, statistics, storytelling with data.
4. QA Tester (Fresher) - Ensure quality of web apps and data platforms through manual testing.
Must-Have: Knowledge of SDLC/STLC; manual testing fundamentals (test cases, bug reports); basic SQL for data validation; understanding of web elements and browser dev tools.
Bonus: API testing via Postman, Playwright/Selenium basics, test case writing tools (JIRA/TestRail).
Eligibility Criteria (applicable for all roles):
- Educational qualification: B.E. / B.Tech / MCA (preferably from a Computer Science or related background)
- Strong communication skills and logical thinking
- Passion for continuous learning and career growth
- Personal projects, GitHub portfolios, or relevant internships are a strong plus!
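As a flavor of the fundamentals the AI/ML fresher role asks for, here is a minimal scikit-learn train-and-evaluate sketch on a bundled toy dataset.

```python
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

# Load a toy dataset, hold out 20% for evaluation.
X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42
)

# Fit a simple baseline model and report held-out accuracy.
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print("accuracy:", accuracy_score(y_test, model.predict(X_test)))
```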
Posted 2 months ago
4.0 - 9.0 years
10 - 20 Lacs
Bengaluru
Remote
Job Title: Software Engineer - GCP Data Engineering
Work Mode: Remote
Base Location: Bengaluru
Experience Required: 4 to 6 Years
Job Summary: We are seeking a Software Engineer with a strong background in GCP Data Engineering and a solid understanding of how to build scalable data processing frameworks. The ideal candidate will be proficient in data ingestion, transformation, and orchestration using modern cloud-native tools and technologies. This role requires hands-on experience in designing and optimizing ETL pipelines, managing big data workloads, and supporting data quality initiatives.
Key Responsibilities:
- Design and develop scalable data processing solutions using Apache Beam, Spark, and other modern frameworks.
- Build and manage data pipelines on Google Cloud Platform (GCP) using services like Dataflow, Dataproc, Composer (Airflow), and BigQuery.
- Collaborate with data architects and analysts to understand data models and implement efficient ETL solutions.
- Leverage DevOps and CI/CD best practices for code management, testing, and deployment using tools like GitHub and Cloud Build.
- Ensure data quality, performance tuning, and reliability of data processing systems.
- Work with cross-functional teams to understand business requirements and deliver robust data infrastructure to support analytical use cases.
Required Skills:
- 4 to 6 years of professional experience as a Data Engineer working on cloud platforms, preferably GCP.
- Proficiency in Java and Python with strong problem-solving and analytical skills.
- Hands-on experience with Apache Beam, Apache Spark, Dataflow, Dataproc, Composer (Airflow), and BigQuery.
- Strong understanding of data warehousing concepts and ETL pipeline optimization techniques.
- Experience in cloud-based architectures and DevOps practices.
- Familiarity with version control (GitHub) and CI/CD pipelines.
Preferred Skills:
- Exposure to modern ETL tools and data integration platforms.
- Experience with data governance, data quality frameworks, and metadata management.
- Familiarity with performance tuning in distributed data processing systems.
Tech Stack:
- Cloud: GCP (Dataflow, BigQuery, Dataproc, Composer)
- Programming: Java, Python
- Frameworks: Apache Beam, Apache Spark
- DevOps: GitHub, CI/CD tools, Composer (Airflow)
- ETL/Data Tools: Data ingestion, transformation, and warehousing on GCP
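For illustration of the Apache Beam work this role names, here is a minimal pipeline sketch. With the default DirectRunner it runs locally; passing --runner=DataflowRunner (plus project, region, and temp_location options) sends the same pipeline to Dataflow. Paths are placeholders.

```python
import apache_beam as beam  # pip install apache-beam[gcp]
from apache_beam.options.pipeline_options import PipelineOptions

# In practice, parse real options from sys.argv so the runner is switchable.
options = PipelineOptions()

with beam.Pipeline(options=options) as p:
    (p
     | "Read" >> beam.io.ReadFromText("gs://my-bucket/raw/events.jsonl")
     | "KeepNonEmpty" >> beam.Filter(lambda line: line.strip())
     | "Normalize" >> beam.Map(str.upper)
     | "Write" >> beam.io.WriteToText("gs://my-bucket/curated/events"))
```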
Posted 2 months ago
8.0 - 12.0 years
30 - 35 Lacs
Pune
Work from Office
Job Title: Senior Engineer PD
Location: Pune, India
Role Description: Our team is part of the area Technology, Data, and Innovation (TDI) Private Bank. Within TDI, Partner data is the central client reference data system in Germany. As a core banking system, many banking processes and applications are integrated and communicate via >2k interfaces. From a technical perspective, we focus on mainframe but also build solutions on-premise cloud, RESTful services, and an Angular frontend. Next to the maintenance and the implementation of new CTB requirements, the content focus also lies on the regulatory and tax topics surrounding a partner/client. We are looking for a very motivated candidate for the Cloud Data Engineer area.
What we'll offer you:
- 100% reimbursement under childcare assistance benefit (gender neutral)
- Sponsorship for industry-relevant certifications and education
- Accident and term life insurance
Your key responsibilities:
- You are responsible for the implementation of the new project on GCP (Spark, Dataproc, Dataflow, BigQuery, Terraform, etc.) across the whole SDLC chain
- You are responsible for supporting the migration of current functionalities to Google Cloud
- You are responsible for the stability of the application landscape and support software releases
- You also support L3 topics and application governance
- You are responsible in the CTM area for coding as part of an agile team (Java, Scala, Spring Boot)
Your skills and experience:
- You have experience with databases (BigQuery, Cloud SQL, NoSQL, Hive, etc.) and development, preferably for Big Data and GCP technologies
- Strong understanding of the Data Mesh approach and integration patterns
- Understanding of Party data and integration with Product data
- Your architectural skills for big data solutions, especially interface architecture, allow a fast start
- You have experience in at least Spark, Java, Scala and Python, Maven, Artifactory, the Hadoop ecosystem, GitHub Actions, GitHub, and Terraform scripting
- You have knowledge of customer reference data, customer opening processes, and preferably regulatory topics around know-your-customer processes
- You can work very well in teams but also independently, and are constructive and target-oriented
- Your English skills are good and you can communicate both professionally and informally in small talks with the team
How we'll support you: About us and our teams - please visit our company website for further information: https://www.db.com/company/company.htm We strive for a culture in which we are empowered to excel together every day. This includes acting responsibly, thinking commercially, taking initiative and working collaboratively. Together we share and celebrate the successes of our people. Together we are Deutsche Bank Group. We welcome applications from all people and promote a positive, fair and inclusive work environment.
Posted 2 months ago
10.0 - 15.0 years
19 - 25 Lacs
Pune
Work from Office
Job Title: IT Architect
Location: Pune
Role Description: The Business Architect defines the technical solution design of specific IT platforms and provides guidance to the squad members in order to design, build, test and deliver high-quality software solutions. A key element in this context is translation of functional and non-functional business requirements into an appropriate technical solution design, leveraging best practices and consistent design patterns. The Business Architect collaborates closely with Product Owners, Chapter Leads and Squad members to ensure consistent adherence to the agreed-upon application design and is responsible for maintaining appropriate technical design documentation. The Solution Architect ensures that the architectures and designs of solutions conform to the principles, blueprints, standards, patterns, etc. that have been established by Enterprise Architecture; in this context the Business Architect collaborates closely with the respective Solution Architect to ensure architecture compliance. The Business Architect also actively contributes to the definition and enrichment of design patterns and standards with the aim to leverage those across squads and tribes.
What we'll offer you:
- 100% reimbursement under childcare assistance benefit (gender neutral)
- Sponsorship for industry-relevant certifications and education
- Accident and term life insurance
Your key responsibilities:
- Define the technical architecture of IT solutions in line with functional and non-functional requirements, following consistent design patterns and best practices.
- Ensure that the solution design is in sync with WM target architecture blueprints and principles, as well as with overarching DB architecture and security standards.
- Create appropriate technical design documentation and ensure it is kept up to date.
- Provide guidance to the squad members to design, build, test and deliver high-quality software solutions in line with business requirements.
- Be responsible for all aspects of the solution architecture (i.e. maintainability, scalability, effective integration with other solutions, usage of shared solutions and components where possible, optimization of resource consumption, etc.) with the objective to meet the appropriate balance between business needs and total cost of ownership.
- Closely collaborate with Enterprise Architecture to ensure architecture compliance and make sure that any design options are discussed in a timely manner to allow sufficient time for deliberate decision taking.
- Present architecture proposals to relevant forums along with the Enterprise Architect at different levels and drive the process to gain the necessary architecture approvals.
- Collaborate with relevant technology stakeholders within other squads and across tribes to ensure cross-squad and cross-tribe solution architecture synchronization and alignment.
- Contribute to the definition and enrichment of appropriate design patterns and standards that can be leveraged across WM squads/tribes.
- Serve as a counsel to designers and developers and carry out reviews of software designs and high-level/detailed-level design documentation provided by other squad members.
- Lead technical discussions with CCO, Data Factory, Central Data Quality and Compliance, end-to-end and control functions for technical queries.
- Contribute to peer-level solution architecture reviews, e.g. within a respective chapter.
Your skills and experience:
- Ability/experience in defining high-level and low-level technical solution designs for complex initiatives.
- Very good analytical skills and ability to oversee/structure complex tasks.
- Hands-on skills with various Google Cloud components such as storage buckets, BigQuery, Dataproc, Cloud Composer and Cloud Functions, along with PySpark and Scala, are essential.
- Good to have: experience in Cloud SQL, Dataflow, Java and Unix.
- Experience with implementing a Google Cloud based solution is essential.
- Persuasive power and persistence in driving adherence to the solution design within the squad.
- Ability to apply the appropriate architectural patterns considering the relevant functional and non-functional requirements.
- Proven ability to balance business demands and IT capabilities in terms of standardization, reducing risk and increasing IT flexibility.
- Comfortable working in an open, highly collaborative team.
- Ability to work in an agile and dynamic environment and to build up knowledge related to new technology/solutions in an effective and timely manner.
- Ability to communicate effectively with other technology stakeholders.
- Feedback: seeks feedback from others, provides feedback to others in support of their development, and is open and honest while dealing constructively with criticism.
- Inclusive leadership: values individuals and embraces diversity by integrating differences and promoting diversity and inclusion across teams and functions.
- Coaching: understands and anticipates people's needs, skills and abilities in order to coach, motivate and empower them for success.
- Broad set of architecture knowledge and application design skills and, depending on the specific squad requirements, in-depth expertise with regard to specific architecture domains (e.g. service and integration architecture, web and mobile front-end architecture, data architecture, security architecture, infrastructure architecture) and related technology stacks and design patterns.
- Experience in establishing thought leadership in solution architecture practices and ability to lead design and development teams in defining, building and delivering first-class software solutions.
- Familiar with current and emerging technologies, tools, frameworks and design patterns.
- Experience in effectively collaborating across multiple teams and geographies.
- Ability to appropriately consider other dimensions (e.g. financials, risk, time to market) on top of the architecture drivers in order to propose balanced technical architecture solutions.
Experience/Qualifications:
- 10+ years relevant experience as a technology manager within the IT support industry; experience in the financial/banking industry preferred.
- Minimum 8 years' experience supporting the Oracle platform in a mid-size to large corporate environment.
- Preferably from a banking/wealth management background.
- Must have experience working in an agile organization.
How we'll support you
Posted 2 months ago
4.0 - 8.0 years
10 - 15 Lacs
Pune
Work from Office
Job Title: GCP - Senior Engineer - PD
Location: Pune
Role Description: Our team is part of the area Technology, Data, and Innovation (TDI) Private Bank. Within TDI, Partner data is the central client reference data system in Germany. As a core banking system, many banking processes and applications are integrated and communicate via >2k interfaces. From a technical perspective, we focus on mainframe but also build solutions on-premise cloud, RESTful services, and an Angular frontend. Next to the maintenance and the implementation of new CTB requirements, the content focus also lies on the regulatory and tax topics surrounding a partner/client. We are looking for a very motivated candidate for the Big Data and Google Cloud area.
What we'll offer you: As part of our flexible scheme, here are just some of the benefits that you'll enjoy:
- 100% reimbursement under childcare assistance benefit (gender neutral)
- Sponsorship for industry-relevant certifications and education
- Accident and term life insurance
Your key responsibilities:
- You are responsible for the implementation of the new project on GCP (Spark, Dataproc, Dataflow, BigQuery, Terraform, etc.) across the whole SDLC chain
- You are responsible for supporting the migration of current functionalities to Google Cloud
- You are responsible for the stability of the application landscape and support software releases
- You also support L3 topics and application governance
- You are responsible in the CTM area for coding as part of an agile team (Java, Scala, Spring Boot)
Your skills and experience:
- You have experience with databases (HDFS, BigQuery, etc.) and development, preferably for Big Data and GCP technologies
- Strong understanding of the Data Mesh approach and integration patterns
- Understanding of Party data and integration with Product data
- Your architectural skills for big data solutions, especially interface architecture, allow a fast start
- You have experience in at least Spark, Java, Scala and Python, Maven, Artifactory, the Hadoop ecosystem, GitHub Actions, GitHub, and Terraform scripting
- You have knowledge of customer reference data, customer opening processes, and preferably regulatory topics around know-your-customer processes
- You can work very well in teams but also independently, and are constructive and target-oriented
- Your English skills are good and you can communicate both professionally and informally in small talks with the team
How we'll support you
Posted 2 months ago
3.0 - 7.0 years
10 - 14 Lacs
Pune
Work from Office
Job Title: GCP Data Engineer, AS
Location: Pune, India
Role Description: An Engineer is responsible for designing and developing entire engineering solutions to accomplish business goals. Key responsibilities of this role include ensuring that solutions are well architected, with maintainability and ease of testing built in from the outset, and that they can be integrated successfully into the end-to-end business process flow. They will have gained significant experience through multiple implementations and have begun to develop both depth and breadth in several engineering competencies. They have extensive knowledge of design and architectural patterns. They will provide engineering thought leadership within their teams and will play a role in mentoring and coaching of less experienced engineers.
What we'll offer you:
- 100% reimbursement under childcare assistance benefit (gender neutral)
- Sponsorship for industry-relevant certifications and education
- Accident and term life insurance
Your key responsibilities:
- Design, develop and maintain data pipelines using the Python and SQL programming languages on GCP.
- Apply experience in Agile methodologies, ETL, ELT, data movement and data processing.
- Work with Cloud Composer to manage and process batch data jobs efficiently.
- Develop and optimize complex SQL queries for data analysis, extraction, and transformation.
- Develop and deploy Google Cloud services using Terraform.
- Implement CI/CD pipelines using GitHub Actions.
- Consume and host REST APIs using Python (see the sketch below).
- Monitor and troubleshoot data pipelines, resolving any issues in a timely manner.
- Ensure team collaboration using Jira, Confluence, and other tools.
- Quickly learn new and existing technologies; strong problem-solving skills.
- Write advanced SQL and Python scripts.
- Certification as a Google Cloud Professional Data Engineer is an added advantage.
Your skills and experience:
- 6+ years of IT experience as a hands-on technologist.
- Proficient in Python for data engineering.
- Proficient in SQL.
- Hands-on experience with GCP Cloud Composer, Dataflow, BigQuery, Cloud Functions and Cloud Run; GKE is good to have.
- Hands-on experience in REST API hosting and consumption.
- Proficient in Terraform (HashiCorp).
- Experienced with GitHub and GitHub Actions.
- Experienced in CI/CD.
- Experience in automating ETL testing using Python and SQL.
- Good to have: Apigee.
- Good to have: Bitbucket.
How we'll support you
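Here is a minimal sketch of the REST API consumption mentioned in the responsibilities, with retries omitted for brevity. The endpoint and response shape are assumptions for illustration.

```python
import requests  # pip install requests

def fetch_orders(base_url: str, page: int = 1) -> list[dict]:
    """Fetch one page of orders from a hypothetical REST API."""
    resp = requests.get(f"{base_url}/orders", params={"page": page}, timeout=10)
    resp.raise_for_status()  # surface HTTP errors instead of silent failures
    return resp.json().get("items", [])

orders = fetch_orders("https://api.example.com/v1")  # placeholder endpoint
print(len(orders), "orders fetched")
```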
Posted 2 months ago
3.0 - 7.0 years
8 - 13 Lacs
Pune
Work from Office
Job Title: Senior Engineer PD
Location: Pune, India
Role Description: Our team is part of the area Technology, Data, and Innovation (TDI) Private Bank. Within TDI, Partner data is the central client reference data system in Germany. As a core banking system, many banking processes and applications are integrated and communicate via >2k interfaces. From a technical perspective, we focus on mainframe but also build solutions on-premise cloud, RESTful services, and an Angular frontend. Next to the maintenance and the implementation of new CTB requirements, the content focus also lies on the regulatory and tax topics surrounding a partner/client. We are looking for a very motivated candidate for the Cloud Data Engineer area.
What we'll offer you:
- 100% reimbursement under childcare assistance benefit (gender neutral)
- Sponsorship for industry-relevant certifications and education
- Accident and term life insurance
Your key responsibilities:
- You are responsible for the implementation of the new project on GCP (Spark, Dataproc, Dataflow, BigQuery, Terraform, etc.) across the whole SDLC chain
- You are responsible for supporting the migration of current functionalities to Google Cloud
- You are responsible for the stability of the application landscape and support software releases
- You also support L3 topics and application governance
- You are responsible in the CTM area for coding as part of an agile team (Java, Scala, Spring Boot)
Your skills and experience:
- You have experience with databases (BigQuery, Cloud SQL, NoSQL, Hive, etc.) and development, preferably for Big Data and GCP technologies
- Strong understanding of the Data Mesh approach and integration patterns
- Understanding of Party data and integration with Product data
- Your architectural skills for big data solutions, especially interface architecture, allow a fast start
- You have experience in at least Spark, Java, Scala and Python, Maven, Artifactory, the Hadoop ecosystem, GitHub Actions, GitHub, and Terraform scripting
- You have knowledge of customer reference data, customer opening processes, and preferably regulatory topics around know-your-customer processes
- You can work very well in teams but also independently, and are constructive and target-oriented
- Your English skills are good and you can communicate both professionally and informally in small talks with the team
How we'll support you
Posted 2 months ago
8.0 - 13.0 years
35 - 50 Lacs
Hyderabad
Hybrid
Location: Hyderabad | Exp: 8+ Years | Immediate Joiners Preferred
We at Datametica Solutions Private Limited are looking for a GCP Data Architect who has a passion for cloud, with knowledge and working experience of the GCP platform. This role involves understanding business requirements, analyzing technical options, and providing end-to-end cloud-based ETL solutions.
Required Past Experience:
- 10+ years of overall experience in architecting, developing, testing and implementing Big Data projects using GCP components (e.g. BigQuery, Composer, Dataflow, Dataproc, DLP, Bigtable, Pub/Sub, Cloud Functions, etc.).
- Experience and understanding of ETL - Ab Initio.
- Minimum 4+ years of experience with data management strategy formulation, architectural blueprinting, and effort estimation.
- Cloud capacity planning and cost-based analysis.
- Worked with large datasets and solved difficult analytical problems.
- Regulatory and compliance work in data management.
- Tackle design and architectural challenges such as performance, scalability, and reusability.
- Advocate engineering and design best practices, including design patterns, code reviews and automation (e.g. CI/CD, test automation).
- E2E data engineering and lifecycle (including non-functional requirements and operations) management.
- Work with client teams to design and implement modern, scalable data solutions using a range of new and emerging technologies from the Google Cloud Platform.
- Fundamentals of Kafka and Pub/Sub to handle real-time data feeds (see the sketch below).
- Good understanding of data pipeline design and data governance concepts.
- Experience in code deployment from lower environments to production.
- Good communication skills to understand business requirements.
Required Skills and Abilities:
- Mandatory skills: BigQuery, Composer, Python, GCP fundamentals.
- Secondary skills: Dataproc, Kubernetes, DLP, Pub/Sub, Dataflow, shell scripting, SQL, security (platform & data) concepts.
- Expertise in data modeling.
- Detailed knowledge of data lake and enterprise data warehouse principles.
- Expertise in ETL migration from on-premises to GCP cloud.
- Familiar with Hadoop ecosystems, HBase, Hive, Spark, and emerging data mesh patterns.
- Ability to communicate with customers, developers, and other stakeholders.
- Good to have: certifications in any of the following - GCP Professional Cloud Architect, GCP Professional Data Engineer.
- Mentor and guide team members.
- Good presentation skills.
- Strong team player.
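As a sketch of the real-time feed handling named above, here is a minimal Pub/Sub streaming subscriber. Project and subscription IDs are placeholders; credentials are assumed to come from the environment.

```python
from concurrent.futures import TimeoutError

from google.cloud import pubsub_v1  # pip install google-cloud-pubsub

subscriber = pubsub_v1.SubscriberClient()
sub_path = subscriber.subscription_path("my-gcp-project", "order-events-sub")

def handle(message: pubsub_v1.subscriber.message.Message) -> None:
    print("received:", message.data)
    message.ack()  # acknowledge so the message is not redelivered

streaming_pull = subscriber.subscribe(sub_path, callback=handle)
try:
    streaming_pull.result(timeout=30)  # stream for 30 seconds in this sketch
except TimeoutError:
    streaming_pull.cancel()
```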
Posted 2 months ago
2.0 - 7.0 years
0 - 0 Lacs
Visakhapatnam
Work from Office
Job Summary: Exciting job opportunity as a Registered Nurse in Qatar (homecare).
Key Responsibilities:
- Develop and assess nursing care plans
- Monitor vital signs and assess holistic patient needs
- Collaborate with physicians, staff nurses, and healthcare team members
- Administer oral and subcutaneous medications while ensuring safety
- Document nursing care, medications, and procedures using the company's Nurses Buddy application
- Conduct client assessment and reassessment using approved tools
- Attend refresher training courses, seminars, and training
Timeline for Migration:
- Application to selection: not more than 5 days
- DataFlow & Prometric: 1 month
- Visa processing: 1-2 months
- Start working in Qatar within 3 months!
Requirements:
- Educational qualification: Bachelor's Degree in Nursing or GNM
- Experience: minimum 2 years of working experience as a nurse post-registration
- Citizenship: Indian
- Age limit: 18 to 40 years
- Certification: registration certificate from the Nursing Council
- Language: basic English proficiency required
- Technical skills: bedside nursing, patient care, patient assessment and monitoring
Benefits:
- High salary & perks: earn 5000 QAR/month (1,18,000 INR/month)
- Tax benefit: no tax deduction on salary
- Career growth: advance your nursing career in Qatar with competitive salaries, cutting-edge facilities, and opportunities for specialization
- Relocation support: visa process and flight sponsored; free accommodation and transportation provided
- International work experience: boost your resume with international healthcare expertise
- Comprehensive health insurance: medical coverage under Qatar's healthcare system
- Safe and stable environment: Qatar is known for its low crime rate, political stability, and high quality of life; its strict laws make it one of the safest places to live
- Faster visa processing: with efficient government procedures, work visas for nurses are processed quickly, reducing waiting times
- Simplified licensing process: compared to other countries, Qatar offers a streamlined process for obtaining a nursing license through QCHP (Qatar Council for Healthcare Practitioners)
- Direct hiring opportunities: many hospitals and healthcare facilities offer direct recruitment, minimizing third-party delays and complications
Limited slots available! Apply now to secure your place in the next batch of nurses migrating to Qatar!
Posted 2 months ago