905 Data Flow Jobs - Page 20

JobPe aggregates job listings for easy access, but you apply directly on the original job portal.

15.0 - 20.0 years

13 - 18 Lacs

Coimbatore

Work from Office

Project Role: Data Architect
Project Role Description: Define the data requirements and structure for the application. Model and design the application data structure, storage, and integration.
Must-have skills: AWS Analytics
Good-to-have skills: NA
Minimum experience: 5 years
Educational Qualification: 15 years of full-time education

Summary: As a Data Architect, you will define the data requirements and structure for the application. Your typical day will involve modeling and designing the application data structure, storage, and integration, ensuring that the architecture aligns with business needs and technical specifications. You will collaborate with various teams to ensure that data flows seamlessly across the organization, contributing to the overall efficiency and effectiveness of data management practices.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Facilitate knowledge-sharing sessions to enhance team capabilities.
- Develop and maintain documentation related to data architecture and design.

Professional & Technical Skills:
- Must-have: proficiency in AWS Analytics.
- Strong understanding of data modeling techniques and best practices.
- Experience with data integration tools and methodologies.
- Familiarity with cloud data storage solutions and architectures.
- Ability to analyze and optimize data workflows for performance.

Additional Information:
- The candidate should have a minimum of 5 years of experience in AWS Analytics.
- This position is based in Pune.
- 15 years of full-time education is required.
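
The must-have skill here is AWS Analytics. As a hedged illustration only (not taken from the posting), the sketch below shows a common pattern in that stack: running an Amazon Athena query over S3-backed data with boto3. The database, table, and bucket names are hypothetical.

```python
# Hypothetical sketch: querying S3-backed data with Athena via boto3.
# Assumes AWS credentials are configured; all names below are made up.
import time

import boto3

athena = boto3.client("athena", region_name="ap-south-1")

resp = athena.start_query_execution(
    QueryString="SELECT channel, COUNT(*) AS events FROM clickstream GROUP BY channel",
    QueryExecutionContext={"Database": "analytics_demo"},  # hypothetical database
    ResultConfiguration={"OutputLocation": "s3://demo-athena-results/"},  # hypothetical bucket
)
query_id = resp["QueryExecutionId"]

# Poll until the query finishes, then print the result rows.
while True:
    state = athena.get_query_execution(QueryExecutionId=query_id)["QueryExecution"]["Status"]["State"]
    if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
        break
    time.sleep(1)

if state == "SUCCEEDED":
    for row in athena.get_query_results(QueryExecutionId=query_id)["ResultSet"]["Rows"]:
        print([col.get("VarCharValue") for col in row["Data"]])
```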

Posted 2 months ago

Apply

7.0 - 12.0 years

4 - 8 Lacs

Gurugram

Work from Office

Project Role: Data Engineer
Project Role Description: Design, develop, and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform, load) processes to migrate and deploy data across systems.
Must-have skills: Data Modeling Techniques and Methodologies
Good-to-have skills: Data Engineering, Cloud Data Migration
Minimum experience: 7.5 years
Educational Qualification: BE or BTech

Summary: As a Data Engineer, you will be responsible for designing, developing, and maintaining data solutions for data generation, collection, and processing. Your role involves creating data pipelines, ensuring data quality, and implementing ETL processes to migrate and deploy data across systems.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Develop and maintain data solutions for data generation, collection, and processing.
- Create data pipelines to ensure efficient data flow.
- Implement ETL processes for data migration and deployment.

Professional & Technical Skills:
- Must-have: proficiency in Data Modeling Techniques and Methodologies.
- Good to have: experience with Data Engineering.
- Strong understanding of data modeling techniques and methodologies.
- Experience in cloud data migration.
- Knowledge of data engineering principles.
- Proficiency in ETL processes.

Additional Information:
- The candidate should have a minimum of 7.5 years of experience in Data Modeling Techniques and Methodologies.
- This position is based at our Gurugram office.
- A BE or BTech degree is required.
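
Since the role centres on ETL pipelines, here is a minimal, self-contained sketch of the extract-transform-load shape such work takes, using only the Python standard library. The file, table, and column names are invented for illustration, not taken from the posting.

```python
# Minimal, hypothetical ETL sketch: extract from CSV, transform, load into SQLite.
import csv
import sqlite3

def extract(path: str) -> list[dict]:
    with open(path, newline="") as f:
        return list(csv.DictReader(f))

def transform(rows: list[dict]) -> list[tuple]:
    # Normalise names, cast amounts, and drop rows missing an order id.
    return [
        (r["order_id"], r["customer"].strip().title(), float(r["amount"]))
        for r in rows
        if r.get("order_id")
    ]

def load(records: list[tuple], db: str = "warehouse.db") -> None:
    con = sqlite3.connect(db)
    con.execute("CREATE TABLE IF NOT EXISTS orders (order_id TEXT, customer TEXT, amount REAL)")
    con.executemany("INSERT INTO orders VALUES (?, ?, ?)", records)
    con.commit()
    con.close()

if __name__ == "__main__":
    load(transform(extract("orders.csv")))  # hypothetical input file
```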

Posted 2 months ago

Apply

13.0 - 18.0 years

13 - 17 Lacs

Bengaluru

Work from Office

Skill required: Tech for Operations - Artificial Intelligence (AI)
Designation: AI/ML Computational Science Manager
Qualifications: Any Graduation
Years of Experience: 13 to 18 years

About Accenture: Combining unmatched experience and specialized skills across more than 40 industries, we offer Strategy and Consulting, Technology and Operations services, and Accenture Song, all powered by the world's largest network of Advanced Technology and Intelligent Operations centers. Our 699,000 people deliver on the promise of technology and human ingenuity every day, serving clients in more than 120 countries. Visit us at www.accenture.com.

What would you do? You will be part of the Technology for Operations (TFO) team, which acts as a trusted advisor and partner to Accenture Operations, providing innovative and secure technologies to help clients build an intelligent operating model and drive exceptional results. The team works closely with the sales, offering, and delivery teams to identify and build innovative solutions. Major sub-deals include AHO (Application Hosting Operations), ISMT (Infrastructure Management), and Intelligent Automation. The role requires an understanding of the foundational principles of Artificial Intelligence, including its concepts, techniques, and tools, in order to use AI effectively.

What are we looking for?
- Problem-solving skills
- Ability to perform under pressure
- Results orientation
- Strong analytical skills
- Written and verbal communication

Roles and Responsibilities: In this role you are required to identify and assess complex problems for your area of responsibility. You will create solutions in situations where analysis requires an in-depth evaluation of variable factors. Adherence to the strategic direction set by senior management is required when establishing near-term goals. Interaction is with senior management at a client and/or within Accenture, involving matters that may require acceptance of an alternate approach. Some latitude in decision-making is involved; you will act independently to determine methods and procedures on new assignments. Decisions made at this level have a major day-to-day impact on the area of responsibility. You will manage medium to large teams and/or work efforts (if in an individual contributor role) at a client or within Accenture. Please note that this role may require you to work in rotational shifts.

Qualification: Any Graduation

Posted 2 months ago

Apply

4.0 - 9.0 years

12 - 22 Lacs

Pune, Chennai, Mumbai (All Areas)

Hybrid

Locations: Gurgaon, Noida, Pune, Mumbai, Chennai, Bangalore
Mandatory Skills: BigQuery, Airflow, GCS, Python, SQL, PySpark

Essential skills and experience:
- Excellent programming skills in Python with object-oriented design
- Excellent knowledge of current computing trends and technologies
- Working experience with Generative AI (GenAI) technologies is preferred
- Experience in designing and implementing cloud infrastructure, platforms, and applications
- Hands-on experience with Infrastructure as Code using Terraform, Pulumi, or TypeScript on GCP
- Hands-on experience with Google Cloud Platform product deployment and automation
- Hands-on experience building multi-cloud APIs
- In-depth knowledge of Kubernetes (K8s) administration, DevOps, databases, and CI/CD
- Knowledge of commonly used application tools/components/technologies: Flyte/Airflow, Ray, MLflow, Elasticsearch, Kibana, etc.
- Strong understanding of cloud and infrastructure components (server, storage, network, data, and applications) to deliver end-to-end cloud infrastructure, architectures, and designs
- Excellent logical, analytical, debugging, and problem-solving skills

Key job responsibilities:
- Design and deploy cloud environments with a focus on GCP, demonstrating technical cloud architectural knowledge and playing a vital role in the design of production, staging, QA, and deployment of cloud infrastructures running in 24x7 environments
- Architect end-to-end, scalable, and optimized data solutions for data ingestion, curation, storage, and AI/ML data pipelines, delivered to end users based on business need
- Design, develop, and deploy data pipelines based on the use-case scenarios
- Maintain clear and coherent communication, both verbal and written, to understand data needs and report results
- Continuously ideate and improve on requirements, applying new techniques and methods
- Work in and contribute to the team as a good team player
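
Airflow is listed as a mandatory skill. Below is a minimal sketch, assuming Airflow 2.x, of the kind of two-task DAG this role would build; the DAG id and task bodies are placeholders, not anything from the posting.

```python
# Hypothetical Airflow 2.x DAG: two chained Python tasks.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def ingest():
    print("pull raw files into GCS staging")   # placeholder logic

def curate():
    print("load staged data into BigQuery")    # placeholder logic

with DAG(
    dag_id="demo_ingest_curate",   # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    ingest_task = PythonOperator(task_id="ingest", python_callable=ingest)
    curate_task = PythonOperator(task_id="curate", python_callable=curate)
    ingest_task >> curate_task   # run ingest before curate
```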

Posted 2 months ago

Apply

7.0 - 12.0 years

20 - 35 Lacs

Hyderabad, Pune, Bengaluru

Work from Office

Design, build, and maintain efficient, reusable, and reliable data pipelines using GCP services. Develop batch and streaming ETL processes using PySpark and BigQuery. Write clean and efficient Python code for data ingestion and transformation.
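
As a hedged sketch of the batch ETL described above: a PySpark job that cleans a CSV and writes to BigQuery. It assumes the spark-bigquery connector is on the classpath; the bucket, project, and table names are hypothetical.

```python
# Hedged PySpark sketch: batch-transform a CSV and write to BigQuery.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("demo_batch_etl").getOrCreate()

raw = spark.read.option("header", True).csv("gs://demo-bucket/raw/orders/*.csv")  # hypothetical path

clean = (
    raw.filter(F.col("order_id").isNotNull())
       .withColumn("amount", F.col("amount").cast("double"))
       .groupBy("customer_id")
       .agg(F.sum("amount").alias("total_spend"))
)

(clean.write.format("bigquery")
      .option("table", "demo_project.sales.customer_spend")  # hypothetical table
      .option("temporaryGcsBucket", "demo-staging-bucket")   # required for indirect writes
      .mode("overwrite")
      .save())
```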

Posted 2 months ago

Apply

6.0 - 11.0 years

20 - 25 Lacs

Bengaluru

Work from Office

Involved in all aspects of a project, with the dual ability to maintain the broad vision required for design and development, including strategic thinking and leadership, while overseeing the intricate details of a project from inception through launch. Has a strong understanding of the client's business, industry, economic model, organizational trends, and customer needs in order to lead with relevant digital marketing solutions. Partners across teams to recommend and determine appropriate project execution models (waterfall and/or agile practices) and project engagement types (Retainer, Fixed Fee, Time and Materials) as part of solutioning. Establishes and maintains a center of excellence for every project with the client and/or subject matter experts (SMEs). Serves as a central point of contact for project estimates, utilizing department leads and SMEs to determine estimates for their teams' activities. Facilitates the creation of accurate project plans with clearly defined milestones, deliverables, and tasks. Works with department SMEs to determine department-level deliverables and create resource allocation/staffing plans for the lifecycle of the project. Experience with IBM Design Thinking, Agile, DevOps, Scrum, SAFe, LeSS, and SDLC. Designs major aspects of an application's architecture, including components, UI, middleware, and databases.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
- For a given scope area, evaluate architectural models and perform/drive in-depth analysis of systems, data flows, processes, and KPIs/metrics for the current state
- Develop an understanding of business processes, data flows, strategy, and long-term thinking to arrive at an end-state architecture for large and complex systems
- Full-stack software architecture expertise: designing and developing full-stack modules and components for web applications (frontend and backend services)
- Working experience with the MEAN (Mongo, Express, Angular, Node) and MERN (Mongo, Express, React, Node) stacks
- Consumer web development experience for high-traffic, public-facing web applications

Preferred technical and professional experience:
- Experience working with A/B test frameworks such as Optimizely
- Experience using front-end monitoring tools to troubleshoot errors and recognize performance bottlenecks
- Experience designing UX that simplifies the user experience, and dashboards for viewing high volumes of information
- Expertise in hosting and configuring data annotation tools, defining metadata for media types such as images, audio, and video, and model-based data capture
- Experience playing a liaison role with ML engineers, data scientists, and data analysts to translate business requirements into conceptual designs

Posted 2 months ago

Apply

5.0 - 10.0 years

25 - 30 Lacs

Bengaluru

Work from Office

Role & responsibilities:
- 5+ years' experience in a Data Analyst role as part of a transformation project or as a member of a data engineering or data science delivery team
- 5+ years' experience with data analysis and reporting tools, plus experience with RDBMS and dimensional data modelling
- 5+ years' experience with SQL for data analysis, querying, and manipulation
- Experience identifying and defining requirements and turning them into functional requirements that address complex analytical challenges
- Experience solving data analytics problems and effectively communicating results and methodologies
- Knowledge of data, master data, and metadata-related standards, processes, and technologies
- Knowledge of SDLC methodologies (Agile experience preferable)
- Strong communication and collaboration skills, with the ability to interact effectively with stakeholders at all levels
- Ability to work independently and manage multiple tasks and priorities in a dynamic environment
- Knowledge of alternative assets is a plus
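
To make the SQL-for-analysis requirement concrete, here is a small, runnable illustration (not from the posting): an analytical query with a window function, executed from Python against an in-memory SQLite toy fact table. SQLite 3.25+ is assumed for window-function support.

```python
# Illustrative analytical SQL via Python's stdlib sqlite3; all names hypothetical.
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE fact_sales (region TEXT, month TEXT, revenue REAL);
INSERT INTO fact_sales VALUES
  ('APAC','2024-01',120),('APAC','2024-02',150),
  ('EMEA','2024-01', 90),('EMEA','2024-02',110);
""")

# Month-over-month revenue change per region using a window function.
query = """
SELECT region, month, revenue,
       revenue - LAG(revenue) OVER (PARTITION BY region ORDER BY month) AS mom_change
FROM fact_sales
ORDER BY region, month;
"""
for row in con.execute(query):
    print(row)
```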

Posted 2 months ago

Apply

4.0 - 9.0 years

20 - 35 Lacs

Pune, Gurugram, Bengaluru

Hybrid

Salary: 20 to 35 LPA
Experience: 5 to 8 years
Location: Gurgaon (Hybrid)
Notice: Immediate to 30 days

Roles and Responsibilities:
- Design, develop, test, deploy, and maintain large-scale data pipelines using GCP services such as BigQuery, Dataflow, Pub/Sub, Dataproc, and Cloud Storage
- Collaborate with cross-functional teams to identify business requirements and design solutions that meet those needs
- Develop complex SQL queries to extract insights from large datasets stored in Google Cloud SQL databases
- Troubleshoot issues related to data processing workflows and provide timely resolutions

Desired Candidate Profile:
- 5-9 years of experience in Data Engineering, with expertise in GCP and BigQuery data engineering
- Strong understanding of GCP platform administration, including Compute Engine (Dataproc), Kubernetes Engine (K8s), Cloud Storage, Cloud SQL, etc.
- Experience working on big data analytics projects involving ETL processes using tools like Airflow or similar technologies
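
As an illustration of the "complex SQL queries" responsibility, the hedged sketch below runs a parameterised aggregation through the google-cloud-bigquery client. Project, dataset, and table names are hypothetical, and Application Default Credentials are assumed.

```python
# Hedged sketch: a parameterised BigQuery query via google-cloud-bigquery.
from google.cloud import bigquery

client = bigquery.Client(project="demo-project")  # assumes ADC credentials

job_config = bigquery.QueryJobConfig(
    query_parameters=[bigquery.ScalarQueryParameter("min_amount", "FLOAT64", 100.0)]
)
sql = """
    SELECT customer_id, SUM(amount) AS total
    FROM `demo-project.sales.orders`
    WHERE amount >= @min_amount
    GROUP BY customer_id
    ORDER BY total DESC
    LIMIT 10
"""
for row in client.query(sql, job_config=job_config).result():
    print(row.customer_id, row.total)
```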

Posted 2 months ago

Apply

3.0 - 8.0 years

14 - 24 Lacs

Chennai

Hybrid

Greetings! We have permanent opportunities for GCP Data Engineers in Chennai.

Experience Required: 3 years and above
Location: Chennai (Elcot - Sholinganallur)
Work Mode: Hybrid
Skills Required: GCP Data Engineer, advanced SQL, ETL data pipelines, BigQuery, Dataflow, Bigtable, Data Fusion, Cloud Spanner, Python, Java, JavaScript

If interested, kindly share the below details along with your updated CV to Narmadha.baskar@getronics.com

Regards,
Narmadha
Getronics Recruitment team

Posted 2 months ago

Apply

6.0 - 11.0 years

15 - 25 Lacs

Hyderabad, Pune, Bengaluru

Work from Office

Warm welcome from SP Staffing Services! Reaching out to you regarding a permanent opportunity.

Job Description:
Experience: 6-12 yrs
Location: Hyderabad/Bangalore/Pune/Delhi
Skill: GCP Data Engineer
- Proficiency in programming languages: Python
- Expertise in data processing frameworks: Apache Beam (Dataflow), Kafka
- Hands-on experience with GCP services: BigQuery, Dataflow, Composer, Spanner
- Knowledge of data modeling and database design
- Experience in ETL (Extract, Transform, Load) processes
- Familiarity with cloud storage solutions
- Strong problem-solving abilities in data engineering challenges
- Understanding of data security and scalability
- Proficiency in relevant tools like Apache Airflow

Interested candidates can share their resume to sangeetha.spstaffing@gmail.com with the below details inline:
- Full Name as per PAN:
- Mobile No:
- Alt No/WhatsApp No:
- Total Exp:
- Relevant Exp in GCP:
- Rel Exp in BigQuery:
- Rel Exp in Composer:
- Rel Exp in Python:
- Current CTC:
- Expected CTC:
- Notice Period (Official):
- Notice Period (Negotiable)/Reason:
- Date of Birth:
- PAN Number:
- Reason for Job Change:
- Offer in Pipeline (Current Status):
- Availability for virtual interview on weekdays between 10 AM - 4 PM (please mention a time):
- Current Residential Location:
- Preferred Job Location:
- Is your educational percentage in 10th, 12th, and UG all above 50%?
- Do you have any gaps in your education or career? If so, please mention the duration in months/years:
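
Apache Beam (the SDK behind Dataflow) is the core skill here. A minimal hedged sketch: a word-count-style Beam pipeline that runs locally on the DirectRunner; swapping in DataflowRunner options would move it onto GCP. The file paths are illustrative.

```python
# Hedged Apache Beam sketch: word count on the local DirectRunner.
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

opts = PipelineOptions(runner="DirectRunner")  # use DataflowRunner + GCP options in production

with beam.Pipeline(options=opts) as p:
    (
        p
        | "Read" >> beam.io.ReadFromText("input/events.txt")   # hypothetical input
        | "Split" >> beam.FlatMap(lambda line: line.split())
        | "Pair" >> beam.Map(lambda word: (word, 1))
        | "Count" >> beam.CombinePerKey(sum)
        | "Format" >> beam.MapTuple(lambda word, n: f"{word},{n}")
        | "Write" >> beam.io.WriteToText("output/counts")       # hypothetical output
    )
```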

Posted 2 months ago

Apply

5.0 - 10.0 years

25 - 35 Lacs

Bengaluru

Remote

Responsibilities: The Senior Google Cloud Platform (GCP) Administrator will be responsible for deploying, automating, and maintaining cloud infrastructure, as well as testing and supporting a diverse range of environments, and will own the functioning of our entire GCP environment.

Experience required:
- 5+ years' experience in GCP and BigQuery administration
- Must have experience with various GCP cloud services (Dataflow, Composer, Cloud Functions, BigQuery, App Engine, Load Balancing, and GKE) and with AWS
- Experience in cloud architecture and design on both GCP and AWS
- Experience with Tableau and Power BI preferred
- Collaborate with stakeholders to define cloud strategies that align with business goals
- Develop and implement automated deployment pipelines for multi-cloud environments
- Automate provisioning using Terraform (IaC), plus scaling and monitoring processes
- Implement DevOps best practices, such as Continuous Integration/Continuous Deployment (CI/CD), version control, and automated testing, using tools like Jenkins and GitHub Actions
- Implement security best practices for multi-cloud environments, including identity and access management (IAM), encryption, and compliance
- Manage Enterprise and open-source Kafka clusters; automate configuration tasks such as patching and setup of new Kafka clusters using Ansible (roles, modules, Jinja templates)
- Automate manual activities using Python
- Experience with Docker and Kubernetes (K8s)
- Experience with Informatica
- Very detail-oriented in planning, implementation, documentation, and follow-up
- Ability to work completely independently or with a team; a team-player attitude with experience working in a collaborative environment
- Excellent verbal and written communication skills
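
The posting asks for Python automation of manual activities. As one hedged example of that kind of housekeeping (not the employer's actual tooling), this sketch audits Cloud Storage buckets with google-cloud-storage; the "owner" label policy is invented.

```python
# Hedged automation sketch: audit Cloud Storage buckets for policy drift.
from google.cloud import storage

client = storage.Client(project="demo-project")  # assumes ADC credentials

for bucket in client.list_buckets():
    # Flag buckets still on legacy ACLs so they can be reviewed.
    if not bucket.iam_configuration.uniform_bucket_level_access_enabled:
        print(f"REVIEW: {bucket.name} lacks uniform bucket-level access")
    # Hypothetical tagging policy: every bucket should carry an 'owner' label.
    if not bucket.labels.get("owner"):
        print(f"REVIEW: {bucket.name} has no 'owner' label")
```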

Posted 2 months ago

Apply

3.0 - 5.0 years

5 - 7 Lacs

Mumbai

Work from Office

Role Definition: Plan and execute the removal, modification, rework, and installation of package controls and package systems to upgrade industrial gas turbine packages to customer specifications and schedule requirements. Apply knowledge of related turbo-machinery fields and conform to all EHS (Environment, Health, Safety & Security), quality, electrical, and Solar standards during the performance of duties.

Responsibilities:
- Use work permit program understanding and compliance to execute job responsibilities.
- Participate in general safety meetings/briefings and submit safety suggestions as appropriate.
- Plan, execute, and/or assist in the removal of obsolete material, including control systems, interconnect wiring, package system components, cold loop checks prior to demobilizing, and conduit/cable tray & tubing per project specifications.
- Install and/or assist in the placement of new control consoles.
- Plan, develop, and execute the layout for all new package system components, and the replacement and/or modification of all conduit/cable tray & tubing necessary to accommodate new controls, components, and package systems per design specifications.
- Rewire package junction box(es), new components, and package interconnect wiring per engineering specifications.
- Provide leadership and customer support on projects of lower complexity, and support the technical and administrative development of less experienced field technicians.

Skill Descriptors:

Service Excellence: Knowledge of customer service concepts and techniques; ability to meet or exceed customer needs and expectations and provide excellent service directly or indirectly.
- Provides a quality of service described by customers as excellent.
- Resolves common customer problems.
- Responds to unexpected customer requests with a sense of urgency and positive action.
- Provides direct service to internal or external customers.
- Documents customer complaints in a timely manner.

Initiative: Being proactive and committing to action on self-identified job responsibilities and challenges; ability to seek out work and the drive to accomplish goals.
- Identifies and exploits own strengths; minimizes limitations.
- Provides appropriate degrees of attention to both personal and professional priorities.
- Explains how own motivation relates to the workplace.
- Utilizes available tools or approaches to increase knowledge of self-motivation.
- Learns and uses resources the organization has to assess and enhance team motivation.

Problem Solving: Knowledge of approaches, tools, and techniques for recognizing, anticipating, and resolving organizational, operational, or process problems; ability to apply problem-solving knowledge appropriately to diverse situations.
- Identifies and documents specific problems and resolution alternatives.
- Examines a specific problem and understands the perspective of each involved stakeholder.
- Develops alternative techniques for assessing accuracy and relevance of information.
- Helps to analyze risks and benefits of alternative approaches and obtain decisions on resolution.
- Uses fact-finding techniques and diagnostic tools to identify problems.

Technical Excellence: Knowledge of a given technology and various application methods; ability to develop and provide solutions to significant technical challenges.
- Provides effective technical solutions to routine functional challenges via sound technical competence, effectively examining the implications of events and issues.
- Effectively performs the technical aspects of the job, continuously building knowledge and keeping up to date on technical and procedural job components.
- Applies technical operating and project standards based on achieving excellence in delivered products, technologies, and services.
- Applies current procedures and technologies to help resolve technical issues in one's general area of technical competence.
- Helps others solve technical or procedural problems or issues.

Power Generation: Knowledge of working principles, methods, equipment, and processes of power generation; ability to apply the knowledge appropriately within the power supply sector.
- Explains the roles and responsibilities of power generation within the electric power industry.
- Identifies the features and properties of the power generation sector.
- Describes the working principles of turbines and power generators.
- Documents relevant laws and regulations within the power generation sector.

Safety (Oil and Gas): Knowledge of procedures, practices, considerations, and regulatory requirements for the safety and protection of workers, community, environment, and company assets; ability to identify and respond accordingly to work-related hazards.
- Describes own experience working with safety practices and equipment.
- Discusses procedures for identifying and reporting safety violations and accidents.
- Relates incidents with product-specific hazards and associated first-aid response.
- Identifies training and documentation on safety and injury prevention procedures.
- Identifies personal protective equipment required or recommended for manufacturing staff.

Oil and Gas Equipment: Knowledge of various types of equipment used in the oil and gas industry and the systems and processes involved in the exploration, production, and refining of oil and gas; ability to operate, maintain, troubleshoot, and repair equipment used in the oil and gas industry.
- Demonstrates an understanding of the basic principles of pumps, compressors, and other equipment used in the oil and gas industry.
- Understands the purpose and function of common oil and gas equipment, such as separators, heat exchangers, and valves.
- Describes common types of oil and gas equipment and explains their basic operation.
- Explains basic principles of the hydraulic and pneumatic systems used in oil and gas equipment.

Troubleshooting Technical Problems: Knowledge of troubleshooting approaches, tools, and techniques; ability to anticipate, detect, and resolve technical problems effectively.

Posted 2 months ago

Apply

12.0 - 19.0 years

30 - 40 Lacs

Pune, Chennai, Bengaluru

Work from Office

- Strong understanding of data warehousing and data modeling
- Proficient understanding of distributed computing principles: Hadoop v2, MapReduce, HDFS
- Strong data engineering skills on GCP cloud platforms: Airflow, Cloud Composer, Data Fusion, Dataflow, Dataproc, BigQuery
- Experience building stream-processing systems using solutions such as Storm or Spark Streaming
- Good knowledge of Big Data querying tools, such as Pig, Hive, and Impala
- Experience with Spark and SQL
- Knowledge of various ETL techniques and frameworks, such as Flume and Apache NiFi
- Experience with messaging systems, such as Kafka
- Good understanding of Lambda Architecture, along with its advantages and drawbacks
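
To ground the stream-processing requirement, here is a hedged Spark Structured Streaming sketch that consumes a Kafka topic and keeps one-minute tumbling counts. It assumes the spark-sql-kafka package is available; the broker address and topic are made up.

```python
# Hedged sketch: Spark Structured Streaming from Kafka with tumbling-window counts.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("demo_stream").getOrCreate()

events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")  # hypothetical broker
    .option("subscribe", "clickstream")                # hypothetical topic
    .load()
    .selectExpr("CAST(value AS STRING) AS payload", "timestamp")
)

# Count events per payload value in one-minute tumbling windows.
counts = events.groupBy(F.window("timestamp", "1 minute"), "payload").count()

query = counts.writeStream.outputMode("complete").format("console").start()
query.awaitTermination()
```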

Posted 2 months ago

Apply

6.0 - 11.0 years

16 - 27 Lacs

Gurugram

Remote

Job Title: Workday Prism and Reporting Analyst
Contract: 6 months
Shift: 7:30 pm to 4:30 am IST
Location: 100% remote
Years of experience: 6+ years

Job Summary: The Workday Prism and Reporting Analyst is responsible for designing, developing, and maintaining advanced reporting and analytics solutions using Workday Prism Analytics and Workday's native reporting tools. This role supports strategic decision-making by delivering accurate, timely, and insightful data visualizations and reports across HR, Payroll, and Financials.

Key Responsibilities:
- Prism data modeling & integration: Ingest and transform data from Workday business objects (e.g., Payroll, Time Tracking) and external sources into Prism Data Sources (PDS). Build and maintain derived datasets and semantic layers to support reporting use cases such as payroll-vs-budget analysis by census.
- Report development: Design and deploy custom reports, dashboards, and worksheets using Workday Advanced, Composite, and Matrix reporting tools. Collaborate with stakeholders to gather requirements and translate them into actionable reporting solutions.
- Data governance & accuracy: Ensure data integrity and security through row-level security configurations and domain-based access controls. Conduct regular audits and validations to maintain high-quality reporting outputs.
- Stakeholder collaboration: Serve as a subject matter expert (SME) for Workday reporting and Prism Analytics. Provide training and documentation to end users and HRIS team members on reporting best practices.
- Project leadership: Lead or support reporting workstreams in Workday implementation, optimization, and post-production phases. Participate in cross-functional initiatives involving data strategy, compliance, and operational efficiency.

Qualifications:
- Bachelor's degree in Computer Science, Information Systems, HR, or a related field.
- 3-5 years of experience in Workday reporting, including at least 1-2 years with Workday Prism Analytics.
- Proficiency in Workday Report Writer, Calculated Fields, Discovery Boards, and Prism Data Flows.
- Strong understanding of Workday HCM and Financials modules.
- Certifications in Workday Reporting, Prism, or Integrations preferred.

Preferred Skills:
- Experience with data visualization tools (e.g., Power BI, Tableau) is a plus.
- Familiarity with Workday Extend, BIRT, and integration tools like Studio or EIB.
- Excellent communication and stakeholder management skills.
- Ability to work independently and manage multiple priorities in a fast-paced environment.

Posted 2 months ago

Apply

5.0 - 7.0 years

19 - 20 Lacs

Hyderabad, Chennai, Bengaluru

Work from Office

We are seeking a mid-level GCP Data Engineer with 4+ years of experience in ETL, Data Warehousing, and Data Engineering. The ideal candidate will have hands-on experience with GCP tools, solid data analysis skills, and a strong understanding of Data Warehousing principles.

Qualifications:
- 4+ years of experience in ETL & Data Warehousing
- Excellent leadership & communication skills
- Experience developing Data Engineering solutions with Airflow, GCP BigQuery, Cloud Storage, Dataflow, Cloud Functions, Pub/Sub, Cloud Run, etc.
- Has built solution automations in any of the above ETL tools
- Has executed at least 2 GCP Cloud Data Warehousing projects
- Has worked on at least 2 projects using Agile/SAFe methodology
- Mid-level experience in PySpark and Teradata
- Working experience with DevOps tools like GitHub, Jenkins, Cloud Native, etc., with semi-structured data formats like JSON, Parquet, and/or XML files, and writing complex SQL queries for data analysis and extraction
- In-depth understanding of Data Warehousing, Data Analysis, Data Profiling, Data Quality & Data Mapping

Education: B.Tech./B.E. in Computer Science or a related field.
Certifications: Google Cloud Professional Data Engineer Certification.

Roles and Responsibilities:
- Analyze the different source systems, profile data, and understand, document & fix Data Quality issues
- Gather requirements and business process knowledge in order to transform the data in a way that is geared towards the needs of end users
- Write complex SQL to extract & format source data for ETL/data pipelines
- Create design documents, Source-to-Target Mapping documents, and any supporting documents needed for deployment/migration
- Design, develop, and test ETL/data pipelines
- Design & build metadata-based frameworks for data pipelines
- Write unit test cases, execute unit testing, and document unit test results (see the sketch after this listing)
- Deploy ETL/data pipelines
- Use DevOps tools to version, push/pull code, and deploy across environments
- Support the team during troubleshooting & debugging of defects, bug fixes, business requests, environment migrations, and other ad-hoc requests
- Provide production support, enhancements, and bug fixes
- Work with business and technology stakeholders to communicate EDW incidents/problems and manage their expectations
- Leverage ITIL concepts to circumvent incidents, manage problems, and document knowledge
- Perform data cleaning, transformation, and validation to ensure accuracy and consistency across various data sources
- Stay current on industry best practices and emerging technologies in data analysis and cloud computing, particularly within the GCP ecosystem
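
The responsibilities include writing and executing unit test cases for pipeline code. A minimal sketch of that practice, with an invented transform function and a pytest case:

```python
# Hedged sketch: unit-testing a pure pipeline transform with pytest.
# Function and field names are made up for illustration.
def standardise_order(row: dict) -> dict:
    """Cast amount to float and upper-case the country code."""
    return {
        "order_id": row["order_id"],
        "country": row["country"].strip().upper(),
        "amount": float(row["amount"]),
    }

def test_standardise_order():
    raw = {"order_id": "A1", "country": " in ", "amount": "42.50"}
    assert standardise_order(raw) == {"order_id": "A1", "country": "IN", "amount": 42.5}
```

Keeping transforms pure (no I/O) is what makes them testable this way; the pipeline wiring can then be validated separately.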

Posted 2 months ago

Apply

5.0 - 10.0 years

18 - 25 Lacs

Sholinganallur

Hybrid

Skills Required: BigQuery, Bigtable, Dataflow, Pub/Sub, Data Fusion, Dataproc, Cloud Composer, Cloud SQL, Compute Engine, Cloud Functions, App Engine, Airflow, Cloud Storage, Cloud Spanner
Skills Preferred: ETL

Experience Required:
• 5+ years of experience in data engineering, with a focus on data warehousing and ETL development (including data modelling, ETL processes, and data warehousing principles)
• 5+ years of SQL development experience
• 3+ years of cloud experience (GCP preferred) with solutions designed and implemented at production scale
• Strong understanding and experience of key GCP services, especially those related to data processing (batch/real-time) leveraging Terraform, BigQuery, Dataflow, Data Fusion, Dataproc, Cloud Build, Airflow, and Pub/Sub, alongside storage including Cloud Storage, Bigtable, and Cloud Spanner
• Experience developing with microservice architecture from a container orchestration framework
• Designing pipelines and architectures for data processing
• Excellent problem-solving skills, with the ability to design and optimize complex data pipelines
• Strong communication and collaboration skills, capable of working effectively with both technical and non-technical stakeholders as part of a large, global, and diverse team
• Strong evidence of self-motivation to continuously develop your own engineering skills and those of the team
• Proven record of working autonomously in areas of high ambiguity, without day-to-day supervisory support
• Evidence of a proactive mindset to problem solving and willingness to take the initiative
• Strong prioritization, coordination, organizational, and communication skills, and a proven ability to balance workload and competing demands to meet deadlines

Thanks & Regards,
Varalakshmi V
9019163564

Posted 2 months ago

Apply

5.0 - 7.0 years

15 - 20 Lacs

Thiruvananthapuram

Work from Office

Role Proficiency: SAP BODS Data Engineer

- Strong proficiency in designing, developing, and implementing robust ETL solutions using SAP BusinessObjects Data Services (BODS), with strong EDW experience.
- Strong proficiency in SAP BODS development, including job design, data flow creation, scripting, and debugging.
- Design and develop ETL processes using SAP BODS to extract, transform, and load data from various sources.
- Create and maintain data integration workflows, ensuring optimal performance and scalability.
- Solid understanding of data integration, ETL concepts, and data warehousing principles.
- Proficiency in SQL for data querying and manipulation.
- Familiarity with data modeling concepts and database systems.
- Excellent problem-solving skills and attention to detail.
- Strong communication and interpersonal skills for effective collaboration.
- Ability to work independently and manage multiple tasks simultaneously.
- 3+ years of relevant ETL development experience (SAP BODS).

Required Skills: Data Warehousing, SAP BODS, ETL, EDW

Posted 2 months ago

Apply

3.0 - 5.0 years

5 - 5 Lacs

Kochi, Thiruvananthapuram

Work from Office

Role Proficiency: Independently develops error-free code with high-quality validation of applications, guides other developers, and assists Lead 1 - Software Engineering.

Outcomes:
- Understand and provide input to the application/feature/component designs, developing the same in accordance with user stories/requirements.
- Code, debug, test, document, and communicate product/component/features at development stages.
- Select appropriate technical options for development, such as reusing, improving, or reconfiguring existing components.
- Optimise efficiency, cost, and quality by identifying opportunities for automation/process improvements and agile delivery models.
- Mentor Developer 1 - Software Engineering and Developer 2 - Software Engineering to effectively perform in their roles.
- Identify problem patterns and improve the technical design of the application/system.
- Proactively identify issues/defects/flaws in module/requirement implementation.
- Assist Lead 1 - Software Engineering on technical design; review activities and begin demonstrating Lead 1 capabilities in making technical decisions.

Measures of Outcomes:
- Adherence to engineering process and standards (coding standards)
- Adherence to schedule/timelines, and to SLAs where applicable
- Number of defects post delivery; number of non-compliance issues
- Reduction of reoccurrence of known defects; quick turnaround of production bugs
- Meeting the defined productivity standards for the project
- Number of reusable components created
- Completion of applicable technical/domain certifications and all mandatory training requirements

Outputs Expected:
- Code: develop code independently for the above
- Configure: implement and monitor the configuration process
- Test: create and review unit test cases, scenarios, and execution
- Domain relevance: develop features and components with a good understanding of the business problem being addressed for the client
- Manage project: manage module-level activities
- Manage defects: perform defect RCA and mitigation
- Estimate: estimate time, effort, and resource dependence for one's own work and others' work, including modules
- Document: create documentation for own work and peer-review others' documentation
- Manage knowledge: consume and contribute to project-related documents, SharePoint libraries, and client universities
- Status reporting: report the status of assigned tasks and comply with project-related reporting standards/processes
- Release: execute the release process
- Design: LLD for multiple components
- Mentoring: mentor juniors on the team; set FAST goals and provide feedback on mentees' FAST goals

Skill Examples:
- Explain and communicate the design/development to the customer
- Perform and evaluate test results against product specifications
- Develop user interfaces, business software components, and embedded software components
- Manage and guarantee high levels of cohesion and quality
- Use data models
- Estimate effort and resources required for developing/debugging features/components
- Perform and evaluate tests in the customer or target environment
- Team player with good written and verbal communication abilities
- Proactively ask for help and offer help

Knowledge Examples:
- Appropriate software programs/modules
- Technical designing
- Programming languages
- DBMS, operating systems, and software platforms
- Integrated development environments (IDE)
- Agile methods
- Knowledge of the customer domain and sub-domain where the problem is solved

Additional Comments: UST is looking for senior Java developers to build end-to-end business solutions and to work with one of the leading financial services organizations in the UK. The ideal candidate must possess a strong background in frontend and backend development technologies, along with excellent written and verbal communication skills and the ability to collaborate effectively with the domain and technical experts on the team.

Responsibilities: As a Java developer, you will:
- Maintain active relationships with the Product Owner to understand business requirements, lead requirement-gathering meetings, and review designs with the Product Owner
- Own backlog items and coordinate with other team members to develop the features planned for each sprint
- Perform technical design reviews and code reviews
- Mentor, lead, and guide the team on technical skills
- Be responsible for prototyping, developing, and troubleshooting software in the user interface or service layers
- Perform peer reviews on source code to ensure reuse, scalability, and the use of best practices
- Participate in collaborative technical discussions that focus on software user experience, design, architecture, and development
- Perform demonstrations for client stakeholders on project features and sub-features, utilizing the latest frontend and backend development technologies

Requirements:
- 5+ years of experience in Java/JEE development
- Skills in developing applications using multi-tier architecture
- 2+ years of experience in GCP service development is preferred; skills in developing applications on GCP are preferred
- Should be an expert in Cloud Composer, Dataflow, Dataproc, Cloud Pub/Sub, and DAG creation
- Python scripting knowledge is preferred; Apache Beam knowledge is mandatory
- Java/JEE, Spring, Spring Boot, REST/SOAP web services, Hibernate, SQL, Tomcat, application servers (WebSphere), SONAR, Agile, AJAX, Jenkins
- Skills in UML, application design/architecture, and design patterns
- Skills in unit testing applications using JUnit or similar technologies
- Capability to support QA teams with test plans, root cause analysis, and defect fixing
- Strong experience in responsive design and cross-browser web applications
- Strong knowledge of web service models, and of creating and working with APIs
- Experience with cloud services, specifically Google Cloud
- Strong exposure to Agile and Scaled Agile based development models
- Familiar with interfaces such as REST web services, Swagger profiles, and JSON payloads
- Familiar with tools/utilities such as Bitbucket, Jira, and Confluence

Required Skills: Java, Spring, Spring Boot, Microservices

Posted 2 months ago

Apply

8.0 - 13.0 years

14 - 24 Lacs

Chennai

Hybrid

Greetings from Getronics!

Company: Getronics (permanent role)
Client: Automobile industry
Experience Required: 4+ years in IT and a minimum of 3+ years in GCP Data Engineering/AIML
Location: Chennai (Elcot - Sholinganallur)
Work Mode: Hybrid
Skills Required: GCP, BigQuery, AI/ML

- Solid experience designing, building, and maintaining cloud-based data platforms and infrastructure.
- Deep proficiency in GCP cloud services, including significant experience with BigQuery, Cloud Storage, Dataproc, Apigee, Cloud Run, Google Kubernetes Engine (GKE), Postgres, Artifact Registry, Secret Manager, and Access Management (IAM).
- Hands-on experience implementing and managing CI/CD pipelines using tools like Tekton and potentially Astronomer.
- Strong experience with job scheduling and workflow orchestration using Airflow.
- Proficiency with version control systems, specifically Git.
- Strong programming skills in Python.
- Expertise in SQL and experience with relational databases like SQL Server, MySQL, and PostgreSQL.
- Experience with or knowledge of data visualization tools like Power BI.
- Familiarity with code quality and security scanning tools such as FOSSA and SonarQube.
- Foundational knowledge of Artificial Intelligence and Machine Learning concepts and workflows.
- Problem-solving skills and the ability to troubleshoot complex distributed systems.
- Strong communication and collaboration skills.
- Knowledge of other cloud providers (AWS, Azure).

LOOKING FOR IMMEDIATE TO 30 DAYS NOTICE CANDIDATES ONLY.

Thanks,
Durga.

Posted 2 months ago

Apply

8.0 - 13.0 years

14 - 24 Lacs

Chennai

Hybrid

Greetings from Getronics! We have permanent opportunities for GCP Data Engineers in Chennai.

Company: Getronics (permanent role)
Client: Automobile industry
Experience Required: 8+ years in IT and a minimum of 4+ years in GCP Data Engineering
Location: Chennai (Elcot - Sholinganallur)
Work Mode: Hybrid

Position Description: We are currently seeking a seasoned GCP Cloud Data Engineer with 4+ years of experience leading/implementing GCP data projects, preferably implementing a complete data-centric model. This position will design and deploy a data-centric architecture in GCP for a Materials Management platform that exchanges data with multiple modern and legacy applications in Product Development, Manufacturing, Finance, Purchasing, N-tier Supply Chain, and Supplier Collaboration.

• Design and implement data-centric solutions on Google Cloud Platform (GCP) using various GCP tools like Storage Transfer Service, Cloud Data Fusion, Pub/Sub, Dataflow, Cloud compression, Cloud Scheduler, gsutil, FTP/SFTP, Dataproc, Bigtable, etc.
• Build ETL pipelines to ingest data from heterogeneous sources into our system
• Develop data processing pipelines using programming languages like Java and Python to extract, transform, and load (ETL) data
• Create and maintain data models, ensuring efficient storage, retrieval, and analysis of large datasets
• Deploy and manage databases, both SQL and NoSQL, such as Bigtable, Firestore, or Cloud SQL, based on project requirements
• Collaborate with cross-functional teams to understand data requirements and design scalable solutions that meet business needs
• Implement security measures and data governance policies to ensure the integrity and confidentiality of data
• Optimize data workflows for performance, reliability, and cost-effectiveness on the GCP infrastructure

Skills Required:
- GCP Data Engineer, Hadoop, Spark/PySpark, Google Cloud Platform services: BigQuery, Dataflow, Pub/Sub, Bigtable, Data Fusion, Dataproc, Cloud Composer, Cloud SQL, Compute Engine, Cloud Functions, and App Engine
- 8+ years of professional experience in data engineering, data product development, and software product launches
- 4+ years of cloud data engineering experience building scalable, reliable, and cost-effective production batch and streaming data pipelines using: data warehouses like Google BigQuery; workflow orchestration tools like Airflow; relational database management systems like MySQL, PostgreSQL, and SQL Server; real-time data streaming platforms like Apache Kafka and GCP Pub/Sub

Education Required: Any Bachelor's degree

LOOKING FOR IMMEDIATE TO 30 DAYS NOTICE CANDIDATES ONLY.

Thanks,
Durga.
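
Pub/Sub appears in both the toolset and the streaming requirement above. As a hedged sketch (project and topic ids are hypothetical), publishing a JSON event with the google-cloud-pubsub client:

```python
# Hedged sketch: publish a JSON event to a Pub/Sub topic.
import json

from google.cloud import pubsub_v1

publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path("demo-project", "material-events")  # hypothetical names

event = {"part_no": "X-100", "qty": 25, "source": "plant-7"}  # invented payload
future = publisher.publish(topic_path, json.dumps(event).encode("utf-8"))
print("published message id:", future.result())  # blocks until the broker acks
```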

Posted 2 months ago

Apply

4.0 - 8.0 years

10 - 19 Lacs

Chennai

Hybrid

Greetings from Getronics! We have permanent opportunities for GCP Data Engineers in Chennai.

Company: Getronics (permanent role)
Client: Automobile industry
Experience Required: 4+ years in IT and a minimum of 3+ years in GCP Data Engineering
Location: Chennai (Elcot - Sholinganallur)
Work Mode: Hybrid

Position Description: We are currently seeking a seasoned GCP Cloud Data Engineer with 3 to 5 years of experience leading/implementing GCP data projects, preferably implementing a complete data-centric model. This position will design and deploy a data-centric architecture in GCP for a Materials Management platform that exchanges data with multiple modern and legacy applications in Product Development, Manufacturing, Finance, Purchasing, N-tier Supply Chain, and Supplier Collaboration.

• Design and implement data-centric solutions on Google Cloud Platform (GCP) using various GCP tools like Storage Transfer Service, Cloud Data Fusion, Pub/Sub, Dataflow, Cloud compression, Cloud Scheduler, gsutil, FTP/SFTP, Dataproc, Bigtable, etc.
• Build ETL pipelines to ingest data from heterogeneous sources into our system
• Develop data processing pipelines using programming languages like Java and Python to extract, transform, and load (ETL) data
• Create and maintain data models, ensuring efficient storage, retrieval, and analysis of large datasets
• Deploy and manage databases, both SQL and NoSQL, such as Bigtable, Firestore, or Cloud SQL, based on project requirements

Skills Required:
- GCP Data Engineer, Hadoop, Spark/PySpark, Google Cloud Platform services: BigQuery, Dataflow, Pub/Sub, Bigtable, Data Fusion, Dataproc, Cloud Composer, Cloud SQL, Compute Engine, Cloud Functions, and App Engine
- 4+ years of professional experience in data engineering, data product development, and software product launches
- 3+ years of cloud data/software engineering experience building scalable, reliable, and cost-effective production batch and streaming data pipelines using: data warehouses like Google BigQuery; workflow orchestration tools like Airflow; relational database management systems like MySQL, PostgreSQL, and SQL Server; real-time data streaming platforms like Apache Kafka and GCP Pub/Sub

Education Required: Any Bachelor's degree
Candidates should be willing to take a GCP assessment (1-hour online video test).

LOOKING FOR IMMEDIATE TO 30 DAYS NOTICE CANDIDATES ONLY.

Thanks,
Durga.

Posted 2 months ago

Apply

5.0 - 8.0 years

3 - 7 Lacs

Kolkata

Work from Office

Execute high-quality visual designs for various documentation projects in PowerPoint and Word. Collaborate with senior designers, team members, and other streams to understand design requirements and contribute effectively to design projects. Stay up to date with industry trends, software updates, and new technologies to improve your efficiency and productivity. Adapt designs based on feedback from peers, senior designers, and stakeholders to refine and enhance the final product. Assist in the preparation of design presentations and Word documents to communicate ideas effectively to clients or internal stakeholders. Ensure that all visual designs are of high quality and adhere to design standards and guidelines. Maintain a high level of deliverables in accordance with directives from the stream lead or design lead. Prioritize tasks and manage your workload efficiently to meet project deadlines while maintaining design quality.

Primary Skills: Ability to manage multiple projects simultaneously under tight deadlines. Proficiency in Microsoft Word and PowerPoint is preferred; working knowledge of MS Excel, Photoshop, Illustrator, and InDesign is also beneficial.

Secondary Skills: Experience working with advertising agencies or branding companies. Good communication skills.

Posted 2 months ago

Apply

8.0 - 13.0 years

3 - 6 Lacs

Hyderabad

Work from Office

We are seeking a highly skilled Data Engineer with extensive experience in Snowflake, dbt (Data Build Tool), SnapLogic, SQL Server, PostgreSQL, Azure Data Factory, and other ETL tools. The ideal candidate will have a strong ability to optimize SQL queries and a good working knowledge of Python. A positive attitude and excellent teamwork skills are essential.

Key Responsibilities:
- Data pipeline development: design, develop, and maintain scalable data pipelines using Snowflake, dbt, SnapLogic, and ETL tools.
- SQL optimization: write and optimize complex SQL queries to ensure high performance and efficiency.
- Data integration: integrate data from various sources, ensuring consistency, accuracy, and reliability.
- Database management: manage and maintain SQL Server and PostgreSQL databases.
- ETL processes: develop and manage ETL processes to support data warehousing and analytics.
- Collaboration: work closely with data analysts, data scientists, and business stakeholders to understand data requirements and deliver solutions.
- Documentation: maintain comprehensive documentation of data models, data flows, and ETL processes.
- Troubleshooting: identify and resolve data-related issues and discrepancies.
- Python scripting: utilize Python for data manipulation, automation, and integration tasks.

Technical Skills:
- Proficiency in Snowflake, dbt, SnapLogic, SQL Server, PostgreSQL, and Azure Data Factory.
- Strong SQL skills with the ability to write and optimize complex queries.
- Knowledge of Python for data manipulation and automation.
- Knowledge of data governance frameworks and best practices.

Soft Skills:
- Excellent problem-solving and analytical skills.
- Strong communication and collaboration skills.
- Positive attitude and ability to work well in a team environment.

Certifications: Relevant certifications (e.g., Snowflake, Azure) are a plus.
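
Given the Snowflake-plus-Python stack, here is a hedged sketch of querying Snowflake from Python with the official connector. Account, credentials, and table names are placeholders; the comment about pruning is a general optimization habit, not a claim about this employer's data.

```python
# Hedged sketch: run an aggregation through the Snowflake Python connector.
import snowflake.connector

conn = snowflake.connector.connect(
    account="demo-account",    # placeholder
    user="demo_user",          # placeholder
    password="***",            # use a secrets manager in practice
    warehouse="ANALYTICS_WH",
    database="SALES",
    schema="PUBLIC",
)
try:
    cur = conn.cursor()
    # Filtering on a date column the table is clustered on keeps the scan pruned.
    cur.execute(
        "SELECT region, SUM(amount) FROM orders WHERE order_date >= %s GROUP BY region",
        ("2024-01-01",),
    )
    for region, total in cur.fetchall():
        print(region, total)
finally:
    conn.close()
```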

Posted 2 months ago

Apply

16.0 - 25.0 years

15 - 20 Lacs

Bengaluru

Work from Office

Skill required: Tech for Operations - Technological Innovation
Designation: Program & Project Management Senior Manager
Qualifications: Any Graduation
Years of Experience: 16 to 25 years

About Accenture: Combining unmatched experience and specialized skills across more than 40 industries, we offer Strategy and Consulting, Technology and Operations services, and Accenture Song, all powered by the world's largest network of Advanced Technology and Intelligent Operations centers. Our 699,000 people deliver on the promise of technology and human ingenuity every day, serving clients in more than 120 countries. Visit us at www.accenture.com.

What would you do? You will be part of the Technology for Operations (TFO) team, which acts as a trusted advisor and partner to Accenture Operations, providing innovative and secure technologies to help clients build an intelligent operating model and drive exceptional results. The team works closely with the sales, offering, and delivery teams to identify and build innovative solutions. Major sub-deals include AHO (Application Hosting Operations), ISMT (Infrastructure Management), and Intelligent Automation. In Technology Innovation, you will be working in the scientific field of innovation studies, which serves to explain the nature and rate of technological change. You will have to understand new products, processes, and significant technological changes of products and processes.

Roles and Responsibilities: In this role you are required to identify and assess complex problems for your area(s) of responsibility. You should create solutions in situations in which analysis requires in-depth knowledge of organizational objectives. The role requires involvement in setting the strategic direction to establish near-term goals for your area(s) of responsibility. Interaction is with senior management levels at a client and/or within Accenture, involving negotiating or influencing on significant matters. You should have latitude in decision-making and in determining objectives and approaches to critical assignments. Decisions at this level have a lasting impact on the area of responsibility, with the potential to impact areas outside it. The individual manages large teams and/or work efforts (if in an individual contributor role) at a client or within Accenture. Please note that this role may require you to work in rotational shifts.

Qualification: Any Graduation

Posted 2 months ago

Apply

3.0 - 5.0 years

15 - 20 Lacs

Gurugram

Work from Office

Job Summary: We are seeking a skilled and detail-oriented Salesforce CRM Analytics Specialist with 3-5 years of experience in Salesforce CRM Analytics, particularly Einstein Discovery. In this role, you will transform CRM data into actionable insights, enabling strategic business decisions across sales, marketing, and customer service. The ideal candidate will have a strong analytical mindset, hands-on experience in building predictive models, and a deep understanding of Salesforce Analytics Cloud (formerly Tableau CRM).

Key Responsibilities:
• Design, build, and deploy analytics dashboards and data visualizations using CRM Analytics (Tableau CRM).
• Leverage Einstein Discovery to build predictive and prescriptive models that support sales performance, customer behaviour, and marketing effectiveness.
• Collaborate with cross-functional teams including Sales, Marketing, Customer Service, and Business Intelligence to translate business requirements into analytical solutions.
• Conduct data preparation and ETL processes within Salesforce using dataflows, recipes, and connectors.
• Develop and maintain scalable data models and datasets optimized for analytics use cases.
• Monitor model performance and interpret findings, ensuring high accuracy and relevance of insights.
• Create and maintain documentation for analytical workflows, model assumptions, and business logic.
• Drive adoption of analytics solutions through training, presentations, and ongoing support to stakeholders.

Preferred candidate profile:
• 3-5 years of hands-on experience with Salesforce CRM Analytics (Tableau CRM / Einstein Analytics).
• Proven experience with Einstein Discovery: creating predictive models, interpreting results, and implementing model outputs.
• Proficiency in dataflows, SAQL, and recipe-based transformations.
• Strong SQL skills and understanding of relational databases.
• Excellent analytical, problem-solving, and communication skills.

Preferred:
• Salesforce certifications such as Tableau CRM & Einstein Discovery Consultant.
• Experience with Salesforce Sales Cloud or Industries Cloud integration.
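
The posting centres on CRM Analytics and Einstein Discovery, which are configured largely inside Salesforce itself; as a tangential, hedged illustration of programmatic access to the underlying CRM data, this sketch runs a SOQL query via the third-party simple-salesforce library. Credentials and the revenue filter are placeholders.

```python
# Hedged sketch: pull CRM records with SOQL via simple-salesforce.
from simple_salesforce import Salesforce

sf = Salesforce(
    username="user@example.com",   # placeholder credentials
    password="***",
    security_token="***",
)

# Invented filter: large accounts by annual revenue.
result = sf.query("SELECT Id, Name, AnnualRevenue FROM Account WHERE AnnualRevenue > 1000000")
for rec in result["records"]:
    print(rec["Name"], rec["AnnualRevenue"])
```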

Posted 2 months ago

Apply