3.0 - 8.0 years
3 - 7 Lacs
Pune
Work from Office
Project Role : Application Support Engineer
Project Role Description : Act as software detectives, provide a dynamic service identifying and solving issues within multiple components of critical business systems.
Must have skills : Electronic Medical Records (EMR)
Good to have skills : NA
Minimum 3 year(s) of experience is required
Educational Qualification : 15 years full time education

Summary: As an Application Support Engineer, you will act as a software detective, providing a dynamic service that identifies and resolves issues within various components of critical business systems. Your typical day will involve collaborating with team members to troubleshoot software problems, analyzing system performance, and ensuring that applications run smoothly to support business operations effectively. You will engage with users to understand their challenges and work towards implementing solutions that enhance system functionality and user experience.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Required active participation/contribution in team discussions.
- Contribute to providing solutions to work-related problems.
- Assist in the documentation of processes and procedures to enhance team knowledge.
- Engage with stakeholders to gather requirements and provide feedback on system performance.

Professional & Technical Skills:
- Must Have Skills: Proficiency in Electronic Medical Records (EMR).
- Strong analytical skills to diagnose and resolve software issues.
- Experience with troubleshooting and debugging software applications.
- Familiarity with system integration and data flow management.
- Ability to communicate technical information effectively to non-technical users.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Electronic Medical Records (EMR).
- This position is based at our Pune office.
- A 15 years full time education is required.
Posted 1 month ago
7.0 - 12.0 years
5 - 9 Lacs
Bengaluru
Work from Office
Project Role : Application Developer
Project Role Description : Design, build and configure applications to meet business process and application requirements.
Must have skills : Snowflake Data Warehouse
Good to have skills : NA
Minimum 7.5 year(s) of experience is required
Educational Qualification : 15 years full time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with various teams to understand their needs, developing solutions, and ensuring that applications are aligned with business objectives. You will engage in problem-solving activities, manage project timelines, and contribute to the overall success of application development initiatives.

Roles & Responsibilities:
1. Serve as a client-facing technical lead, working closely with stakeholders to gather requirements and translate them into actionable ETL solutions.
2. Design and develop new stored procedures in MS SQL Server, with a strong focus on performance and maintainability.
3. Build and enhance SSIS packages, implementing best practices for modularity, reusability, and error handling.
4. Architect and design ETL workflows, including staging, cleansing, data masking, transformation, and loading strategies.
5. Implement comprehensive error handling and logging mechanisms to support reliable, auditable data pipelines.
6. Design and maintain ETL-related tables, including staging, audit/logging, and dimensional/historical tables.
7. Work with Snowflake to build scalable cloud-based data integration and warehousing solutions.
8. Reverse-engineer and optimize existing ETL processes and stored procedures for better performance and maintainability.
9. Troubleshoot job failures and data discrepancies in production.

Professional & Technical Skills:
1. 7+ years of experience in Data Warehousing (MS SQL, Snowflake) and MS SQL Server (T-SQL, stored procedures, indexing, performance tuning).
2. Proven expertise in SSIS package development, including parameterization, data flow, and control flow design.
3. Strong experience in ETL architecture, including logging, exception handling, and data validation.
4. Proficient in data modeling for ETL, including staging, target, and history tables.
5. Hands-on experience with Snowflake, including data loading, transformation scripting, and optimization.
6. Ability to manage historical data using SCDs, auditing fields, and temporal modeling.
7. Ability to set up Git repositories, define version control standards, and manage code branching/releases, along with DevOps and CI/CD practices for data pipelines.
8. Ability to work independently while managing multiple issues and deadlines.
9. Excellent communication skills, both verbal and written, with demonstrated client interaction.

Would be a plus:
10. DW migration from MS SQL to Snowflake.
11. Experience with modern data integration tools such as Matillion.
12. Knowledge of BI tools like Tableau.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Snowflake Data Warehouse.
- This position is based at our Bengaluru office.
- A 15 years full time education is required.
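For candidates unfamiliar with the Snowflake loading-and-auditing pattern this listing keeps returning to, here is a minimal, hypothetical Python sketch using the snowflake-connector-python package; the account, stage, and table names are placeholders rather than anything specified in the role.

```python
# A minimal sketch of a Snowflake bulk load with an audit-log step, of the kind
# this listing describes. All names (account, stage, tables) are illustrative.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",      # hypothetical account identifier
    user="etl_user",
    password="...",
    warehouse="ETL_WH",
    database="ANALYTICS",
    schema="STAGING",
)

try:
    cur = conn.cursor()
    # Bulk-load staged files into a staging table, then record the outcome.
    cur.execute(
        "COPY INTO STG_ORDERS FROM @ORDERS_STAGE "
        "FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)"
    )
    cur.execute(
        "INSERT INTO ETL_AUDIT_LOG (job_name, status, loaded_at) "
        "VALUES ('stg_orders_load', 'SUCCESS', CURRENT_TIMESTAMP())"
    )
finally:
    conn.close()
```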
Posted 1 month ago
2.0 - 5.0 years
5 - 9 Lacs
Hyderabad
Work from Office
Project Role : Application Developer
Project Role Description : Design, build and configure applications to meet business process and application requirements.
Must have skills : Axway API Management Platform
Good to have skills : NA
Minimum 5 year(s) of experience is required
Educational Qualification : 15 years full time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with various teams to understand their needs, developing solutions that align with business objectives, and ensuring that applications are optimized for performance and usability. You will also engage in problem-solving activities, providing support and enhancements to existing applications while maintaining a focus on quality and efficiency.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Facilitate knowledge-sharing sessions to enhance team capabilities.
- Monitor project progress and ensure timely delivery of milestones.

Professional & Technical Skills:
- Must Have Skills: Proficiency in Axway API Management Platform.
- Strong understanding of API design and development principles.
- Experience with application integration and data flow management.
- Familiarity with cloud-based services and deployment strategies.
- Ability to troubleshoot and resolve application issues efficiently.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Axway API Management Platform.
- This position is based at our Hyderabad office.
- A 15 years full time education is required.
Posted 1 month ago
2.0 - 4.0 years
4 - 8 Lacs
Pune
Work from Office
Project Role : Data Engineer
Project Role Description : Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems.
Must have skills : Snowflake Data Warehouse
Good to have skills : NA
Minimum 5 year(s) of experience is required
Educational Qualification : 15 years full time education

Summary: As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to effectively migrate and deploy data across various systems. You will collaborate with cross-functional teams to understand data requirements and deliver solutions that meet business needs, while also troubleshooting any issues that arise in the data flow.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Mentor junior team members to enhance their skills and knowledge.
- Continuously evaluate and improve data processes to enhance efficiency.

Professional & Technical Skills:
- Must Have Skills: Proficiency in Snowflake Data Warehouse.
- Good To Have Skills: Experience with data modeling and database design.
- Strong understanding of ETL processes and data integration techniques.
- Familiarity with cloud platforms and data storage solutions.
- Experience in performance tuning and optimization of data queries.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Snowflake Data Warehouse.
- This position is based in Pune.
- A 15 years full time education is required.
Posted 1 month ago
3.0 - 8.0 years
4 - 8 Lacs
Hyderabad
Work from Office
Project Role : Data Engineer
Project Role Description : Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems.
Must have skills : Oracle Procedural Language Extensions to SQL (PLSQL)
Good to have skills : Google BigQuery, Google Cloud Platform Architecture
Minimum 3 year(s) of experience is required
Educational Qualification : 15 years full time education

Summary: As a Data Engineer, you will engage in the design, development, and maintenance of data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to effectively migrate and deploy data across various systems. You will collaborate with cross-functional teams to understand data requirements and contribute to the overall data strategy of the organization, ensuring that data solutions are robust, scalable, and aligned with business objectives.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Required active participation/contribution in team discussions.
- Contribute to providing solutions to work-related problems.
- Develop and optimize data pipelines to ensure efficient data flow and processing.
- Monitor and troubleshoot data quality issues, implementing corrective actions as necessary.

Professional & Technical Skills:
- Must Have Skills: Proficiency in Oracle Procedural Language Extensions to SQL (PLSQL).
- Good To Have Skills: Experience with Google BigQuery and Google Cloud Platform Architecture.
- Strong understanding of ETL processes and data integration techniques.
- Experience with data modeling and database design principles.
- Familiarity with data warehousing concepts and best practices.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Oracle Procedural Language Extensions to SQL (PLSQL).
- This position is based at our Hyderabad office.
- A 15 years full time education is required.
Posted 1 month ago
7.0 - 11.0 years
13 - 18 Lacs
Pune
Work from Office
Project Role : Data Architect
Project Role Description : Define the data requirements and structure for the application. Model and design the application data structure, storage and integration.
Must have skills : Apache Kafka
Good to have skills : Data Analytics
Minimum 5 year(s) of experience is required
Educational Qualification : 15 years full time education

Summary: As a Data Architect, you will define the data requirements and structure for the application. Your typical day will involve modeling and designing the application data structure, storage, and integration, ensuring that the architecture aligns with business needs and technical specifications. You will collaborate with various teams to ensure that data flows seamlessly across systems, contributing to the overall efficiency and effectiveness of data management within the organization.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Facilitate knowledge-sharing sessions to enhance team capabilities.
- Develop and maintain documentation related to data architecture and design.

Professional & Technical Skills:
- Must Have Skills: Proficiency in Apache Kafka.
- Good To Have Skills: Experience with Data Analytics.
- Strong understanding of data modeling techniques and best practices.
- Experience with data integration tools and methodologies.
- Familiarity with cloud-based data storage solutions.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Apache Kafka.
- This position is based at our Pune office.
- A 15 years full time education is required.
Posted 1 month ago
3.0 - 7.0 years
0 Lacs
kochi, kerala
On-site
At EY, you'll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture, and technology to become the best version of you. And we're counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

EY GDS Data and Analytics (D&A) OBIEE - Senior

The opportunity
We're looking for a senior expert in data analytics to create and manage large BI and analytics solutions using visualization tools such as OBIEE/OAC that turn data into knowledge. In this role, you should have a background in data and business analysis. You should be analytical and an excellent communicator. Having business acumen and problem-solving aptitude would be a plus.

Your key responsibilities
- Work as a team member and lead, contributing to the various technical streams of OBIEE/OAC implementation projects.
- Provide product-level and design-level technical best practices.
- Interface and communicate with the onsite coordinators.
- Complete assigned tasks on time and report status regularly to the lead.

Skills and attributes for success
- Use an issue-based approach to deliver growth, market, and portfolio strategy engagements for corporates.
- Strong communication, presentation, and team-building skills, and experience in producing high-quality reports, papers, and presentations.
- Exposure to BI and other visualization tools in the market.
- Building a quality culture, fostering teamwork, and participating in organization-wide people initiatives.

To qualify for the role, you must have
- BE/BTech/MCA/MBA with adequate industry experience.
- At least around 3 to 7 years of experience in OBIEE/OAC.
- Experience working with OBIEE/OAC end-to-end implementations.
- Understanding of the ETL/ELT process using tools like Informatica/ODI/SSIS.
- Knowledge of reporting, dashboards, and RPD logical modeling.
- Experience with BI Publisher and with Agents.
- Experience in security implementation in OAC/OBIEE.
- Ability to manage self-service data preparation, data sync, data flows, and working with curated data sets.
- Ability to manage connections to multiple data sources - cloud and non-cloud - using the various data connectors available with OAC.
- Experience in creating pixel-perfect reports and managing catalog contents, dashboards, prompts, and calculations.
- Ability to create datasets, map layers, multiple data visualizations, and stories in OAC.
- Good understanding of various data models, e.g. snowflake schemas, data marts, star models, data lakes, etc.
- Excellent written and verbal communication. Cloud experience is an added advantage.
- Experience migrating on-premise OBIEE to Oracle Analytics in the cloud.
- Knowledge of and working experience with Oracle Autonomous Database.
- Strong knowledge of DWH concepts and strong data modeling skills.
- Familiarity with Agile and Waterfall SDLC processes.
- Strong SQL/PLSQL and analytical skills.

Ideally, you'll also have
- Experience in the Insurance and Banking domains.
- A strong hold on project delivery and team management.
- Excellent written and verbal communication skills.

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people, and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform, and operate. Working across assurance, consulting, law, strategy, tax, and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
Posted 1 month ago
6.0 - 10.0 years
0 Lacs
karnataka
On-site
You will be working as a Monitoring Team Lead for a Data Pipeline L1 team, overseeing daily operations to ensure the health and stability of data pipelines and managing incident response. Your role will involve leading the team, monitoring performance, and escalating issues as needed.

As a team leader, you will guide and mentor the L1 monitoring team to ensure proficiency in data pipeline monitoring, troubleshooting, and escalation procedures. You will manage team performance, distribute tasks effectively, and resolve conflicts. Acting as a point of contact for the team, you will represent them to stakeholders and advocate for their needs. Your responsibilities will also include developing team strengths and promoting a positive work environment.

In terms of data pipeline monitoring, you will continuously monitor data pipelines for performance, availability, and data quality issues. Utilizing monitoring tools, you will detect and analyze alerts related to data pipelines to ensure data freshness, completeness, accuracy, consistency, and validity.

For incident management, you are required to detect, log, categorize, and track incidents within the ticketing system. Any unresolved issues should be escalated to L2/L3 teams based on predefined SLAs and severity. You will also coordinate with other teams to resolve incidents quickly and efficiently while ensuring proper communication and updates to relevant stakeholders throughout the incident lifecycle.

Managing Service Level Agreements (SLAs) related to data pipeline monitoring and incident response will be essential. You will monitor and ensure that the team meets or exceeds established SLAs.

Process improvement is another key aspect: you will identify opportunities to enhance monitoring processes, automation, and efficiency. Implementing best practices for data pipeline monitoring and incident management, and conducting regular reviews of service performance, are part of your responsibilities.

Your role will also involve providing technical expertise to the team and staying updated on industry best practices and new technologies related to data pipelines and monitoring. Maintaining and updating documentation related to data pipeline monitoring processes, procedures, and escalation paths is crucial, as are accurate shift handovers to the next shift with updates on ongoing issues.

Qualifications:
- Proven experience in data pipeline monitoring and incident management.
- Strong understanding of data pipeline concepts, including ingestion, transformation, and storage.
- Experience with monitoring tools and technologies.
- Excellent communication, interpersonal, and leadership skills.
- Ability to work independently and as part of a team in a fast-paced environment.
- Experience with cloud services (AWS, Azure, or GCP) is a plus.
- Knowledge of data governance principles and practices is beneficial.

Skills to be evaluated on: Data Operation/Operations Team Lead.
Mandatory Skills: Data Operation, Operations Team Lead.
Desirable Skills: Lead Operations, data operations, operations management, team management.
Posted 1 month ago
4.0 - 7.0 years
9 - 13 Lacs
Pune
Work from Office
We are looking for a skilled Java + GCP Developer with experience in shell scripting, Python, Java, Spring Boot, and BigQuery. The ideal candidate should have hands-on experience in Java, Spring Boot, and Google Cloud Platform (GCP).
Posted 1 month ago
5.0 - 7.0 years
5 - 14 Lacs
Pune, Gurugram, Bengaluru
Work from Office
• Hands-on experience in object-oriented programming using Python, PySpark, APIs, SQL, BigQuery, and GCP
• Building data pipelines for huge volumes of data
• Dataflow, Dataproc, and BigQuery
• Deep understanding of ETL concepts
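As a rough illustration of the PySpark + BigQuery pipeline work described above, here is a minimal, hypothetical sketch; it assumes the spark-bigquery connector is on the classpath, and the project, dataset, and bucket names are placeholders.

```python
# A minimal, hypothetical PySpark pipeline: read from BigQuery, transform,
# and write back. All table and bucket names are illustrative.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders-pipeline").getOrCreate()

# Read a source table (requires the spark-bigquery connector).
orders = (
    spark.read.format("bigquery")
    .option("table", "my_project.raw.orders")
    .load()
)

# A simple transformation step: filter and aggregate.
daily_totals = (
    orders.where(F.col("status") == "COMPLETE")
    .groupBy(F.to_date("created_at").alias("order_date"))
    .agg(F.sum("amount").alias("total_amount"))
)

# Write the result back to BigQuery, staging through a GCS bucket.
(
    daily_totals.write.format("bigquery")
    .option("table", "my_project.analytics.daily_order_totals")
    .option("temporaryGcsBucket", "my-temp-bucket")
    .mode("overwrite")
    .save()
)
```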
Posted 1 month ago
5.0 - 8.0 years
5 - 8 Lacs
Bengaluru
Work from Office
Skills desired:
- Strong at SQL (multi-pyramid SQL joins)
- Python skills (FastAPI or Flask framework)
- PySpark
- Commitment to work in overlapping hours
- GCP knowledge (BQ, Dataproc, and Dataflow)
- Amex experience is preferred (not mandatory)
- Power BI preferred (not mandatory)

Flask, PySpark, Python, SQL
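For reference, here is a minimal FastAPI sketch of the kind of Python service skill listed above; the endpoint and model are illustrative only, not part of the role description.

```python
# A minimal FastAPI service sketch. Run locally with: uvicorn main:app --reload
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()

class Order(BaseModel):
    order_id: int
    amount: float

@app.get("/health")
def health() -> dict:
    # Simple liveness endpoint.
    return {"status": "ok"}

@app.post("/orders")
def create_order(order: Order) -> dict:
    # In a real service this would persist to a store such as a database or BigQuery.
    return {"received": order.order_id}
```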
Posted 1 month ago
4.0 - 8.0 years
6 - 11 Lacs
Hyderabad, Pune
Work from Office
NetSuite Functional Consultant

Job Responsibilities:
- Configure/customize the NetSuite application to meet customers' business requirements.
- Conduct personalization sessions and document them with meeting-minute summaries.
- Demonstrated experience in participating in and translating customer business requirements into business solutions, either as a software solution or a re-engineering initiative.
- Collaborate with technical team members to help guide the development of customized solutions or data extracts using SQL queries.
- Identify test scenarios, establish test cases, and support SIT and UAT with core client stakeholders to ensure system configuration objectives have been met.
- Create training/support documentation and drive end-user training to promote user adoption.
- Document requirements, processes, and user documentation.
- Design business processes and application configuration based on industry best practices.
- Support the go-live deployment process, ensuring a seamless software launch and continuity of business operations during cutover.
- Own and deliver complex solutions using the Oracle NetSuite platform.
- Conduct software testing of all kinds and prepare test cases for the modules implemented and developed.
- Suggest process improvements based on application capability and industry best practices.
- Responsible for NetSuite setups: Customer, Vendor, and Item; Department, Class, and Locations.
- NetSuite processes: Order to Cash, Procure to Pay, Bank Reconciliation, Accounting, Advanced Revenue Management, Fixed Assets, Intercompany Management, and Call to Resolution (Case Management).
- Form customizations and field creation, custom records, CSV imports, workflow setup, saved searches and report customization, and integration process mapping.

Skills & Experience Required:
- 8+ years of hands-on experience in NetSuite implementation and enhancement projects.
- Thorough knowledge of NetSuite functionalities and architecture.
- Hands-on experience with NetSuite integration with third-party applications.
- A minimum of 4 end-to-end implementation experiences.
- Strong communication skills, working closely with customers and partners to gather requirements and design solutions.
- Strong NetSuite ERP knowledge and experience, including setups and configurations, saved searches, and reports.
- The mandatory requirement is functional experience in Receivables, Order Management, Case Management, and billing operations within NetSuite.
- Excellent command of flowcharts and data flow diagrams.
- Strong analytical and problem-solving skills; a good team player who collaborates with other teams.
- Ready to be on-call on a rotational basis.
- Excellent command of Google Sheets, Google Apps, Word, Excel, and PowerPoint.
Posted 1 month ago
10.0 - 14.0 years
0 Lacs
haryana
On-site
An experienced consulting professional who understands solutions, industry best practices, multiple business processes, and technology designs within a product/technology family. You will operate independently to provide quality work products to an engagement, performing varied and complex duties and tasks that require independent judgment to implement Oracle products and technology to meet customer needs. You will apply Oracle methodology, company procedures, and leading practices, demonstrating expertise to deliver functional and technical solutions on moderately complex customer engagements. In addition, you may act as the team lead on projects and effectively consult with management of customer organizations. You will participate in business development activities and develop and configure detailed solutions for moderately complex projects. This position requires 10-12 years of relevant experience. Effective communication, building rapport with team members and clients, and the ability to travel as needed are essential skills for this role.

The responsibilities of the candidate include:
- 10 to 12 years of expert domain knowledge in HCM, covering the hire-to-retire cycle.
- Participation in at least 5 end-to-end HCM implementations, with at least 2 involving HCM Cloud.
- Expertise in areas such as Time and Labor, Absence Management, Talent, Benefits, Compensation, Recruiting (ORC), and Core HR, along with an in-depth understanding of HCM Cloud business processes and their data flow.
- Experience in client-facing roles, interacting with customers in requirement-gathering workshops, design, configuration, testing, and go-live processes.
- Strong written and verbal communication skills, personal drive, flexibility, a team-player mindset, problem-solving abilities, influencing and negotiating skills, organizational awareness and sensitivity, engagement delivery, continuous improvement, knowledge sharing, and client management.
- Leadership capabilities, planning and follow-up skills, mentorship, work allocation, monitoring, and providing status updates to the Project Manager.
- Readiness for domestic or international travel for both short and long durations.

As an IC3-level career professional, you will be part of a global team at Oracle, a world leader in cloud solutions. With a commitment to inclusive workforce growth and opportunities for all, Oracle offers competitive benefits, flexible medical, life insurance, and retirement options, as well as volunteer programs to give back to communities. Oracle welcomes individuals with disabilities and is dedicated to including them at all stages of the employment process, offering accessibility assistance or accommodation by email at accommodation-request_mb@oracle.com or by phone at +1 888 404 2494 in the United States.
Posted 1 month ago
12.0 - 15.0 years
35 - 60 Lacs
Chennai, Bengaluru
Hybrid
Job Title: GCP Solution Architect
Location: Chennai | Bangalore
Experience: 12-15 years in IT

Key Responsibilities
- Architect and lead GCP-native data and AI solutions tailored to AdTech use cases such as real-time bidding, campaign analytics, customer segmentation, and lookalike modeling.
- Design high-throughput data pipelines, audience data lakes, and analytics platforms leveraging GCP services like BigQuery, Dataflow, Pub/Sub, Cloud Storage, Vertex AI, etc.
- Collaborate with ad operations, marketing teams, and digital product owners to understand business goals and translate them into scalable and performant solutions.
- Integrate with third-party AdTech and MarTech platforms, including DSPs, SSPs, CDPs, DMPs, ad exchanges, and identity resolution systems.
- Ensure architectural alignment with data privacy regulations (GDPR, CCPA) and support consent management and data anonymization strategies.
- Drive technical leadership across multi-disciplinary teams (Data Engineering, MLOps, Analytics) and enforce best practices in data governance, model deployment, and cloud optimization.
- Lead discovery workshops, solution assessments, and architecture reviews during pre-sales and delivery cycles.

Required Skills & Qualifications
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
- GCP services: BigQuery, Cloud Pub/Sub, Dataflow, Dataproc, Cloud Composer (Airflow), Vertex AI, AI Platform, AutoML, Cloud Functions, Cloud Run, Looker, Apigee, Dataplex, GKE.
- Deep understanding of programmatic advertising (RTB, OpenRTB), cookie-less identity frameworks, and AdTech/MarTech data flows.
- Experience integrating or building components like Data Management Platforms (DMPs), Customer Data Platforms (CDPs), Demand-Side Platforms (DSPs), ad servers, attribution engines, and real-time bidding pipelines.
- Event-driven and microservices architecture using APIs, streaming pipelines, and edge delivery networks.
- Integration with platforms like Google Marketing Platform, Google Ads Data Hub, Snowplow, Segment, or similar.
- Strong understanding of IAM, data encryption, PII anonymization, and regulatory compliance (GDPR, CCPA, HIPAA if applicable).
- Experience with CI/CD pipelines (Cloud Build), Infrastructure as Code (Terraform), and MLOps pipelines using Vertex AI or Kubeflow.
- Strong experience in Python and SQL; familiarity with Scala or Java is a plus.
- Experience with version control (Git), Agile delivery, and architectural documentation tools.

If you know someone suitable, feel free to forward their resume to aarthi.murali@zucisystems.com.

Regards,
Aarthi Murali
Posted 1 month ago
4.0 - 9.0 years
20 - 35 Lacs
Gurugram
Work from Office
Job Description
- The candidate should have extensive production experience (2+ years) in GCP.
- Strong background in data engineering, with 2-3 years of experience in Big Data technologies including Hadoop, NoSQL, Spark, Kafka, etc.
- Exposure to enterprise application development is a must.

Roles & Responsibilities
- 4-10 years of IT experience is preferred.
- Able to effectively use GCP managed services, e.g. Dataproc, Dataflow, Pub/Sub, Cloud Functions, BigQuery, GCS - at least 4 of these services.
- Good to have knowledge of Cloud Composer, Cloud SQL, Bigtable, and Cloud Functions.
- Strong experience in Big Data technologies - Hadoop, Sqoop, Hive, and Spark - including DevOps.
- Good hands-on expertise in either Python or Java programming.
- Good understanding of GCP core services like Google Cloud Storage, Google Compute Engine, Cloud SQL, and Cloud IAM.
- Good to have knowledge of GCP services like App Engine, GKE, Cloud Run, Cloud Build, and Anthos.
- Ability to drive the deployment of the customer's workloads into GCP and provide guidance, a cloud adoption model, service integrations, appropriate recommendations to overcome blockers, and technical roadmaps for GCP cloud implementations.
- Experience with technical solutions based on industry standards using GCP IaaS, PaaS, and SaaS capabilities.
- Extensive, real-world experience designing technology components for enterprise solutions and defining solution architectures and reference architectures with a focus on cloud technologies.
- Act as a subject-matter expert or developer around GCP and become a trusted advisor to multiple teams.
- Technical ability to become certified in required GCP technical certifications.
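As a small, hypothetical illustration of working with one of the GCP managed services named above, here is a sketch using the google-cloud-bigquery client; the project and table names are placeholders.

```python
# A minimal BigQuery query from Python using the google-cloud-bigquery client.
# Assumes application-default credentials; names are illustrative.
from google.cloud import bigquery

client = bigquery.Client(project="my-project")

query = """
    SELECT status, COUNT(*) AS n
    FROM `my-project.raw.orders`
    GROUP BY status
    ORDER BY n DESC
"""

# query() submits the job; result() blocks until it completes and yields rows.
for row in client.query(query).result():
    print(row["status"], row["n"])
```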
Posted 1 month ago
5.0 - 10.0 years
25 - 35 Lacs
Noida, Pune, Bengaluru
Work from Office
Description: We are seeking a proficient Data Governance Engineer to lead the development and management of robust data governance frameworks on Google Cloud Platform (GCP). The ideal candidate will bring in-depth expertise in data management, metadata frameworks, compliance, and security within cloud environments to ensure high-quality, secure, and compliant data practices aligned with organizational goals.

Requirements:
- 4+ years of experience in data governance, data management, or data security.
- Hands-on experience with Google Cloud Platform (GCP), including BigQuery, Dataflow, Dataproc, and Google Data Catalog.
- Strong command of metadata management, data lineage, and data quality tools (e.g., Collibra, Informatica).
- Deep understanding of data privacy laws and compliance frameworks.
- Proficiency in SQL and Python for governance automation.
- Experience with RBAC, encryption, and data masking techniques.
- Familiarity with ETL/ELT pipelines and data warehouse architectures.

Job Responsibilities:
- Develop and implement comprehensive data governance frameworks, focusing on metadata management, lineage tracking, and data quality.
- Define, document, and enforce data governance policies, access control mechanisms, and security standards using GCP-native services such as IAM, DLP, and KMS.
- Manage metadata repositories using tools like Collibra, Informatica, Alation, or Google Data Catalog.
- Collaborate with data engineering and analytics teams to ensure compliance with GDPR, CCPA, SOC 2, and other regulatory standards.
- Automate processes for data classification, monitoring, and reporting using Python and SQL.
- Support data stewardship initiatives, including the development of data dictionaries and governance documentation.
- Optimize ETL/ELT pipelines and data workflows to meet governance best practices.

What We Offer:
- Exciting Projects: We focus on industries like High-Tech, communication, media, healthcare, retail, and telecom. Our customer list is full of fantastic global brands and leaders who love what we build for them.
- Collaborative Environment: You can expand your skills by collaborating with a diverse team of highly talented people in an open, laid-back environment - or even abroad in one of our global centers or client facilities!
- Work-Life Balance: GlobalLogic prioritizes work-life balance, which is why we offer flexible work schedules, opportunities to work from home, and paid time off and holidays.
- Professional Development: Our dedicated Learning & Development team regularly organizes communication skills training (GL Vantage, Toastmasters), stress management programs, professional certifications, and technical and soft-skill trainings.
- Excellent Benefits: We provide our employees with competitive salaries, family medical insurance, Group Term Life Insurance, Group Personal Accident Insurance, NPS (National Pension Scheme), periodic health awareness programs, extended maternity leave, annual performance bonuses, and referral bonuses.
- Fun Perks: We want you to love where you work, which is why we host sports events and cultural activities, offer food at subsidized rates, and throw corporate parties. Our vibrant offices also include dedicated GL Zones, rooftop decks, and the GL Club, where you can have coffee or tea with your colleagues over a game of table tennis, and we offer discounts for popular stores and restaurants!
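To make the "automate data classification using Python and SQL" responsibility concrete, here is a small, hypothetical sketch that flags PII-like column names via BigQuery's INFORMATION_SCHEMA; the dataset name and the keyword list are assumptions for illustration, not part of the listing.

```python
# A toy governance scan: flag columns whose names suggest PII across a
# BigQuery dataset. Dataset name and keyword hints are illustrative.
from google.cloud import bigquery

PII_HINTS = ("email", "phone", "ssn", "dob", "address")

client = bigquery.Client(project="my-project")
query = """
    SELECT table_name, column_name
    FROM `my-project.analytics.INFORMATION_SCHEMA.COLUMNS`
"""

for row in client.query(query).result():
    if any(hint in row["column_name"].lower() for hint in PII_HINTS):
        print(f"Possible PII column: {row['table_name']}.{row['column_name']}")
```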
Posted 1 month ago
4.0 - 7.0 years
18 - 20 Lacs
Pune
Hybrid
Job Title: GCP Data Engineer
Location: Pune, India
Experience: 4 to 7 Years
Job Type: Full-Time

Job Summary:
We are looking for a highly skilled GCP Data Engineer with 4 to 7 years of experience to join our data engineering team in Pune. The ideal candidate should have strong experience working with Google Cloud Platform (GCP), including Dataproc and Cloud Composer (Apache Airflow), and must be proficient in Python, SQL, and Apache Spark. The role involves designing, building, and optimizing data pipelines and workflows to support enterprise-grade analytics and data science initiatives.

Key Responsibilities:
- Design and implement scalable and efficient data pipelines on GCP, leveraging Dataproc, BigQuery, Cloud Storage, and Pub/Sub.
- Develop and manage ETL/ELT workflows using Apache Spark, SQL, and Python.
- Orchestrate and automate data workflows using Cloud Composer (Apache Airflow).
- Build batch and streaming data processing jobs that integrate data from various structured and unstructured sources.
- Optimize pipeline performance and ensure cost-effective data processing.
- Collaborate with data analysts, scientists, and business teams to understand data requirements and deliver high-quality solutions.
- Implement and monitor data quality checks, validation, and transformation logic.

Required Skills:
- Strong hands-on experience with Google Cloud Platform (GCP).
- Proficiency with Dataproc for big data processing and Apache Spark.
- Expertise in Python and SQL for data manipulation and scripting.
- Experience with Cloud Composer / Apache Airflow for workflow orchestration.
- Knowledge of data modeling, warehousing, and pipeline best practices.
- Solid understanding of ETL/ELT architecture and implementation.
- Strong troubleshooting and problem-solving skills.

Preferred Qualifications:
- GCP Data Engineer or Cloud Architect certification.
- Familiarity with BigQuery, Dataflow, and Pub/Sub.

Interested candidates can send their resume to pranitathapa@onixnet.com.
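As a rough sketch of the Cloud Composer orchestration described above, here is a minimal, hypothetical Airflow DAG that submits a PySpark job to Dataproc; it assumes Airflow 2.x with the Google provider installed, and the project, cluster, and bucket names are placeholders.

```python
# A minimal Airflow DAG submitting a PySpark job to an existing Dataproc cluster.
# Names (project, cluster, bucket) are illustrative placeholders.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.dataproc import DataprocSubmitJobOperator

SPARK_JOB = {
    "reference": {"project_id": "my-project"},
    "placement": {"cluster_name": "etl-cluster"},
    "pyspark_job": {"main_python_file_uri": "gs://my-bucket/jobs/transform.py"},
}

with DAG(
    dag_id="daily_etl",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",   # Airflow 2.4+ keyword; older versions use schedule_interval
    catchup=False,
) as dag:
    run_transform = DataprocSubmitJobOperator(
        task_id="run_transform",
        job=SPARK_JOB,
        region="us-central1",
        project_id="my-project",
    )
```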
Posted 1 month ago
4.0 - 8.0 years
20 - 35 Lacs
Pune, Gurugram, Bengaluru
Hybrid
Salary: 20 to 35 LPA
Experience: 3 to 7 years
Location: Gurgaon/Pune/Bengaluru
Notice: Immediate to 30 days

Job Profile:
Experienced Data Engineer with a strong foundation in designing, building, and maintaining scalable data pipelines and architectures. Skilled in transforming raw data into clean, structured formats for analytics and business intelligence. Proficient in modern data tools and technologies such as SQL, T-SQL, Python, Databricks, and cloud platforms (Azure). Adept at data wrangling, modeling, and ETL/ELT development, and at ensuring data quality, integrity, and security. A collaborative team player with a track record of enabling data-driven decision-making across business units.

As a Data Engineer, the candidate will work on assignments for one of our Utilities clients. Collaborating with cross-functional teams and stakeholders involves gathering data requirements, aligning business goals, and translating them into scalable data solutions. The role includes working closely with data analysts, scientists, and business users to understand needs, designing robust data pipelines, and ensuring data is accessible, reliable, and well-documented. Regular communication, iterative feedback, and joint problem-solving are key to delivering high-impact, data-driven outcomes that support organizational objectives. This position requires a proven track record of transforming processes and driving customer value and cost savings, with experience in running end-to-end analytics for large-scale organizations.

Responsibilities:
- Design, build, and maintain scalable data pipelines to support analytics, reporting, and advanced modeling needs.
- Collaborate with consultants, analysts, and clients to understand data requirements and translate them into effective data solutions.
- Ensure data accuracy, quality, and integrity through validation, cleansing, and transformation processes.
- Develop and optimize data models, ETL workflows, and database architectures across cloud and on-premises environments.
- Support data-driven decision-making by delivering reliable, well-structured datasets and enabling self-service analytics.
- Provide seamless integration with cloud platforms (Azure), making it easy to build and deploy end-to-end data pipelines in the cloud.
- Use scalable Databricks clusters for handling large datasets and complex computations, optimizing performance and cost management.

Must have:
- Client engagement experience and collaboration with cross-functional teams.
- Data engineering background in Databricks.
- Capable of working effectively as an individual contributor or in collaborative team environments.
- Effective communication and thought leadership with a proven record.

Candidate Profile:
- Bachelor's/master's degree in economics, mathematics, computer science/engineering, operations research, or related analytics areas.
- 3+ years of experience, which must be in data engineering.
- Hands-on experience with SQL, Python, Databricks, and cloud platforms like Azure.
- Prior experience in managing and delivering end-to-end projects.
- Outstanding written and verbal communication skills.
- Able to work in a fast-paced, continuously evolving environment and ready to take on uphill challenges.
- Able to understand cross-cultural differences and work with clients across the globe.
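For a concrete feel of the Databricks work described above, here is a minimal, hypothetical PySpark sketch writing a Delta table; the table names and the quality rule are illustrative, and Delta support is assumed from the Databricks runtime (where a SparkSession is normally provided as `spark`).

```python
# A minimal Databricks-style cleanup job: deduplicate, apply a simple
# data-quality rule, and write a curated Delta table. Names are placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

raw = spark.read.table("raw.meter_readings")

clean = (
    raw.dropDuplicates(["meter_id", "reading_ts"])
    .withColumn("reading_date", F.to_date("reading_ts"))
    .where(F.col("kwh") >= 0)  # illustrative data-quality rule
)

clean.write.format("delta").mode("overwrite").saveAsTable("curated.meter_readings")
```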
Posted 1 month ago
4.0 - 10.0 years
0 Lacs
Bengaluru, Karnataka, India
Remote
We are looking for a strong Oracle ERP Cloud technical consultant who thrives on solving complex business problems in the reporting and data migration track. The ideal candidate should:
- Be able to operate independently to provide quality work products, performing varied and complex duties and tasks that need independent judgment.
- Have excellent communication skills - both written and verbal.
- Have good interpersonal skills with the ability to build rapport with all stakeholders.
- Have the ability to present ideas and solutions in a clear and concise manner.
- Be self-motivated with a lot of energy and drive.
- Have the ability and willingness to learn.

The ideal candidate should hold a Bachelor of Engineering/Bachelor of Technology or Master of Computer Applications degree, with experience ranging from 4 to 10 years, and should:
- Have hands-on experience with the data model of Oracle ERP Cloud and E-Business Suite (EBS) applications (Financials, Distribution, Manufacturing).
- Have experience (an in-depth understanding of the data model, business process functionality, and related data flow) in Oracle ERP Cloud applications (Finance or Supply Chain).
- Have experience in SaaS technical components, namely BI Publisher, OTBI, FBDI, etc., and in-depth knowledge of SQL and PLSQL.
- Have experience in writing efficient and optimized code and an understanding of performance tuning techniques.
- Have experience in data migration from EBS to Oracle Cloud.

Your Responsibilities
As an integral part of the Oracle ERP Cloud implementation team, you will be responsible for the following:
- Working with remote and geographically distributed teams to enable building the right products, using the right building blocks, and making them easily consumable by other products.
- Being very technically hands-on and owning/driving key end-to-end products/services.
- Ensuring customer success, including delivering fixes/patches as needed.
- Helping build a high-performance organization, including referring and interviewing top talent for Oracle.
- Design and development of reports and data migration for customer implementations.
- Translating business processes and requirements into technical requirements and designs.
- Participating proactively in organization initiatives.

Career Level - IC2
Posted 1 month ago
5.0 - 10.0 years
15 - 20 Lacs
Madurai, Chennai
Work from Office
Dear Candidate,

Greetings of the day!!

I am Kantha, and I'm reaching out to you regarding an exciting opportunity with TechMango. You can connect with me on LinkedIn (https://www.linkedin.com/in/kantha-m-ashwin-186ba3244/) or by email: kanthasanmugam.m@techmango.net

Techmango Technology Services is a full-scale software development services company founded in 2014 with a strong focus on emerging technologies, with the primary objective of delivering strategic solutions towards the goals of its business partners. We are a full-scale, leading software and mobile app development company. Techmango is driven by the mantra "Client's Vision is our Mission", and we stick to that statement. Our aim is to be the technologically advanced and most loved organization, providing prime-quality and cost-efficient services with a long-term client relationship strategy. We are operational in the USA (Chicago, Atlanta), Dubai (UAE), and India (Bangalore, Chennai, Madurai, Trichy).

Job Title: GCP Data Engineer
Location: Madurai
Experience: 5+ Years
Notice Period: Immediate

Job Summary
We are seeking a hands-on GCP Data Engineer with deep expertise in real-time streaming data architectures to help design, build, and optimize data pipelines in our Google Cloud Platform (GCP) environment. The ideal candidate will have strong architectural vision and be comfortable rolling up their sleeves to build scalable, low-latency streaming data pipelines using Pub/Sub, Dataflow (Apache Beam), and BigQuery.

Key Responsibilities
- Architect and implement end-to-end streaming data solutions on GCP using Pub/Sub, Dataflow, and BigQuery.
- Design real-time ingestion, enrichment, and transformation pipelines for high-volume event data.
- Work closely with stakeholders to understand data requirements and translate them into scalable designs.
- Optimize streaming pipeline performance, latency, and throughput.
- Build and manage orchestration workflows using Cloud Composer (Airflow).
- Drive schema design, partitioning, and clustering strategies in BigQuery for both real-time and batch datasets.
- Define SLAs, monitoring, logging, and alerting for streaming jobs using Cloud Monitoring, Error Reporting, and Stackdriver.
- Apply data modeling experience across streaming and batch datasets.
- Ensure robust security, encryption, and access controls across all data layers.
- Collaborate with DevOps on CI/CD automation of data workflows using Terraform, Cloud Build, and Git.
- Document streaming architecture, data lineage, and deployment runbooks.

Required Skills & Experience
- 5+ years of experience in data engineering or architecture.
- 3+ years of hands-on GCP data engineering experience.
- Strong expertise in Google Pub/Sub, Dataflow (Apache Beam), BigQuery (including streaming inserts), Cloud Composer (Airflow), and Cloud Storage (GCS).
- Solid understanding of streaming design patterns, exactly-once delivery, and event-driven architecture.
- Deep knowledge of SQL and NoSQL data modeling.
- Hands-on experience with monitoring and performance tuning of streaming jobs.
- Experience using Terraform or equivalent for infrastructure as code.
- Familiarity with CI/CD pipelines for data workflows.
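As an illustration of the Pub/Sub -> Dataflow -> BigQuery pattern this listing centers on, here is a minimal, hypothetical Apache Beam (Python) streaming sketch; the subscription, table, and schema are placeholders.

```python
# A minimal streaming Beam pipeline: read JSON events from Pub/Sub and
# append them to a BigQuery table. All names are illustrative.
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

options = PipelineOptions(streaming=True)

with beam.Pipeline(options=options) as p:
    (
        p
        | "ReadEvents" >> beam.io.ReadFromPubSub(
            subscription="projects/my-project/subscriptions/events-sub"
        )
        | "Parse" >> beam.Map(json.loads)
        | "WriteToBQ" >> beam.io.WriteToBigQuery(
            "my-project:analytics.events",
            schema="event_id:STRING,event_ts:TIMESTAMP,payload:STRING",
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
        )
    )
```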
Posted 1 month ago
5.0 - 10.0 years
4 - 9 Lacs
Chennai, Bengaluru
Work from Office
Dear Candidate,

This is with reference to your profile on the job portal. Deloitte India Consulting has an immediate requirement for the following role.

Skill: GCP Data Engineer
Notice period: Looking for immediate to 4 weeks (max)
Location: Any

In case you are interested, please share your updated resume along with the following details (mandatory) to Smouni@deloitte.com:
- Candidate Name
- Mobile No.
- Email ID
- Skill
- Total Experience
- Education Details
- Current Location
- Requested Location
- Current Firm
- Current CTC
- Expected CTC
- Notice Period/LWD
- Feedback
Posted 1 month ago
15.0 - 20.0 years
15 - 19 Lacs
Gurugram
Work from Office
Project Role : Technology Architect
Project Role Description : Design and deliver technology architecture for a platform, product, or engagement. Define solutions to meet performance, capability, and scalability needs.
Must have skills : SAP Sales and Distribution (SD)
Good to have skills : NA
Minimum 7.5 year(s) of experience is required
Educational Qualification : 15 years full time education

Summary: As a Technology Architect, you will design and deliver technology architecture for a platform, product, or engagement. Your typical day will involve collaborating with various teams to define solutions that meet performance, capability, and scalability needs. You will engage in discussions to ensure that the architecture aligns with business objectives and technical requirements, while also addressing any challenges that arise during the development process. Your role will require you to stay updated with the latest technology trends and best practices to ensure that the solutions you propose are innovative and effective.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Facilitate knowledge-sharing sessions to enhance team capabilities.
- Evaluate and recommend new technologies to improve system performance.

Professional & Technical Skills:
- Must Have Skills: Proficiency in SAP Sales and Distribution (SD).
- Strong understanding of system integration and data flow.
- Experience with performance tuning and optimization techniques.
- Familiarity with cloud-based solutions and architecture.
- Ability to create detailed technical documentation and architecture diagrams.

Additional Information:
- The candidate should have a minimum of 7.5 years of experience in SAP Sales and Distribution (SD).
- This position is based at our Gurugram office.
- A 15 years full time education is required.
Posted 1 month ago
3.0 - 8.0 years
4 - 8 Lacs
Bengaluru
Work from Office
Project Role : Data Engineer
Project Role Description : Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems.
Must have skills : Snowflake Data Warehouse
Good to have skills : NA
Minimum 3 year(s) of experience is required
Educational Qualification : 15 years full time education

Summary: As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to migrate and deploy data across various systems. You will collaborate with cross-functional teams to understand data requirements and deliver effective solutions that meet business needs, while also troubleshooting any issues that arise in the data flow.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Required active participation/contribution in team discussions.
- Contribute to providing solutions to work-related problems.
- Assist in the optimization of data pipelines for improved performance and efficiency.
- Collaborate with stakeholders to gather requirements and translate them into technical specifications.

Professional & Technical Skills:
- Must Have Skills: Proficiency in Snowflake Data Warehouse.
- Strong understanding of ETL processes and data integration techniques.
- Experience with data modeling and database design principles.
- Familiarity with data quality frameworks and best practices.
- Knowledge of cloud data warehousing solutions and architecture.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Snowflake Data Warehouse.
- This position is based at our Bengaluru office.
- A 15 years full time education is required.
Posted 1 month ago
15.0 - 20.0 years
4 - 8 Lacs
Chennai
Work from Office
Project Role : Data Engineer
Project Role Description : Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems.
Must have skills : Agile Project Management
Good to have skills : Apache Spark
Minimum 7.5 year(s) of experience is required
Educational Qualification : 15 years full time education

Summary: As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to effectively migrate and deploy data across various systems. You will collaborate with cross-functional teams to understand data requirements and deliver solutions that meet business needs, while also troubleshooting any issues that arise in the data flow and processing stages. Your role will be pivotal in ensuring that data is accessible, reliable, and ready for analysis, contributing to informed decision-making across the organization.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Mentor junior team members to enhance their skills and knowledge in data engineering practices.
- Continuously evaluate and improve data processes to enhance efficiency and effectiveness.

Professional & Technical Skills:
- Must Have Skills: Proficiency in Agile Project Management.
- Good To Have Skills: Experience with Apache Spark, Google Cloud SQL, and Python (Programming Language).
- Strong understanding of data pipeline architecture and design principles.
- Experience with ETL tools and data integration techniques.
- Familiarity with data quality frameworks and best practices.

Additional Information:
- The candidate should have a minimum of 7.5 years of experience in Agile Project Management.
- This position is based in Chennai (mandatory).
- A 15 years full time education is required.
Posted 1 month ago
16.0 - 25.0 years
14 - 19 Lacs
Bengaluru
Work from Office
Skill required: Tech for Operations - Product Development Management
Designation: AI/ML Computational Science Sr Manager
Qualifications: Any Graduation
Years of Experience: 16 to 25 years

About Accenture
Combining unmatched experience and specialized skills across more than 40 industries, we offer Strategy and Consulting, Technology and Operations services, and Accenture Song - all powered by the world's largest network of Advanced Technology and Intelligent Operations centers. Our 699,000 people deliver on the promise of technology and human ingenuity every day, serving clients in more than 120 countries. Visit us at www.accenture.com.

What would you do
You will be part of the Technology for Operations team that acts as a trusted advisor and partner to Accenture Operations. The team provides innovative and secure technologies to help clients build an intelligent operating model, driving exceptional results. We work closely with the sales, offering, and delivery teams to identify and build innovative solutions. Major sub-deals include AHO (Application Hosting Operations), ISMT (Infrastructure Management), and Intelligent Automation. You will manage the end-to-end product development process from conception to design and production start-up, including product structure design, the engineering requirement process, multi-functional resource collaboration, and engineering and supply chain integration.

What are we looking for
- Experience in product management, applying product management principles.
- Experience across multiple domains in launching/acquiring new products and offerings.
- Solid experience in working with client/customer management teams to achieve product objectives.
- Experience in envisioning, assessing, contracting, and onboarding off-the-shelf products to accelerate the goal of establishing a foothold.
- Work with other Product Managers and functional product owners to remove overlap and duplication of functionality and features across the organization.
- Decide on prioritized features as per market, user, customer, and business requirements.
- Work closely with the Product Functional Owner to cull out the requirements for the prioritized functionality.

Roles and Responsibilities:
In this role you are required to identify and assess complex problems for your area(s) of responsibility. You should create solutions in situations in which analysis requires in-depth knowledge of organizational objectives. The role requires involvement in setting strategic direction to establish near-term goals for your area(s) of responsibility. Interaction is with senior management levels at a client and/or within Accenture, involving negotiating or influencing on significant matters. You should have latitude in decision-making and in determining objectives and approaches to critical assignments. Your decisions have a lasting impact on your area of responsibility, with the potential to impact areas outside it. You will manage large teams and/or work efforts (if in an individual contributor role) at a client or within Accenture. Please note that this role may require you to work in rotational shifts.

Qualification: Any Graduation
Posted 1 month ago