
593 Dataflow Jobs

JobPe aggregates results for easy application access; you apply directly on the original job portal.

10.0 years

3 - 5 Lacs

Cochin

On-site

Introduction We are looking for candidates with 10+ years of experience in a data architect role. Responsibilities include: Design and implement scalable, secure, and cost-effective data architectures using GCP. Lead the design and development of data pipelines with BigQuery, Dataflow, and Cloud Storage. Architect and implement data lakes, data warehouses, and real-time data processing solutions on GCP. Ensure data architecture aligns with business goals, governance, and compliance requirements. Collaborate with stakeholders to define data strategy and roadmap. Design and deploy BigQuery solutions for optimized performance and cost efficiency. Build and maintain ETL/ELT pipelines for large-scale data processing. Leverage Cloud Pub/Sub, Dataflow, and Cloud Functions for real-time data integration. Implement best practices for data security, privacy, and compliance in cloud environments. Integrate machine learning workflows with data pipelines and analytics tools. Define data governance frameworks and manage data lineage. Lead data modeling efforts to ensure consistency, accuracy, and performance across systems. Optimize cloud infrastructure for scalability, performance, and reliability. Mentor junior team members and ensure adherence to architectural standards. Collaborate with DevOps teams to implement Infrastructure as Code (Terraform, Cloud Deployment Manager). Ensure high availability and disaster recovery solutions are built into data systems. Conduct technical reviews, audits, and performance tuning for data solutions. Design solutions for multi-region and multi-cloud data architecture. Stay updated on emerging technologies and trends in data engineering and GCP. Drive innovation in data architecture, recommending new tools and services on GCP. Certifications: Google Cloud certification is preferred. Primary Skills: 7+ years of experience in data architecture, with at least 3 years in GCP environments. Expertise in BigQuery, Cloud Dataflow, Cloud Pub/Sub, Cloud Storage, and related GCP services. Strong experience in data warehousing, data lakes, and real-time data pipelines. Proficiency in SQL, Python, or other data processing languages. Experience with cloud security, data governance, and compliance frameworks. Strong problem-solving skills and ability to architect solutions for complex data environments. Google Cloud Certification (Professional Data Engineer, Professional Cloud Architect) preferred. Leadership experience and ability to mentor technical teams. Excellent communication and collaboration skills.
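Not part of the posting: for readers unfamiliar with the real-time integration pattern named above (Cloud Pub/Sub into BigQuery via Dataflow), here is a minimal, hedged sketch of a streaming Apache Beam pipeline in Python. The project, topic, table, and schema names are placeholder assumptions, not details from the role.

    import json
    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    def run():
        # A real Dataflow job would also pass --runner=DataflowRunner,
        # --project, --region and --temp_location.
        options = PipelineOptions(streaming=True)
        with beam.Pipeline(options=options) as p:
            (
                p
                | "ReadEvents" >> beam.io.ReadFromPubSub(topic="projects/my-project/topics/events")  # placeholder topic
                | "Parse" >> beam.Map(lambda msg: json.loads(msg.decode("utf-8")))
                | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                    "my-project:analytics.events",  # placeholder table
                    schema="event_id:STRING,user_id:STRING,event_ts:TIMESTAMP",
                    write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                    create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
                )
            )

    if __name__ == "__main__":
        run()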

Posted 11 hours ago

Apply

3.0 - 10.0 years

5 - 18 Lacs

India

On-site

Overview: We are looking for a skilled GCP Data Engineer with 3 to 10 years of hands-on experience in data ingestion, data engineering, data quality, data governance, and cloud data warehouse implementations using GCP data services. The ideal candidate will be responsible for designing and developing data pipelines, participating in architectural discussions, and implementing data solutions in a cloud environment. Key Responsibilities: ● Collaborate with stakeholders to gather requirements and create high-level and detailed technical designs. ● Develop and maintain data ingestion frameworks and pipelines from various data sources using GCP services. ● Participate in architectural discussions, conduct system analysis, and suggest optimal solutions that are scalable, future-proof, and aligned with business requirements. ● Design data models suitable for both transactional and big data environments, supporting Machine Learning workflows. ● Build and optimize ETL/ELT infrastructure using a variety of data sources and GCP services. ● Develop and implement data and semantic interoperability specifications. ● Work closely with business teams to define and scope requirements. ● Analyze existing systems to identify appropriate data sources and drive continuous improvement. ● Implement and continuously enhance automation processes for data ingestion and data transformation. ● Support DevOps automation efforts to ensure smooth integration and deployment of data pipelines. ● Provide design expertise in Master Data Management (MDM), Data Quality, and Metadata Management. Skills and Qualifications: ● Overall 3-10 years of hands-on experience as a Data Engineer, with at least 2-3 years of direct GCP Data Engineering experience. ● Strong SQL and Python development skills are mandatory. ● Solid experience in data engineering, working with distributed architectures, ETL/ELT, and big data technologies. ● Demonstrated knowledge and experience with Google Cloud BigQuery is a must. ● Experience with Dataproc and Dataflow is highly preferred. ● Strong understanding of serverless data warehousing on GCP and familiarity with DWBI modeling frameworks. ● Extensive experience in SQL across various database platforms. ● Experience with any BI tool is also preferred. ● Experience in data mapping and data modeling. ● Familiarity with data analytics tools and best practices. ● Hands-on experience with one or more programming/scripting languages such as Python, JavaScript, Java, R, or UNIX Shell. ● Practical experience with Google Cloud services including but not limited to: BigQuery, Bigtable; Cloud Dataflow, Cloud Dataproc; Cloud Storage, Pub/Sub; Cloud Functions, Cloud Composer; Cloud Spanner, Cloud SQL. ● Knowledge of modern data mining, cloud computing, and data management tools (such as Hadoop, HDFS, and Spark). ● Familiarity with GCP tools like Looker, Airflow DAGs, Data Studio, App Maker, etc. ● Hands-on experience implementing enterprise-wide cloud data lake and data warehouse solutions on GCP. ● GCP Data Engineer Certification is highly preferred. Job Type: Full-time Pay: ₹500,298.14 - ₹1,850,039.92 per year Benefits: Health insurance Schedule: Rotational shift Work Location: In person
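As an illustration of the ingestion work this role describes (again, not from the posting itself), a minimal sketch that loads newline-delimited JSON from Cloud Storage into BigQuery with the official Python client; the project, bucket, and table names are assumptions.

    from google.cloud import bigquery

    client = bigquery.Client(project="my-project")  # placeholder project id
    job_config = bigquery.LoadJobConfig(
        source_format=bigquery.SourceFormat.NEWLINE_DELIMITED_JSON,
        autodetect=True,  # schema inference keeps the sketch short; production pipelines usually pin schemas
        write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
    )
    load_job = client.load_table_from_uri(
        "gs://my-bucket/raw/orders/*.json",  # placeholder source files
        "my-project.staging.orders",         # placeholder destination table
        job_config=job_config,
    )
    load_job.result()  # block until the load job finishes
    print(client.get_table("my-project.staging.orders").num_rows, "rows in table")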

Posted 11 hours ago

Apply

0 years

12 - 20 Lacs

Gurgaon

Remote

Position: GCP Data Engineer Company Info: Prama (HQ : Chandler, AZ, USA) Prama specializes in AI-powered and Generative AI solutions for Data, Cloud, and APIs. We collaborate with businesses worldwide to develop platforms and AI-powered products that offer valuable insights and drive business growth. Our comprehensive services include architectural assessment, strategy development, and execution to create secure, reliable, and scalable systems. We are experts in creating innovative platforms for various industries. We help clients to overcome complex business challenges. Our team is dedicated to delivering cutting-edge solutions that elevate the digital experience for corporations. Prama is headquartered in Phoenix with offices in USA, Canada, Mexico, Brazil and India. Location: Bengaluru | Gurugram | Hybrid Benefits: 5 Day Working | Career Growth | Flexible working | Potential On-site Opportunity Kindly send your CV or Resume to careers@prama.ai Primary skills: GCP, PySpark, Python, SQL, ETL Job Description: We are seeking a highly skilled and motivated GCP Data Engineer to join our team. As a GCP Data Engineer, you will play a crucial role in designing, developing, and maintaining robust data pipelines and data warehousing solutions on the Google Cloud Platform (GCP). You will work closely with data analysts, data scientists, and other stakeholders to ensure the efficient collection, transformation, and analysis of large datasets. Responsibilities: · Design, develop, and maintain scalable data pipelines using GCP tools such as Dataflow, Dataproc, and Cloud Functions. · Implement ETL processes to extract, transform, and load data from various sources into BigQuery. · Optimize data pipelines for performance, cost-efficiency, and reliability. · Collaborate with data analysts and data scientists to understand their data needs and translate them into technical solutions. · Design and implement data warehouses and data marts using BigQuery. · Model and structure data for optimal performance and query efficiency. · Develop and maintain data quality checks and monitoring processes. · Use SQL and Python (PySpark) to analyze large datasets and generate insights. · Create visualizations using tools like Data Studio or Looker to communicate data findings effectively. · Manage and maintain GCP resources, including virtual machines, storage, and networking. · Implement best practices for security, cost optimization, and scalability. · Automate infrastructure provisioning and management using tools like Terraform. Qualifications: · Strong proficiency in SQL, Python, and PySpark. · Hands-on experience with GCP services, including BigQuery, Dataflow, Dataproc, Cloud Storage, and Cloud Functions. · Experience with data warehousing concepts and methodologies. · Understanding of data modeling techniques and best practices. · Strong analytical and problem-solving skills. · Excellent communication and collaboration skills. · Experience with data quality assurance and monitoring. · Knowledge of cloud security best practices. · A passion for data and a desire to learn new technologies. Preferred Qualifications: · Google Cloud Platform certification. · Experience with machine learning and AI. · Knowledge of data streaming technologies (Kafka, Pub/Sub). 
· Experience with data visualization tools (Looker, Tableau, Data Studio). Job Type: Full-time Pay: ₹1,200,000.00 - ₹2,000,000.00 per year Benefits: Flexible schedule Health insurance Leave encashment Paid sick time Provident Fund Work from home Ability to commute/relocate: Gurugram, Haryana: Reliably commute or planning to relocate before starting work (Required) Application Question(s): CTC, Expected CTC, Notice Period (days), Experience in GCP, Total Experience Work Location: Hybrid remote in Gurugram, Haryana
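To make the GCP / PySpark / SQL / ETL stack above concrete, a small illustrative PySpark aggregation of the kind that could run on Dataproc; the bucket, table, and column names are assumptions, and the BigQuery write presumes the spark-bigquery connector is available on the cluster.

    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("orders-etl").getOrCreate()

    orders = (
        spark.read.option("header", True).csv("gs://my-bucket/raw/orders/")  # placeholder input path
        .withColumn("amount", F.col("amount").cast("double"))
        .withColumn("order_date", F.to_date("order_ts"))
    )
    daily = (
        orders.groupBy("order_date", "country")
        .agg(F.sum("amount").alias("revenue"),
             F.countDistinct("customer_id").alias("customers"))
    )
    (daily.write.format("bigquery")
        .option("table", "my-project.analytics.daily_revenue")  # placeholder output table
        .option("temporaryGcsBucket", "my-temp-bucket")          # staging bucket used by the connector
        .mode("overwrite")
        .save())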

Posted 11 hours ago

Apply

5.0 - 7.0 years

0 Lacs

Trivandrum, Kerala, India

On-site


You are passionate about quality and how customers experience the products you test. You have the ability to create, maintain and execute test plans in order to verify requirements. As a Quality Engineer at Equifax, you will be a catalyst in both the development and the testing of high priority initiatives. You will develop and test new products to support technology operations while maintaining exemplary standards. As a collaborative member of the team, you will deliver QA services (code quality, testing services, performance engineering, development collaboration and continuous integration). You will conduct quality control tests in order to ensure full compliance with specified standards and end user requirements. You will execute tests using established plans and scripts, document problems in an issues log, and retest to ensure problems are resolved. You will create test files to thoroughly test program logic and verify system flow. You will identify, recommend and implement changes to enhance effectiveness of QA strategies. What You Will Do Independently develop scalable and reliable automated tests and frameworks for testing software solutions. Specify and automate test scenarios and test data for a highly complex business by analyzing integration points, data flows, personas, authorization schemes and environments. Develop regression suites, develop automation scenarios, and move automation to an agile continuous testing model. Proactively and collaboratively take part in all testing-related activities while establishing partnerships with key stakeholders in Product, Development/Engineering, and Technology Operations. What Experience You Need Bachelor's degree in a STEM major or equivalent experience 5-7 years of software testing experience Able to create and review test automation according to specifications Ability to write, debug, and troubleshoot code in Java, Spring Boot, TypeScript/JavaScript, HTML, CSS Creation and use of big data processing solutions using Dataflow/Apache Beam, Bigtable, BigQuery, PubSub, GCS, Composer/Airflow, and others with respect to software validation Created test strategies and plans Led complex testing efforts or projects Participated in Sprint Planning as the Test Lead Collaborated with Product Owners, SREs, Technical Architects to define testing strategies and plans Design and development of microservices using Java, Spring Boot, GCP SDKs, GKE/Kubernetes Deploy and release software using Jenkins CI/CD pipelines, understand infrastructure-as-code concepts, Helm Charts, and Terraform constructs Cloud Certification Strongly Preferred What Could Set You Apart An ability to demonstrate successful performance of our Success Profile skills, including: Attention to Detail - Define test case candidates for automation that are outside of product specifications, i.e. negative testing; Create thorough and accurate documentation of all work including status updates to summarize project highlights; validating that processes operate properly and conform to standards Automation - Automate defined test cases and test suites per project Collaboration - Collaborate with Product Owners and development team to plan and assist with user acceptance testing; Collaborate with product owners, development leads and architects on functional and non-functional test strategies and plans Execution - Develop scalable and reliable automated tests; Develop performance testing scripts to assure products are adhering to the documented SLO/SLI/SLAs; Specify the need for test data types for automated testing; Create automated tests and test data for projects; Develop automated regression suites; Integrate automated regression tests into the CI/CD pipeline; Work with teams on E2E testing strategies and plans against multiple product integration points Quality Control - Perform defect analysis, in-depth technical root cause analysis, identifying trends and recommendations to resolve complex functional issues and process improvements; Analyze results of functional and non-functional tests and make recommendations for improvements; Performance / Resilience - Understand application and network architecture as inputs to create performance and resilience test strategies and plans for each product and platform; Conduct the performance and resilience testing to ensure the products meet SLAs / SLOs Quality Focus - Review test cases for complete functional coverage; Review quality section of Production Readiness Review for completeness; Recommend changes to existing testing methodologies for effectiveness and efficiency of product validation; Ensure communications are thorough and accurate for all work documentation including status and project updates Risk Mitigation - Work with Product Owners, QE and development team leads to track and determine prioritization of defect fixes
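For illustration of the test-automation work described above (not part of the posting), a hedged sketch of unit-testing a Beam transform with the framework's built-in test utilities; the validation rule and record fields are invented for the example.

    import apache_beam as beam
    from apache_beam.testing.test_pipeline import TestPipeline
    from apache_beam.testing.util import assert_that, equal_to

    def is_valid(record):
        # hypothetical rule: keep records that carry a positive credit score
        return record.get("credit_score") is not None and record["credit_score"] > 0

    def test_filter_keeps_only_valid_records():
        records = [{"id": 1, "credit_score": 700}, {"id": 2, "credit_score": None}]
        with TestPipeline() as p:
            output = p | beam.Create(records) | beam.Filter(is_valid)
            assert_that(output, equal_to([{"id": 1, "credit_score": 700}]))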

Posted 11 hours ago

Apply

0 years

0 Lacs

India

Remote


Future-Able is looking for a Data Engineer, a full-time contract role, to work for: Naked & Thriving - An organic, botanical skincare brand committed to creating high-performing, naturally derived products that are as kind to the planet as they are to your skin. Our mission is to empower individuals to embrace sustainable self-care while nurturing their natural beauty. Every product we craft reflects our dedication to sustainability, quality, and transparency, ensuring our customers feel confident with every choice they make. As we rapidly grow and expand into new categories, channels, and countries, customer satisfaction remains our top priority. Job Summary: We are seeking a Data Engineer with expertise in Python, exposure to AI & Machine Learning, and a strong understanding of eCommerce analytics to design, develop, and optimize data pipelines. The ideal candidate will work on Google Cloud infrastructure, enabling advanced insights using Google Analytics (GA4). What You Will Do: ● Develop & maintain scalable data pipelines to support analytics and AI-driven models. ● Work with Python (or equivalent programming language) for data processing and transformation. ● Implement AI & Machine Learning techniques for predictive analytics and automation. ● Optimize eCommerce data insights using GA4 and Google Analytics to drive business decisions. ● Build cloud-based data infrastructure leveraging Google Cloud services like BigQuery, Pub/Sub, and Dataflow. ● Ensure data integrity and governance across structured and unstructured datasets. ● Collaborate with cross-functional teams including product managers, analysts, and marketing professionals. ● Monitor & troubleshoot data pipelines to ensure smooth operation and performance. We are looking for: ● Proficiency in Python or a similar language (e.g., Scala). ● Experience with eCommerce analytics and tracking frameworks. ● Expertise in Google Analytics & GA4 for data-driven insights. ● Knowledge of Google Cloud Platform (GCP), including BigQuery, Cloud Functions, and Dataflow. ● Experience in designing, building, and optimizing data pipelines using ETL frameworks. ● Familiarity with data warehousing concepts and SQL-based query optimization. ● Strong problem-solving and communication skills in a fast-paced environment. What will make you stand out: ● Experience with event-driven architecture for real-time data processing. ● Understanding of marketing analytics and attribution modeling. ● Previous work in a high-growth eCommerce environment. ● Exposure to AI & Machine Learning concepts and model deployment. Benefits: ● USD Salary. ● Fully Remote Work. ● USD 50 toward health insurance. ● 30 days of paid time off per year. ● The possibility of being selected for annual bonuses based on business performance and personal achievements.
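To illustrate the GA4-on-BigQuery analysis this role mentions (not from the posting), a minimal query against the standard GA4 export schema via the Python client; the project id and export dataset name (analytics_123456) are placeholders.

    from google.cloud import bigquery

    client = bigquery.Client(project="my-project")  # placeholder project
    sql = """
    SELECT
      PARSE_DATE('%Y%m%d', event_date) AS day,
      COUNTIF(event_name = 'purchase') AS purchases,
      SUM(ecommerce.purchase_revenue) AS revenue
    FROM `my-project.analytics_123456.events_*`
    WHERE _TABLE_SUFFIX BETWEEN '20240101' AND '20240131'
    GROUP BY day
    ORDER BY day
    """
    for row in client.query(sql).result():
        print(row.day, row.purchases, row.revenue)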

Posted 11 hours ago

Apply

5.0 - 7.0 years

0 Lacs

Pune, Maharashtra, India

On-site


You are passionate about quality and how customers experience the products you test. You have the ability to create, maintain and execute test plans in order to verify requirements. As a Quality Engineer at Equifax, you will be a catalyst in both the development and the testing of high priority initiatives. You will develop and test new products to support technology operations while maintaining exemplary standards. As a collaborative member of the team, you will deliver QA services (code quality, testing services, performance engineering, development collaboration and continuous integration). You will conduct quality control tests in order to ensure full compliance with specified standards and end user requirements. You will execute tests using established plans and scripts, document problems in an issues log, and retest to ensure problems are resolved. You will create test files to thoroughly test program logic and verify system flow. You will identify, recommend and implement changes to enhance effectiveness of QA strategies. What You Will Do Independently develop scalable and reliable automated tests and frameworks for testing software solutions. Specify and automate test scenarios and test data for a highly complex business by analyzing integration points, data flows, personas, authorization schemes and environments. Develop regression suites, develop automation scenarios, and move automation to an agile continuous testing model. Proactively and collaboratively take part in all testing-related activities while establishing partnerships with key stakeholders in Product, Development/Engineering, and Technology Operations. What Experience You Need Bachelor's degree in a STEM major or equivalent experience 5-7 years of software testing experience Able to create and review test automation according to specifications Ability to write, debug, and troubleshoot code in Java, Spring Boot, TypeScript/JavaScript, HTML, CSS Creation and use of big data processing solutions using Dataflow/Apache Beam, Bigtable, BigQuery, PubSub, GCS, Composer/Airflow, and others with respect to software validation Created test strategies and plans Led complex testing efforts or projects Participated in Sprint Planning as the Test Lead Collaborated with Product Owners, SREs, Technical Architects to define testing strategies and plans Design and development of microservices using Java, Spring Boot, GCP SDKs, GKE/Kubernetes Deploy and release software using Jenkins CI/CD pipelines, understand infrastructure-as-code concepts, Helm Charts, and Terraform constructs Cloud Certification Strongly Preferred What Could Set You Apart An ability to demonstrate successful performance of our Success Profile skills, including: Attention to Detail - Define test case candidates for automation that are outside of product specifications, i.e. negative testing; Create thorough and accurate documentation of all work including status updates to summarize project highlights; validating that processes operate properly and conform to standards Automation - Automate defined test cases and test suites per project Collaboration - Collaborate with Product Owners and development team to plan and assist with user acceptance testing; Collaborate with product owners, development leads and architects on functional and non-functional test strategies and plans Execution - Develop scalable and reliable automated tests; Develop performance testing scripts to assure products are adhering to the documented SLO/SLI/SLAs; Specify the need for test data types for automated testing; Create automated tests and test data for projects; Develop automated regression suites; Integrate automated regression tests into the CI/CD pipeline; Work with teams on E2E testing strategies and plans against multiple product integration points Quality Control - Perform defect analysis, in-depth technical root cause analysis, identifying trends and recommendations to resolve complex functional issues and process improvements; Analyze results of functional and non-functional tests and make recommendations for improvements; Performance / Resilience - Understand application and network architecture as inputs to create performance and resilience test strategies and plans for each product and platform; Conduct the performance and resilience testing to ensure the products meet SLAs / SLOs Quality Focus - Review test cases for complete functional coverage; Review quality section of Production Readiness Review for completeness; Recommend changes to existing testing methodologies for effectiveness and efficiency of product validation; Ensure communications are thorough and accurate for all work documentation including status and project updates Risk Mitigation - Work with Product Owners, QE and development team leads to track and determine prioritization of defect fixes

Posted 11 hours ago

Apply

10.0 years

0 Lacs

Hyderabad, Telangana, India

On-site


Dear Job Seekers, Greetings from Voice Bay! We are currently hiring for a Machine Learning Engineer. If you are interested, please submit your application. Please find below the JD for your consideration: Work Location – Hyderabad Exp – 4 – 10 Years Work Mode – 5 Days Work From Office Mandatory Key Responsibilities ● Design, develop, and implement end-to-end machine learning models, from initial data exploration and feature engineering to model deployment and monitoring in production environments. ● Build and optimize data pipelines for both structured and unstructured datasets, focusing on advanced data blending, transformation, and cleansing techniques to ensure data quality and readiness for modeling. ● Create, manage, and query complex databases, leveraging various data storage solutions to efficiently extract, transform, and load data for machine learning workflows. ● Collaborate closely with data scientists, software engineers, and product managers to translate business requirements into effective, scalable, and maintainable ML solutions. ● Implement and maintain robust MLOps practices, including version control, model monitoring, logging, and performance evaluation to ensure model reliability and drive continuous improvement. ● Research and experiment with new machine learning techniques, tools, and technologies to enhance our predictive capabilities and operational efficiency. Required Skills & Experience ● 5+ years of hands-on experience in building, training, and deploying machine learning models in a professional, production-oriented setting. ● Demonstrable experience with database creation and advanced querying (e.g., SQL, NoSQL), with a strong understanding of data warehousing concepts. ● Proven expertise in data blending, transformation, and feature engineering, adept at integrating and harmonizing both structured (e.g., relational databases, CSVs) and unstructured (e.g., text, logs, images) data. ● Strong practical experience with cloud platforms for machine learning development and deployment; significant experience with Google Cloud Platform (GCP) services (e.g., Vertex AI, BigQuery, Dataflow) is highly desirable. ● Proficiency in programming languages commonly used in data science (Python preferred; R also considered). ● Solid understanding of various machine learning algorithms (e.g., regression, classification, clustering, dimensionality reduction) and experience with advanced techniques like Deep Learning, Natural Language Processing (NLP), or Computer Vision. ● Experience with machine learning libraries and frameworks (e.g., scikit-learn, TensorFlow, PyTorch). ● Familiarity with MLOps tools and practices, including model versioning, monitoring, A/B testing, and continuous integration/continuous deployment (CI/CD) pipelines. ● Experience with containerization technologies like Docker and orchestration tools like Kubernetes for deploying ML models as REST APIs. ● Proficiency with version control systems (e.g., Git, GitHub/GitLab) for collaborative development. Educational Background ● Bachelor's or Master's degree in Statistics, Mathematics, Computer Science, Engineering, Data Science, or a closely related quantitative field. ● Alternatively, a significant certification in Data Science, Machine Learning, or Cloud AI combined with relevant practical experience will be considered. ● A compelling combination of relevant education and professional experience will also be valued.
Interested candidates can share their resume with the email IDs mentioned below: tarunrai@voicebaysolutions.in or hr@voicebaysolutions.in
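Purely illustrative (not from the posting): a compact sketch of the structured-data modeling and model-persistence workflow the role describes, using scikit-learn; the feature names and the tiny synthetic dataset are invented for the example.

    import joblib
    import pandas as pd
    from sklearn.compose import ColumnTransformer
    from sklearn.ensemble import GradientBoostingClassifier
    from sklearn.pipeline import Pipeline
    from sklearn.preprocessing import OneHotEncoder, StandardScaler

    # tiny synthetic dataset standing in for real features
    df = pd.DataFrame({
        "tenure_months": [3, 24, 60, 1, 36, 12],
        "plan": ["basic", "pro", "pro", "basic", "enterprise", "basic"],
        "churned": [1, 0, 0, 1, 0, 1],
    })
    X, y = df.drop(columns="churned"), df["churned"]

    preprocess = ColumnTransformer([
        ("num", StandardScaler(), ["tenure_months"]),
        ("cat", OneHotEncoder(handle_unknown="ignore"), ["plan"]),
    ])
    model = Pipeline([("prep", preprocess), ("clf", GradientBoostingClassifier())])
    model.fit(X, y)
    print("training accuracy:", model.score(X, y))

    joblib.dump(model, "churn_model_v1.joblib")  # versioned artifact for later serving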

Posted 11 hours ago

Apply

5.0 years

0 Lacs

India

On-site


This posting is for one of our International Clients. About the Role We’re creating a new certification: Inside Gemini: Gen AI Multimodal and Google Intelligence (Google DeepMind). This course is designed for technical learners who want to understand and apply the capabilities of Google’s Gemini models and DeepMind technologies to build powerful, multimodal AI applications. We’re looking for a Subject Matter Expert (SME) who can help shape this course from the ground up. You’ll work closely with a team of learning experience designers, writers, and other collaborators to ensure the course is technically accurate, industry-relevant, and instructionally sound. Responsibilities As the SME, you’ll partner with learning experience designers and content developers to: Translate real-world Gemini and DeepMind applications into accessible, hands-on learning for technical professionals. Guide the creation of labs and projects that allow learners to build pipelines for image-text fusion, deploy Gemini APIs, and experiment with DeepMind’s reinforcement learning libraries. Contribute technical depth across activities, from high-level course structure down to example code, diagrams, voiceover scripts, and data pipelines. Ensure all content reflects current, accurate usage of Google’s multimodal tools and services. Be available during U.S. business hours to support project milestones, reviews, and content feedback. This role is an excellent fit for professionals with deep experience in AI/ML, Google Cloud, and a strong familiarity with multimodal systems and the DeepMind ecosystem. Essential Tools & Platforms A successful SME in this role will demonstrate fluency and hands-on experience with the following: Google Cloud Platform (GCP) Vertex AI (particularly Gemini integration, model tuning, and multimodal deployment) Cloud Functions, Cloud Run (for inference endpoints) BigQuery and Cloud Storage (for handling large image-text datasets) AI Platform Notebooks or Colab Pro Google DeepMind Technologies JAX and Haiku (for neural network modeling and research-grade experimentation) DeepMind Control Suite or DeepMind Lab (for reinforcement learning demonstrations) RLax or TF-Agents (for building and modifying RL pipelines) AI/ML & Multimodal Tooling Gemini APIs and SDKs (image-text fusion, prompt engineering, output formatting) TensorFlow 2.x and PyTorch (for model interoperability) Label Studio, Cloud Vision API (for annotation and image-text preprocessing) Data Science & MLOps DVC or MLflow (for dataset and model versioning) Apache Beam or Dataflow (for processing multimodal input streams) TensorBoard or Weights & Biases (for visualization) Content Authoring & Collaboration GitHub or Cloud Source Repositories Google Docs, Sheets, Slides Screen recording tools like Loom or OBS Studio Required skills and experience: Demonstrated hands-on experience building, deploying, and maintaining sophisticated AI-powered applications using Gemini APIs/SDKs within the Google Cloud ecosystem, especially in Firebase Studio and VS Code. Proficiency in designing and implementing agent-like application patterns, including multi-turn conversational flows, state management, and complex prompting strategies (e.g., Chain-of-Thought, few-shot, zero-shot). Experience integrating Gemini with Google Cloud services (Firestore, Cloud Functions, App Hosting) and external APIs for robust, production-ready solutions.
Proven ability to engineer applications that process, integrate, and generate content across multiple modalities (text, images, audio, video, code) using Gemini’s native multimodal capabilities. Skilled in building and orchestrating pipelines for multimodal data handling, synchronization, and complex interaction patterns within application logic. Experience designing and implementing production-grade RAG systems, including integration with vector databases (e.g., Pinecone, ChromaDB) and engineering data pipelines for indexing and retrieval. Ability to manage agent state, memory, and persistence for multi-turn and long-running interactions. Proficiency leveraging AI-assisted coding features in Firebase Studio (chat, inline code, command execution) and using App Prototyping agents or frameworks like Genkit for rapid prototyping and structuring agentic logic. Strong command of modern development workflows, including Git/GitHub, code reviews, and collaborative development practices. Experience designing scalable, fault-tolerant deployment architectures for multimodal and agentic AI applications using Firebase App Hosting, Cloud Run, or similar serverless/cloud platforms. Advanced MLOps skills, including monitoring, logging, alerting, and versioning for generative AI systems and agents. Deep understanding of security best practices: prompt injection mitigation (across modalities), secure API key management, authentication/authorization, and data privacy. Demonstrated ability to engineer for responsible AI, including bias detection, fairness, transparency, and implementation of safety mechanisms in agentic and multimodal applications. Experience addressing ethical challenges in the deployment and operation of advanced AI systems. Proven success designing, reviewing, and delivering advanced, project-based curriculum and hands-on labs for experienced software developers and engineers. Ability to translate complex engineering concepts (RAG, multimodal integration, agentic patterns, MLOps, security, responsible AI) into clear, actionable learning materials and real-world projects. 5+ years of professional experience in AI-powered application development, with a focus on generative and multimodal AI. Strong programming skills in Python and JavaScript/TypeScript; experience with modern frameworks and cloud-native development. Bachelor’s or Master’s degree in Computer Science, Data Engineering, AI, or a related technical field. Ability to explain advanced technical concepts (e.g., fusion transformers, multimodal embeddings, RAG workflows) to learners in an accessible way. Strong programming experience in Python and experience deploying machine learning pipelines. Ability to work independently, take ownership of deliverables, and collaborate closely with designers and project managers. Preferred: Experience with Google DeepMind tools (JAX, Haiku, RLax, DeepMind Control Suite/Lab) and reinforcement learning pipelines. Familiarity with open data formats (Delta, Parquet, Iceberg) and scalable data engineering practices. Prior contributions to open-source AI projects or technical community engagement.
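As a minimal, hedged sketch of the retrieval step in the RAG systems mentioned above (not part of the posting): rank documents against a query by cosine similarity over embeddings. Random vectors stand in for real embeddings, which in practice would come from an embedding model and live in a vector database such as those named in the listing.

    import numpy as np

    rng = np.random.default_rng(0)
    docs = ["refund policy", "shipping times", "warranty terms"]  # placeholder corpus
    doc_vecs = rng.normal(size=(len(docs), 768))  # stand-in document embeddings
    query_vec = rng.normal(size=768)              # stand-in query embedding

    def cosine(a, b):
        return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

    scores = [cosine(query_vec, v) for v in doc_vecs]
    best = int(np.argmax(scores))
    print("retrieved context:", docs[best], "score:", round(scores[best], 3))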

Posted 11 hours ago

Apply

5.0 - 8.0 years

0 Lacs

Pune, Maharashtra, India

On-site


Title - QA Manual Testing Experience - 5-8 Years Location - Pune & Gurgaon (Hybrid) Key Responsibilities: Understand business requirements and data flows to create comprehensive test plans and test cases for ETL jobs. Perform data validation and reconciliation between source systems, staging, and target data stores (DWH, data lakes, etc.). Develop and execute automated and manual tests to ensure data accuracy and quality. Work with SQL queries to validate data transformations and detect anomalies. Identify, document, and track defects and inconsistencies in data processing. Collaborate with data engineering and BI teams to improve ETL processes and data pipelines. Maintain QA documentation and contribute to continuous process improvements. Must Have Skills: Strong SQL skills – ability to write complex queries for data validation and transformation testing. Hands-on experience in ETL testing – validating data pipelines, transformations, and data loads. Knowledge of data warehousing concepts – dimensions, facts, slowly changing dimensions (SCD), etc. Experience in test case design, execution, and defect tracking. Experience with QA tools like JIRA, TestRail, or equivalent. Ability to work independently and collaboratively in an Agile/Scrum environment. Good to Have Skills: Experience with ETL tools like Informatica, Talend, DataStage, or Azure/AWS/GCP native ETL services (e.g., Dataflow, Glue). Knowledge of automation frameworks using Python/Selenium/pytest or similar tools for data testing. Familiarity with cloud data platforms – Snowflake, BigQuery, Redshift, etc. Basic understanding of CI/CD pipelines and QA integration. Exposure to data quality tools such as Great Expectations, Deequ, or DQ frameworks. Understanding of reporting/BI tools such as Power BI, Tableau, or Looker. Educational Qualifications: Bachelor’s or Master’s degree in Computer Science, Information Technology, or related field.
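For illustration of the source-to-target reconciliation described above (not from the posting), a small sketch that compares row counts and a column checksum between staging and warehouse tables; the connection string, table names, and amount column are assumptions, and any SQLAlchemy-compatible engine would do.

    import pandas as pd
    from sqlalchemy import create_engine

    engine = create_engine("postgresql://user:pass@host/db")  # placeholder connection

    def profile(table: str) -> pd.Series:
        sql = f"SELECT COUNT(*) AS row_count, SUM(amount) AS amount_sum FROM {table}"
        return pd.read_sql(sql, engine).iloc[0]

    source, target = profile("staging.orders"), profile("dwh.fact_orders")
    assert source["row_count"] == target["row_count"], "row count mismatch"
    assert abs(source["amount_sum"] - target["amount_sum"]) < 0.01, "amount checksum mismatch"
    print("reconciliation passed")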

Posted 11 hours ago

Apply

7.0 years

0 Lacs

India

Remote


About Lemongrass Lemongrass is a software-enabled services provider, synonymous with SAP on Cloud, focused on delivering superior, highly automated Managed Services to Enterprise customers. Our customers span multiple verticals and geographies across the Americas, EMEA and APAC. We partner with AWS, SAP, Microsoft and other global technology leaders. We are seeking an experienced Cloud Data Engineer with a strong background in AWS, Azure, and GCP. The ideal candidate will have extensive experience with cloud-native ETL tools such as AWS DMS, AWS Glue, Kafka, Azure Data Factory, GCP Dataflow, and other ETL tools like Informatica, SAP Data Intelligence, etc. You will be responsible for designing, implementing, and maintaining robust data pipelines and building scalable data lakes. Experience with various data platforms like Redshift, Snowflake, Databricks, Synapse, and others is essential. Familiarity with data extraction from SAP or ERP systems is a plus. Key Responsibilities: Design and Development: Design, develop, and maintain scalable ETL pipelines using cloud-native tools (AWS DMS, AWS Glue, Kafka, Azure Data Factory, GCP Dataflow, etc.). Architect and implement data lakes and data warehouses on cloud platforms (AWS, Azure, GCP). Develop and optimize data ingestion, transformation, and loading processes using Databricks, Snowflake, Redshift, BigQuery and Azure Synapse. Implement ETL processes using tools like Informatica, SAP Data Intelligence, and others. Develop and optimize data processing jobs using Spark (Scala). Data Integration and Management: Integrate various data sources, including relational databases, APIs, unstructured data, and ERP systems into the data lake. Ensure data quality and integrity through rigorous testing and validation. Perform data extraction from SAP or ERP systems when necessary. Performance Optimization: Monitor and optimize the performance of data pipelines and ETL processes. Implement best practices for data management, including data governance, security, and compliance. Collaboration and Communication: Work closely with data scientists, analysts, and other stakeholders to understand data requirements and deliver solutions. Collaborate with cross-functional teams to design and implement data solutions that meet business needs. Documentation and Maintenance: Document technical solutions, processes, and workflows. Maintain and troubleshoot existing ETL pipelines and data integrations. Qualifications Education: Bachelor’s degree in Computer Science, Information Technology, or a related field. Advanced degrees are a plus. Experience: 7+ years of experience as a Data Engineer or in a similar role. Proven experience with cloud platforms: AWS, Azure, and GCP. Hands-on experience with cloud-native ETL tools such as AWS DMS, AWS Glue, Kafka, Azure Data Factory, GCP Dataflow, etc. Experience with other ETL tools like Informatica, SAP Data Intelligence, etc. Experience in building and managing data lakes and data warehouses. Proficiency with data platforms like Redshift, Snowflake, BigQuery, Databricks, and Azure Synapse. Experience with data extraction from SAP or ERP systems is a plus. Strong experience with Spark and Scala for data processing. Skills: Strong programming skills in Python, Java, or Scala. Proficient in SQL and query optimization techniques. Familiarity with data modeling, ETL/ELT processes, and data warehousing concepts. Knowledge of data governance, security, and compliance best practices.
Excellent problem-solving and analytical skills. Strong communication and collaboration skills. Preferred Qualifications: Experience with other data tools and technologies such as Apache Spark or Hadoop. Certifications in cloud platforms (AWS Certified Data Analytics – Specialty, Google Professional Data Engineer, Microsoft Certified: Azure Data Engineer Associate). Experience with CI/CD pipelines and DevOps practices for data engineering. The selected applicant will be subject to a background investigation, which will be conducted and the results of which will be used in compliance with applicable law. What we offer in return: Remote Working: Lemongrass always has been and always will offer 100% remote work Flexibility: Work where and when you like most of the time Training: A subscription to A Cloud Guru and generous budget for taking certifications and other resources you’ll find helpful State of the art tech: An opportunity to learn and run the latest industry standard tools Team: Colleagues who will challenge you, giving the chance to learn from them and them from you Lemongrass Consulting is proud to be an Equal Opportunity and Affirmative Action employer. We do not discriminate on the basis of race, religion, color, national origin, religious creed, gender, sexual orientation, gender identity, gender expression, age, genetic information, status as a protected veteran, status as an individual with a disability, or other applicable legally protected characteristics.

Posted 12 hours ago

Apply

8.0 years

0 Lacs

Pune, Maharashtra, India

On-site


Job Title: Data Modeler / Data Analyst Experience: 6-8 Years Location: Pune Job Summary We are looking for a seasoned Data Modeler / Data Analyst to design and implement scalable, reusable logical and physical data models on Google Cloud Platform (primarily BigQuery). You will partner closely with data engineers, analytics teams, and business stakeholders to translate complex business requirements into performant data models that power reporting, self-service analytics, and advanced data science workloads. Key Responsibilities · Gather and analyze business requirements to translate them into conceptual, logical, and physical data models on GCP (BigQuery, Cloud SQL, Cloud Spanner, etc.). · Design star/snowflake schemas, data vaults, and other modeling patterns that balance performance, flexibility, and cost. · Implement partitioning, clustering, and materialized views in BigQuery to optimize query performance and cost efficiency. · Establish and maintain data modeling standards, naming conventions, and metadata documentation to ensure consistency across analytic and reporting layers. · Collaborate with data engineers to define ETL/ELT pipelines and ensure data models align with ingestion and transformation strategies (Dataflow, Cloud Composer, Dataproc, dbt). · Validate data quality and lineage; work with BI developers and analysts to troubleshoot performance issues or data anomalies. · Conduct impact assessments for schema changes and guide version-control processes for data models. · Mentor junior analysts/engineers on data modeling best practices and participate in code/design reviews. · Contribute to capacity planning and cost-optimization recommendations for BigQuery datasets and reservations. Must-Have Skills · 6-8 years of hands-on experience in data modeling, data warehousing, or database design, including at least 2 years on GCP BigQuery. · Proficiency in dimensional modeling, 3NF, and modern patterns such as data vault. · Expert SQL skills with demonstrable ability to optimize complex analytical queries on BigQuery (partitioning, clustering, sharding strategies). · Strong understanding of ETL/ELT concepts and experience working with tools such as Dataflow, Cloud Composer, or dbt. · Familiarity with BI/reporting tools (Looker, Tableau, Power BI, or similar) and how model design impacts dashboard performance. · Experience with data governance practices: data cataloging, lineage, and metadata management (e.g., Data Catalog). · Excellent communication skills to translate technical concepts into business-friendly language and collaborate across functions. Good to Have · Experience working with Azure Cloud (Fabric, Synapse, Delta Lake) Education · Bachelor’s or Master’s degree in Computer Science, Information Systems, Engineering, Statistics, or a related field. Equivalent experience will be considered.
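A hedged sketch of the partitioning and clustering decisions described above, expressed as BigQuery DDL submitted through the Python client (not taken from the posting); the dataset, table, and column names are assumptions. Partitioning on the event date and clustering on common filter columns is the usual lever for both query performance and cost.

    from google.cloud import bigquery

    client = bigquery.Client(project="my-project")  # placeholder project
    ddl = """
    CREATE TABLE IF NOT EXISTS `my-project.analytics.fact_sales`
    (
      sale_id     STRING,
      customer_id STRING,
      sale_ts     TIMESTAMP,
      region      STRING,
      amount      NUMERIC
    )
    PARTITION BY DATE(sale_ts)       -- prunes scanned bytes (and cost) by date
    CLUSTER BY region, customer_id   -- co-locates rows on common filter columns
    """
    client.query(ddl).result()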

Posted 13 hours ago

Apply

2.0 years

0 Lacs

Trivandrum, Kerala, India

On-site


Equifax is where you can power your possible. If you want to achieve your true potential, chart new paths, develop new skills, collaborate with bright minds, and make a meaningful impact, we want to hear from you. What You’ll Do As a Senior Developer at Equifax, you will oversee and steer the delivery of innovative batch and data products, primarily leveraging Java and the Google Cloud Platform (GCP). Your expertise will ensure the efficient and timely deployment of high-quality data solutions that support our business objectives and client needs. Project Development: Development and deployment of batch and data products, ensuring alignment with business goals and technical requirements. Technical Oversight: Provide technical direction and oversee the implementation of solutions using Java and GCP, ensuring best practices in coding, performance, and security. Team Management: Mentor and guide junior developers and engineers, fostering a collaborative and high-performance environment. Stakeholder Collaboration: Work closely with cross-functional teams including product managers, business analysts, and other stakeholders to gather requirements and translate them into technical solutions. Documentation and Compliance: Maintain comprehensive documentation for all technical processes, ensuring compliance with internal and external standards. Continuous Improvement: Advocate for and implement continuous improvement practices within the team, staying abreast of emerging technologies and methodologies. What Experience You Need Education: Bachelor's degree in Computer Science, Information Technology, or a related field. Experience: Minimum of 2-5 years of relevant experience in software development, with a focus on batch processing and data solutions. Technical Skills: Proficiency in Java, with a strong understanding of its ecosystems. 2+ years of relevant experience with Java, Spring, Spring Boot, REST, Microservices, Hibernate, JPA, RDBMS. Minimum 2 years with Git, CI/CD pipelines, Jenkins. Experience with Google Cloud Platform, including BigQuery, Cloud Storage, Dataflow, Bigtable and other GCP tools. Familiarity with ETL processes, data modeling, and SQL. Soft Skills: Strong problem-solving abilities and a proactive approach to project management. Effective communication and interpersonal skills, with the ability to convey technical concepts to non-technical stakeholders. What Could Set You Apart Technical Skills: Proficiency in Java, with a strong understanding of its ecosystems. 2+ years of relevant experience with Java, Spring, Spring Boot, REST, Microservices, Hibernate, JPA, RDBMS. Minimum 2 years with Git, CI/CD pipelines, Jenkins. Experience with Google Cloud Platform, including BigQuery, Cloud Storage, Dataflow, Bigtable and other GCP tools. Familiarity with ETL processes, data modeling, and SQL. Certification in Google Cloud (e.g., Associate Cloud Engineer). Experience with other cloud platforms (AWS, Azure) is a plus. Understanding of data privacy regulations and compliance frameworks. We offer a hybrid work setting, comprehensive compensation and healthcare packages, attractive paid time off, and organizational growth potential through our online learning platform with guided career tracks. Are you ready to power your possible? Apply today, and get started on a path toward an exciting new career at Equifax, where you can make a difference! Who is Equifax? At Equifax, we believe knowledge drives progress.
As a global data, analytics and technology company, we play an essential role in the global economy by helping employers, employees, financial institutions and government agencies make critical decisions with greater confidence. We work to help create seamless and positive experiences during life’s pivotal moments: applying for jobs or a mortgage, financing an education or buying a car. Our impact is real and to accomplish our goals we focus on nurturing our people for career advancement and their learning and development, supporting our next generation of leaders, maintaining an inclusive and diverse work environment, and regularly engaging and recognizing our employees. Regardless of location or role, the individual and collective work of our employees makes a difference and we are looking for talented team players to join us as we help people live their financial best. Equifax is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, or status as a protected veteran.

Posted 13 hours ago

Apply

5.0 years

0 Lacs

Pune, Maharashtra, India

On-site


Equifax is where you can power your possible. If you want to achieve your true potential, chart new paths, develop new skills, collaborate with bright minds, and make a meaningful impact, we want to hear from you. Equifax is seeking creative, high-energy and driven software engineers with hands-on development skills to work on a variety of meaningful projects. Our software engineering positions provide you the opportunity to join a team of talented engineers working with leading-edge technology. You are ideal for this position if you are a forward-thinking, committed, and enthusiastic software engineer who is passionate about technology. What You’ll Do Design, develop, and operate high scale applications across the full engineering stack Design, develop, test, deploy, maintain, and improve software. Apply modern software development practices (serverless computing, microservices architecture, CI/CD, infrastructure-as-code, etc.) Work across teams to integrate our systems with existing internal systems, Data Fabric, CSA Toolset. Participate in technology roadmap and architecture discussions to turn business requirements and vision into reality. Participate in a tight-knit, globally distributed engineering team. Triage product or system issues and debug/track/resolve by analyzing the sources of issues and the impact on network, or service operations and quality. Manage sole project priorities, deadlines, and deliverables. Research, create, and develop software applications to extend and improve on Equifax Solutions Collaborate on scalability issues involving access to data and information. Actively participate in Sprint planning, Sprint Retrospectives, and other team activity What Experience You Need Bachelor's degree or equivalent experience 5+ years of software engineering experience 5+ years experience writing, debugging, and troubleshooting code in mainstream Java, SpringBoot, TypeScript/JavaScript, HTML, CSS 5+ years experience with Cloud technology: GCP, AWS, or Azure 5+ years experience designing and developing cloud-native solutions 5+ years experience designing and developing microservices using Java, SpringBoot, GCP SDKs, GKE/Kubernetes 5+ years experience deploying and releasing software using Jenkins CI/CD pipelines, understand infrastructure-as-code concepts, Helm Charts, and Terraform constructs What could set you apart Self-starter that identifies/responds to priority shifts with minimal supervision. Experience designing and developing big data processing solutions using Dataflow/Apache Beam, Bigtable, BigQuery, PubSub, GCS, Composer/Airflow, and others UI development (e.g. HTML, JavaScript, Angular and Bootstrap) Experience with backend technologies such as JAVA/J2EE, SpringBoot, SOA and Microservices Source code control management systems (e.g. SVN/Git, Github) and build tools like Maven & Gradle. Agile environments (e.g. Scrum, XP) Relational databases (e.g. SQL Server, MySQL) Atlassian tooling (e.g. JIRA, Confluence, and Github) Developing with modern JDK (v1.7+) Automated Testing: JUnit, Selenium, LoadRunner, SoapUI We offer a hybrid work setting, comprehensive compensation and healthcare packages, attractive paid time off, and organizational growth potential through our online learning platform with guided career tracks. Are you ready to power your possible? Apply today, and get started on a path toward an exciting new career at Equifax, where you can make a difference! Who is Equifax? At Equifax, we believe knowledge drives progress. 
As a global data, analytics and technology company, we play an essential role in the global economy by helping employers, employees, financial institutions and government agencies make critical decisions with greater confidence. We work to help create seamless and positive experiences during life’s pivotal moments: applying for jobs or a mortgage, financing an education or buying a car. Our impact is real and to accomplish our goals we focus on nurturing our people for career advancement and their learning and development, supporting our next generation of leaders, maintaining an inclusive and diverse work environment, and regularly engaging and recognizing our employees. Regardless of location or role, the individual and collective work of our employees makes a difference and we are looking for talented team players to join us as we help people live their financial best. Equifax is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, or status as a protected veteran.

Posted 14 hours ago

Apply

0 years

0 Lacs

Bengaluru, Karnataka, India

On-site


We are looking for a highly skilled and motivated Data Scientist with deep experience in building recommendation systems to join our team. This role demands expertise in deep learning, embedding-based retrieval, and the Google Cloud Platform (GCP). You will play a critical role in developing intelligent systems that enhance user experiences through personalized content discovery. Key Responsibilities: Develop, train, and deploy recommendation models using two-tower, multi-tower, and cross-encoder architectures. Generate and utilize text/image embeddings (e.g., CLIP, BERT, Sentence Transformers) for content-based recommendations. Design semantic similarity search pipelines using vector databases (FAISS, ScaNN, Qdrant, Matching Engine). Create and manage scalable ML pipelines using Vertex AI, Kubeflow Pipelines, and GKE. Handle large-scale data preparation and feature engineering using Dataproc (PySpark) and Dataflow. Implement cold-start strategies leveraging metadata and multimodal embeddings. Work on user modeling, temporal personalization, and re-ranking strategies. Run A/B tests and interpret results to measure real-world impact. Collaborate with cross-functional teams (Engineering, Product, DevOps) for model deployment and monitoring. Must-Have Skills: Strong command of Python and ML libraries: pandas, polars, numpy, scikit-learn, matplotlib, tensorflow, torch, transformers. Deep understanding of modern recommender systems and embedding-based retrieval. Experience with TensorFlow, Keras, or PyTorch for building deep learning models. Hands-on with semantic search, ANN search, and real-time vector matching. Proven experience with Vertex AI, Kubeflow on GKE, and ML pipeline orchestration. Familiarity with vector DBs such as Qdrant, FAISS, ScaNN, or Matching Engine on GCP. Experience in deploying models via Vertex AI Online Prediction, TF Serving, or Cloud Run. Knowledge of feature stores, embedding versioning, and MLOps practices (CI/CD, monitoring). Preferred / Good to Have: Experience with ranking models (e.g., XGBoost, LightGBM, DLRM) for candidate scoring. Exposure to LLM-powered personalization or hybrid retrieval systems. Familiarity with streaming pipelines using Pub/Sub, Dataflow, Cloud Functions. Hands-on with multi-modal retrieval (text + image + tabular data). Strong grasp of cold-start problem solving, using enriched metadata and embeddings. GCP Stack You’ll Work With: ML & Pipelines: Vertex AI, Vertex Pipelines, Kubeflow on GKE Embedding & Retrieval: Matching Engine, Qdrant, FAISS, ScaNN, Milvus Processing: Dataproc (PySpark), Dataflow Ingestion & Serving: Pub/Sub, Cloud Functions, Cloud Run, TF Serving CI/CD & Automation: GitHub Actions, GitLab CI, Terraform
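To illustrate the two-tower architecture named above (a minimal sketch, not from the posting), separate user and item embedding towers whose dot product scores relevance; the vocabulary sizes and embedding dimension are invented, and a real system would train on interaction data and serve the item tower through an ANN index.

    import tensorflow as tf

    num_users, num_items, dim = 10_000, 50_000, 64  # illustrative sizes

    user_id = tf.keras.Input(shape=(1,), dtype=tf.int32, name="user_id")
    item_id = tf.keras.Input(shape=(1,), dtype=tf.int32, name="item_id")

    # user tower and item tower, each mapping an id to a dense embedding
    user_vec = tf.keras.layers.Flatten()(tf.keras.layers.Embedding(num_users, dim)(user_id))
    item_vec = tf.keras.layers.Flatten()(tf.keras.layers.Embedding(num_items, dim)(item_id))

    score = tf.keras.layers.Dot(axes=1)([user_vec, item_vec])  # similarity logit
    model = tf.keras.Model([user_id, item_id], score)
    model.compile(optimizer="adam",
                  loss=tf.keras.losses.BinaryCrossentropy(from_logits=True))
    model.summary()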

Posted 15 hours ago

Apply

10.0 years

0 Lacs

Kochi, Kerala, India

On-site


Introduction We are looking for candidates with 10+ years of experience in a data architect role. Responsibilities include: Design and implement scalable, secure, and cost-effective data architectures using GCP. Lead the design and development of data pipelines with BigQuery, Dataflow, and Cloud Storage. Architect and implement data lakes, data warehouses, and real-time data processing solutions on GCP. Ensure data architecture aligns with business goals, governance, and compliance requirements. Collaborate with stakeholders to define data strategy and roadmap. Design and deploy BigQuery solutions for optimized performance and cost efficiency. Build and maintain ETL/ELT pipelines for large-scale data processing. Leverage Cloud Pub/Sub, Dataflow, and Cloud Functions for real-time data integration. Implement best practices for data security, privacy, and compliance in cloud environments. Integrate machine learning workflows with data pipelines and analytics tools. Define data governance frameworks and manage data lineage. Lead data modeling efforts to ensure consistency, accuracy, and performance across systems. Optimize cloud infrastructure for scalability, performance, and reliability. Mentor junior team members and ensure adherence to architectural standards. Collaborate with DevOps teams to implement Infrastructure as Code (Terraform, Cloud Deployment Manager). Ensure high availability and disaster recovery solutions are built into data systems. Conduct technical reviews, audits, and performance tuning for data solutions. Design solutions for multi-region and multi-cloud data architecture. Stay updated on emerging technologies and trends in data engineering and GCP. Drive innovation in data architecture, recommending new tools and services on GCP. Certifications: Google Cloud certification is preferred. Primary Skills: 7+ years of experience in data architecture, with at least 3 years in GCP environments. Expertise in BigQuery, Cloud Dataflow, Cloud Pub/Sub, Cloud Storage, and related GCP services. Strong experience in data warehousing, data lakes, and real-time data pipelines. Proficiency in SQL, Python, or other data processing languages. Experience with cloud security, data governance, and compliance frameworks. Strong problem-solving skills and ability to architect solutions for complex data environments. Google Cloud Certification (Professional Data Engineer, Professional Cloud Architect) preferred. Leadership experience and ability to mentor technical teams. Excellent communication and collaboration skills.

Posted 17 hours ago

Apply

10.0 years

0 Lacs

Trivandrum, Kerala, India

On-site


Introduction We are looking for candidates with 10+ years of experience in a data architect role. Responsibilities include: Design and implement scalable, secure, and cost-effective data architectures using GCP. Lead the design and development of data pipelines with BigQuery, Dataflow, and Cloud Storage. Architect and implement data lakes, data warehouses, and real-time data processing solutions on GCP. Ensure data architecture aligns with business goals, governance, and compliance requirements. Collaborate with stakeholders to define data strategy and roadmap. Design and deploy BigQuery solutions for optimized performance and cost efficiency. Build and maintain ETL/ELT pipelines for large-scale data processing. Leverage Cloud Pub/Sub, Dataflow, and Cloud Functions for real-time data integration. Implement best practices for data security, privacy, and compliance in cloud environments. Integrate machine learning workflows with data pipelines and analytics tools. Define data governance frameworks and manage data lineage. Lead data modeling efforts to ensure consistency, accuracy, and performance across systems. Optimize cloud infrastructure for scalability, performance, and reliability. Mentor junior team members and ensure adherence to architectural standards. Collaborate with DevOps teams to implement Infrastructure as Code (Terraform, Cloud Deployment Manager). Ensure high availability and disaster recovery solutions are built into data systems. Conduct technical reviews, audits, and performance tuning for data solutions. Design solutions for multi-region and multi-cloud data architecture. Stay updated on emerging technologies and trends in data engineering and GCP. Drive innovation in data architecture, recommending new tools and services on GCP. Certifications: Google Cloud Certification is preferred. Primary Skills: 7+ years of experience in data architecture, with at least 3 years in GCP environments. Expertise in BigQuery, Cloud Dataflow, Cloud Pub/Sub, Cloud Storage, and related GCP services. Strong experience in data warehousing, data lakes, and real-time data pipelines. Proficiency in SQL, Python, or other data processing languages. Experience with cloud security, data governance, and compliance frameworks. Strong problem-solving skills and ability to architect solutions for complex data environments. Google Cloud Certification (Professional Data Engineer, Professional Cloud Architect) preferred. Leadership experience and ability to mentor technical teams. Excellent communication and collaboration skills.
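To make the real-time integration piece above concrete, here is a hedged sketch of a streaming Dataflow pipeline that reads events from Pub/Sub and appends them to BigQuery; the topic, table, bucket, and schema are placeholders invented for the example:

    # Sketch: Pub/Sub -> Dataflow (Apache Beam) -> BigQuery streaming ingestion.
    import json
    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    options = PipelineOptions(
        streaming=True,
        runner="DataflowRunner",
        project="my-project",
        region="us-central1",
        temp_location="gs://my-bucket/tmp",
    )

    with beam.Pipeline(options=options) as p:
        (
            p
            | "ReadEvents" >> beam.io.ReadFromPubSub(topic="projects/my-project/topics/events")
            | "ParseJson" >> beam.Map(lambda msg: json.loads(msg.decode("utf-8")))
            | "WriteToBQ" >> beam.io.WriteToBigQuery(
                "my-project:analytics.events",
                schema="event_id:STRING,user_id:STRING,event_ts:TIMESTAMP",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            )
        )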

Posted 17 hours ago

Apply

0.0 years

0 Lacs

Gurugram, Haryana

Remote


Position: GCP Data Engineer Company Info: Prama (HQ : Chandler, AZ, USA) Prama specializes in AI-powered and Generative AI solutions for Data, Cloud, and APIs. We collaborate with businesses worldwide to develop platforms and AI-powered products that offer valuable insights and drive business growth. Our comprehensive services include architectural assessment, strategy development, and execution to create secure, reliable, and scalable systems. We are experts in creating innovative platforms for various industries. We help clients to overcome complex business challenges. Our team is dedicated to delivering cutting-edge solutions that elevate the digital experience for corporations. Prama is headquartered in Phoenix with offices in USA, Canada, Mexico, Brazil and India. Location: Bengaluru | Gurugram | Hybrid Benefits: 5 Day Working | Career Growth | Flexible working | Potential On-site Opportunity Kindly send your CV or Resume to careers@prama.ai Primary skills: GCP, PySpark, Python, SQL, ETL Job Description: We are seeking a highly skilled and motivated GCP Data Engineer to join our team. As a GCP Data Engineer, you will play a crucial role in designing, developing, and maintaining robust data pipelines and data warehousing solutions on the Google Cloud Platform (GCP). You will work closely with data analysts, data scientists, and other stakeholders to ensure the efficient collection, transformation, and analysis of large datasets. Responsibilities: · Design, develop, and maintain scalable data pipelines using GCP tools such as Dataflow, Dataproc, and Cloud Functions. · Implement ETL processes to extract, transform, and load data from various sources into BigQuery. · Optimize data pipelines for performance, cost-efficiency, and reliability. · Collaborate with data analysts and data scientists to understand their data needs and translate them into technical solutions. · Design and implement data warehouses and data marts using BigQuery. · Model and structure data for optimal performance and query efficiency. · Develop and maintain data quality checks and monitoring processes. · Use SQL and Python (PySpark) to analyze large datasets and generate insights. · Create visualizations using tools like Data Studio or Looker to communicate data findings effectively. · Manage and maintain GCP resources, including virtual machines, storage, and networking. · Implement best practices for security, cost optimization, and scalability. · Automate infrastructure provisioning and management using tools like Terraform. Qualifications: · Strong proficiency in SQL, Python, and PySpark. · Hands-on experience with GCP services, including BigQuery, Dataflow, Dataproc, Cloud Storage, and Cloud Functions. · Experience with data warehousing concepts and methodologies. · Understanding of data modeling techniques and best practices. · Strong analytical and problem-solving skills. · Excellent communication and collaboration skills. · Experience with data quality assurance and monitoring. · Knowledge of cloud security best practices. · A passion for data and a desire to learn new technologies. Preferred Qualifications: · Google Cloud Platform certification. · Experience with machine learning and AI. · Knowledge of data streaming technologies (Kafka, Pub/Sub). 
· Experience with data visualization tools (Looker, Tableau, Data Studio). Job Type: Full-time Pay: ₹1,200,000.00 - ₹2,000,000.00 per year Benefits: Flexible schedule Health insurance Leave encashment Paid sick time Provident Fund Work from home Ability to commute/relocate: Gurugram, Haryana: Reliably commute or planning to relocate before starting work (Required) Application Question(s): CTC Expected CTC Notice Period (days) Experience in GCP Total Experience Work Location: Hybrid remote in Gurugram, Haryana
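For context on the Dataproc/PySpark pipelines this role mentions, a minimal sketch might look like the following; the bucket, table names, and the availability of the spark-bigquery connector on the cluster are all assumptions:

    # Sketch: PySpark ETL on Dataproc - read CSV from GCS, aggregate, load into BigQuery.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("orders_etl").getOrCreate()

    orders = (
        spark.read.option("header", True)
        .csv("gs://my-bucket/raw/orders/*.csv")
        .withColumn("amount", F.col("amount").cast("double"))
    )

    daily = (
        orders.groupBy("customer_id", F.to_date("order_ts").alias("order_date"))
        .agg(F.sum("amount").alias("daily_revenue"))
    )

    (
        daily.write.format("bigquery")            # requires the spark-bigquery connector
        .option("table", "my_project.analytics.daily_revenue")
        .option("temporaryGcsBucket", "my-bucket-tmp")
        .mode("overwrite")
        .save()
    )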

Posted 18 hours ago

Apply

5.0 - 7.0 years

0 Lacs

Trivandrum, Kerala, India

On-site


Equifax is where you can power your possible. If you want to achieve your true potential, chart new paths, develop new skills, collaborate with bright minds, and make a meaningful impact, we want to hear from you. You are passionate about quality and how customers experience the products you test. You have the ability to create, maintain and execute test plans in order to verify requirements. As a Quality Engineer at Equifax, you will be a catalyst in both the development and the testing of high priority initiatives. You will develop and test new products to support technology operations while maintaining exemplary standards. As a collaborative member of the team, you will deliver QA services (code quality, testing services, performance engineering, development collaboration and continuous integration). You will conduct quality control tests in order to ensure full compliance with specified standards and end user requirements. You will execute tests using established plans and scripts; document problems in an issues log and retest to ensure problems are resolved. You will create test files to thoroughly test program logic and verify system flow. You will identify, recommend and implement changes to enhance effectiveness of QA strategies. What You Will Do Independently develop scalable and reliable automated tests and frameworks for testing software solutions. Specify and automate test scenarios and test data for a highly complex business by analyzing integration points, data flows, personas, authorization schemes and environments. Develop regression suites, develop automation scenarios, and move automation to an agile continuous testing model. Proactively and collaboratively take part in all testing-related activities while establishing partnerships with key stakeholders in Product, Development/Engineering, and Technology Operations. What Experience You Need Bachelor's degree in a STEM major or equivalent experience. 5-7 years of software testing experience. Able to create and review test automation according to specifications. Ability to write, debug, and troubleshoot code in Java, Springboot, TypeScript/JavaScript, HTML, CSS. Creation and use of big data processing solutions using Dataflow/Apache Beam, Bigtable, BigQuery, PubSub, GCS, Composer/Airflow, and others with respect to software validation. Created test strategies and plans. Led complex testing efforts or projects. Participated in Sprint Planning as the Test Lead. Collaborated with Product Owners, SREs, Technical Architects to define testing strategies and plans. Design and development of microservices using Java, Springboot, GCP SDKs, GKE/Kubernetes. Deploy and release software using Jenkins CI/CD pipelines, understand infrastructure-as-code concepts, Helm Charts, and Terraform constructs. Cloud Certification Strongly Preferred. What Could Set You Apart An ability to demonstrate successful performance of our Success Profile skills, including: Attention to Detail - Define test case candidates for automation that are outside of product specifications, i.e. Negative Testing; Create thorough and accurate documentation of all work including status updates to summarize project highlights; validating that processes operate properly and conform to standards. Automation - Automate defined test cases and test suites per project. Collaboration - Collaborate with Product Owners and development team to plan and assist with user acceptance testing; Collaborate with product owners, development leads and architects on functional and non-functional test strategies and plans. Execution - Develop scalable and reliable automated tests; Develop performance testing scripts to assure products are adhering to the documented SLO/SLI/SLAs; Specify the need for Test Data types for automated testing; Create automated tests and test data for projects; Develop automated regression suites; Integrate automated regression tests into the CI/CD pipeline; Work with teams on E2E testing strategies and plans against multiple product integration points. Quality Control - Perform defect analysis, in-depth technical root cause analysis, identifying trends and recommendations to resolve complex functional issues and process improvements; Analyze results of functional and non-functional tests and make recommendations for improvements. Performance / Resilience - Understand application and network architecture as inputs to create performance and resilience test strategies and plans for each product and platform; Conduct the performance and resilience testing to ensure the products meet SLAs / SLOs. Quality Focus - Review test cases for complete functional coverage; Review quality section of Production Readiness Review for completeness; Recommend changes to existing testing methodologies for effectiveness and efficiency of product validation; Ensure communications are thorough and accurate for all work documentation including status and project updates. Risk Mitigation - Work with Product Owners, QE and development team leads to track and determine prioritization of defect fixes. We offer a hybrid work setting, comprehensive compensation and healthcare packages, attractive paid time off, and organizational growth potential through our online learning platform with guided career tracks. Are you ready to power your possible? Apply today, and get started on a path toward an exciting new career at Equifax, where you can make a difference! Who is Equifax? At Equifax, we believe knowledge drives progress. As a global data, analytics and technology company, we play an essential role in the global economy by helping employers, employees, financial institutions and government agencies make critical decisions with greater confidence. We work to help create seamless and positive experiences during life’s pivotal moments: applying for jobs or a mortgage, financing an education or buying a car. Our impact is real and to accomplish our goals we focus on nurturing our people for career advancement and their learning and development, supporting our next generation of leaders, maintaining an inclusive and diverse work environment, and regularly engaging and recognizing our employees. Regardless of location or role, the individual and collective work of our employees makes a difference and we are looking for talented team players to join us as we help people live their financial best. Equifax is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, or status as a protected veteran.
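The posting's stack is Java/Spring Boot, but as a language-neutral illustration of the pipeline-validation work it describes, a minimal pytest sketch against a BigQuery output table (the table name and expectations are hypothetical) could look like this:

    # Sketch: data-quality checks on a pipeline's BigQuery output, runnable under pytest.
    from google.cloud import bigquery

    client = bigquery.Client()
    TABLE = "my_project.analytics.events"  # placeholder output table


    def scalar(sql: str):
        # Run a query and return the single value of the first row.
        return list(client.query(sql).result())[0][0]


    def test_output_is_not_empty():
        assert scalar(f"SELECT COUNT(*) FROM `{TABLE}`") > 0


    def test_no_duplicate_event_ids():
        dupes = scalar(
            f"SELECT COUNT(*) FROM ("
            f"SELECT event_id FROM `{TABLE}` GROUP BY event_id HAVING COUNT(*) > 1)"
        )
        assert dupes == 0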

Posted 1 day ago

Apply

5.0 - 7.0 years

0 Lacs

Trivandrum, Kerala, India

On-site


You are passionate about quality and how customers experience the products you test. You have the ability to create, maintain and execute test plans in order to verify requirements. As a Quality Engineer at Equifax, you will be a catalyst in both the development and the testing of high priority initiatives. You will develop and test new products to support technology operations while maintaining exemplary standards. As a collaborative member of the team, you will deliver QA services (code quality, testing services, performance engineering, development collaboration and continuous integration). You will conduct quality control tests in order to ensure full compliance with specified standards and end user requirements. You will execute tests using established plans and scripts; document problems in an issues log and retest to ensure problems are resolved. You will create test files to thoroughly test program logic and verify system flow. You will identify, recommend and implement changes to enhance effectiveness of QA strategies. What You Will Do Independently develop scalable and reliable automated tests and frameworks for testing software solutions. Specify and automate test scenarios and test data for a highly complex business by analyzing integration points, data flows, personas, authorization schemes and environments. Develop regression suites, develop automation scenarios, and move automation to an agile continuous testing model. Proactively and collaboratively take part in all testing-related activities while establishing partnerships with key stakeholders in Product, Development/Engineering, and Technology Operations. What Experience You Need Bachelor's degree in a STEM major or equivalent experience. 5-7 years of software testing experience. Able to create and review test automation according to specifications. Ability to write, debug, and troubleshoot code in Java, Springboot, TypeScript/JavaScript, HTML, CSS. Creation and use of big data processing solutions using Dataflow/Apache Beam, Bigtable, BigQuery, PubSub, GCS, Composer/Airflow, and others with respect to software validation. Created test strategies and plans. Led complex testing efforts or projects. Participated in Sprint Planning as the Test Lead. Collaborated with Product Owners, SREs, Technical Architects to define testing strategies and plans. Design and development of microservices using Java, Springboot, GCP SDKs, GKE/Kubernetes. Deploy and release software using Jenkins CI/CD pipelines, understand infrastructure-as-code concepts, Helm Charts, and Terraform constructs. Cloud Certification Strongly Preferred. What Could Set You Apart An ability to demonstrate successful performance of our Success Profile skills, including: Attention to Detail - Define test case candidates for automation that are outside of product specifications, i.e. Negative Testing; Create thorough and accurate documentation of all work including status updates to summarize project highlights; validating that processes operate properly and conform to standards. Automation - Automate defined test cases and test suites per project. Collaboration - Collaborate with Product Owners and development team to plan and assist with user acceptance testing; Collaborate with product owners, development leads and architects on functional and non-functional test strategies and plans. Execution - Develop scalable and reliable automated tests; Develop performance testing scripts to assure products are adhering to the documented SLO/SLI/SLAs; Specify the need for Test Data types for automated testing; Create automated tests and test data for projects; Develop automated regression suites; Integrate automated regression tests into the CI/CD pipeline; Work with teams on E2E testing strategies and plans against multiple product integration points. Quality Control - Perform defect analysis, in-depth technical root cause analysis, identifying trends and recommendations to resolve complex functional issues and process improvements; Analyze results of functional and non-functional tests and make recommendations for improvements. Performance / Resilience - Understand application and network architecture as inputs to create performance and resilience test strategies and plans for each product and platform; Conduct the performance and resilience testing to ensure the products meet SLAs / SLOs. Quality Focus - Review test cases for complete functional coverage; Review quality section of Production Readiness Review for completeness; Recommend changes to existing testing methodologies for effectiveness and efficiency of product validation; Ensure communications are thorough and accurate for all work documentation including status and project updates. Risk Mitigation - Work with Product Owners, QE and development team leads to track and determine prioritization of defect fixes.

Posted 1 day ago

Apply

4.0 years

0 Lacs

Pune, Maharashtra, India

On-site


Job Description As a Technical Lead, you will be working on both offshore and onsite client projects involving Oracle BI Applications/FAW or OBIEE/OAC/ODI implementation. You will be interacting with clients to understand and gather requirements, and you will be responsible for technical design, development, and system/integration testing using Oracle methodologies. Desired Profile End-to-end ODI, OAC and Oracle BI Applications/FAW implementation experience. Expert knowledge of BI Applications/FAW including basic and advanced configurations with Oracle eBS suite/Fusion as the source system. Expert knowledge of OBIEE/OAC RPD design and reports design. Expert knowledge of ETL (ODI) design/OCI DI/OCI Dataflow. Mandatory to have one of these skills: PL/SQL, BI Publisher, or BI Apps. Good to have EDQ and PySpark skills. Architectural Solution Definition. Any industry-standard certifications will be a plus. Good knowledge of Oracle database and development, with experience in database applications. Creativity, Personal Drive, Influencing and Negotiating, Problem Solving, Building Effective Relationships, Customer Focus, Effective Communication, Coaching. Ready to travel as and when required by the project. Experience 8-12 yrs of data warehousing and business intelligence project experience. 4-6 years of project experience on BI Applications/FAW and OBIEE/OAC/ODI/OCI DI with at least 2 complete lifecycle implementations. 4-6 yrs of specialized BI Applications and OBIEE/OAC/ODI/OCI DI customization and solution architecture experience. Worked on Financial, SCM or HR Analytics recently in implementation and configuration. Career Level - IC3. About Us As a world leader in cloud solutions, Oracle uses tomorrow’s technology to tackle today’s challenges. We’ve partnered with industry-leaders in almost every sector—and continue to thrive after 40+ years of change by operating with integrity. We know that true innovation starts when everyone is empowered to contribute. That’s why we’re committed to growing an inclusive workforce that promotes opportunities for all. Oracle careers open the door to global opportunities where work-life balance flourishes. We offer competitive benefits based on parity and consistency and support our people with flexible medical, life insurance, and retirement options. We also encourage employees to give back to their communities through our volunteer programs. We’re committed to including people with disabilities at all stages of the employment process. If you require accessibility assistance or accommodation for a disability at any point, let us know by emailing accommodation-request_mb@oracle.com or by calling +1 888 404 2494 in the United States. Oracle is an Equal Employment Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, sexual orientation, gender identity, disability and protected veterans’ status, or any other characteristic protected by law. Oracle will consider for employment qualified applicants with arrest and conviction records pursuant to applicable law.

Posted 1 day ago

Apply

7.0 - 10.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site


Job Description: We are seeking an experienced Engineer with strong expertise in PostgreSQL, PL/SQL programming, and cloud-based data migration. The ideal candidate will have hands-on experience in migrating and tuning databases, particularly from Oracle to PostgreSQL on GCP (AlloyDB / Cloud SQL), and be skilled in modern data architecture and cloud services. Locations - Indore/Bengaluru/Noida Key Responsibilities Design, build, test, and maintain scalable data architectures on GCP. Lead Oracle to PostgreSQL data migration initiatives (preferably AlloyDB / Cloud SQL). Optimize PostgreSQL performance (e.g., tuning autovacuum, stored procedures). Translate Oracle PL/SQL code to PostgreSQL equivalents. Integrate hybrid data storage using GCP services (BigQuery, Firestore, MemoryStore, Spanner). Implement database job scheduling, disaster recovery, and logging. Work with GCP Dataflow, MongoDB, and data migration services. Mentor and lead database engineering teams. Required Technical Skills Advanced PostgreSQL & PL/SQL programming (queries, procedures, functions). Strong experience with database migration (Oracle ➝ PostgreSQL on GCP). Proficient in Cloud SQL, AlloyDB, and performance tuning. Hands-on experience with BigQuery, Firestore, Spanner, MemoryStore, MongoDB, Cloud Dataflow. Understanding of OLTP and OLAP systems. Desirable Qualifications GCP Database Engineer Certification. Exposure to Enterprise Architecture, Project Delivery, and Performance Benchmarking. Strong analytical, problem-solving, and leadership skills. Years of Experience: 7 to 10 years. Education/Qualification: BE / B.Tech / MCA / M.Tech / M.Com. Interested candidates can directly share their resume at anubhav.pathania@impetus.com
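As a hedged illustration of the tuning and verification tasks listed above (connection details and table names are placeholders), per-table autovacuum settings and post-migration query plans can be checked from Python:

    # Sketch: post-migration PostgreSQL checks with psycopg2 (Cloud SQL / AlloyDB).
    import psycopg2

    conn = psycopg2.connect(host="10.0.0.5", dbname="appdb", user="dba", password="***")
    conn.autocommit = True
    cur = conn.cursor()

    # Make autovacuum more aggressive on a hot, frequently updated table.
    cur.execute(
        "ALTER TABLE orders SET (autovacuum_vacuum_scale_factor = 0.05, "
        "autovacuum_analyze_scale_factor = 0.02)"
    )

    # Confirm a translated query still uses an index after the Oracle -> PostgreSQL move.
    cur.execute("EXPLAIN ANALYZE SELECT * FROM orders WHERE customer_id = %s", (42,))
    for (plan_line,) in cur.fetchall():
        print(plan_line)

    cur.close()
    conn.close()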

Posted 1 day ago

Apply

0 years

0 Lacs

Bengaluru, Karnataka, India

Remote


Roles and Responsibilities This role involves architecting and designing end-to-end solutions and a blueprint for implementing and deploying solutions based on Google Cloud Platform (GCP) to enhance operational efficiency and drive digital and data transformation within the industrial & government sectors. Architect and design scalable, secure, and cost-effective cloud solutions on GCP tailored to the needs of industrial & government sectors, focused on industrial data lake and analytics. Provide expert-level guidance on cloud adoption, data migration strategies, and digital transformation projects specific to industrial & government sectors. Develop blueprints for implementation and deployment of GCP services, including compute, storage, networking, and data analytics, ensuring alignment with best practices and client requirements. Deliver technical and pre-sales services on the GCP cloud platform, with services and activities ranging from data migration, end-to-end data pipelines, data management, and sizing & provisioning to implementation. Job Scope Full-time Cloud (GCP) Specialist/Consultant with a strong background in presales and technical services on the GCP cloud platform. Develop scalable and secure data lake and data warehouse architectures using services such as BigQuery, Cloud Storage, Dataproc, and Pub/Sub (but not limited to these). Design and implement efficient data ingestion, transformation, and processing pipelines using Cloud Dataflow, Apache Beam, and Dataproc/Spark (but not limited to these). Design and implement data security and governance using IAM, VPC Service Controls, Cloud DLP, and integrate with data governance tools (e.g., Data Catalog). Build technical solution proposals with solution architectures, sizing, and planning for prospective bids in the GCP cloud platform space. Groom and train internal and external technical & non-technical users of solutions and components in the GCP cloud platform. Create prototypes and demonstrate solutions on the GCP platform (streaming ingestion, analytical products, etc.). Job Location - Bangalore Work model - Hybrid (Sunday (work from home) - Thursday (work from office))
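As one assumed example of the data-lake-to-warehouse ingestion this role describes, Parquet files landed in a Cloud Storage zone can be loaded into BigQuery with a load job; the bucket, dataset, and table names below are placeholders:

    # Sketch: load Parquet from a GCS landing zone into a curated BigQuery table.
    from google.cloud import bigquery

    client = bigquery.Client()

    job_config = bigquery.LoadJobConfig(
        source_format=bigquery.SourceFormat.PARQUET,
        write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
    )

    load_job = client.load_table_from_uri(
        "gs://industrial-data-lake/landing/sensors/dt=2024-01-01/*.parquet",
        "my_project.curated.sensor_readings",
        job_config=job_config,
    )
    load_job.result()  # wait for completion

    table = client.get_table("my_project.curated.sensor_readings")
    print(f"Loaded table now has {table.num_rows} rows")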

Posted 1 day ago

Apply

0 years

0 Lacs

India

Remote


Company Description ThreatXIntel is a startup cybersecurity company dedicated to protecting businesses and organizations from cyber threats. Our services include cloud security, web and mobile security testing, cloud security assessment, and DevSecOps. We provide customized, affordable solutions tailored to meet the specific needs of our clients, regardless of their size. Role Description We are seeking a freelance GCP Data Engineer with expertise in Scala, Apache Spark, and Airflow, and experience with the Automic and Laminar frameworks. The role focuses on designing and maintaining scalable data pipelines and workflow automation within the Google Cloud Platform ecosystem. Key Responsibilities Design, build, and optimize data pipelines using Scala and Apache Spark on Google Cloud Platform (GCP). Orchestrate ETL workflows using Apache Airflow. Integrate and automate data processing using Automic job scheduling. Utilize Laminar for reactive programming or stream processing within pipelines (if applicable). Collaborate with cross-functional teams to define data flows and transformations. Ensure pipeline performance, scalability, and monitoring across environments. Troubleshoot and resolve issues in batch and streaming data processes. Required Skills Strong programming skills in Scala. Hands-on experience with Apache Spark for distributed data processing. Experience working with GCP data services (e.g., BigQuery, Cloud Storage; Dataflow preferred). Proficiency with Airflow for workflow orchestration. Experience using Automic for job scheduling. Familiarity with Laminar or similar frameworks for reactive or stream-based processing. Good understanding of data engineering best practices and pipeline optimization. Ability to work independently and communicate effectively with remote teams.
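To illustrate how the Airflow and Spark pieces of this role fit together, here is a minimal DAG sketch that submits a pre-built Scala Spark jar to an existing Dataproc cluster; the project, cluster, class, and jar names are placeholders, and the Automic/Laminar integration is not shown:

    # Sketch: Airflow DAG submitting a Scala Spark job to Dataproc on a daily schedule.
    from datetime import datetime

    from airflow import DAG
    from airflow.providers.google.cloud.operators.dataproc import DataprocSubmitJobOperator

    SPARK_JOB = {
        "reference": {"project_id": "my-project"},
        "placement": {"cluster_name": "etl-cluster"},
        "spark_job": {
            "main_class": "com.example.etl.DailyAggregation",
            "jar_file_uris": ["gs://my-bucket/jars/daily-aggregation.jar"],
        },
    }

    with DAG(
        dag_id="daily_spark_aggregation",
        start_date=datetime(2024, 1, 1),
        schedule="@daily",
        catchup=False,
    ) as dag:
        DataprocSubmitJobOperator(
            task_id="run_daily_aggregation",
            job=SPARK_JOB,
            region="us-central1",
            project_id="my-project",
        )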

Posted 1 day ago

Apply

4.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site


Job Description As a Technical Lead, you will be working on both offshore and onsite client projects involving Oracle BI Applications/FAW or OBIEE/OAC/ODI implementation. You will be interacting with clients to understand and gather requirements, and you will be responsible for technical design, development, and system/integration testing using Oracle methodologies. Desired Profile End-to-end ODI, OAC and Oracle BI Applications/FAW implementation experience. Expert knowledge of BI Applications/FAW including basic and advanced configurations with Oracle eBS suite/Fusion as the source system. Expert knowledge of OBIEE/OAC RPD design and reports design. Expert knowledge of ETL (ODI) design/OCI DI/OCI Dataflow. Mandatory to have one of these skills: PL/SQL, BI Publisher, or BI Apps. Good to have EDQ and PySpark skills. Architectural Solution Definition. Any industry-standard certifications will be a plus. Good knowledge of Oracle database and development, with experience in database applications. Creativity, Personal Drive, Influencing and Negotiating, Problem Solving, Building Effective Relationships, Customer Focus, Effective Communication, Coaching. Ready to travel as and when required by the project. Experience 8-12 yrs of data warehousing and business intelligence project experience. 4-6 years of project experience on BI Applications/FAW and OBIEE/OAC/ODI/OCI DI with at least 2 complete lifecycle implementations. 4-6 yrs of specialized BI Applications and OBIEE/OAC/ODI/OCI DI customization and solution architecture experience. Worked on Financial, SCM or HR Analytics recently in implementation and configuration. Career Level - IC3. About Us As a world leader in cloud solutions, Oracle uses tomorrow’s technology to tackle today’s challenges. We’ve partnered with industry-leaders in almost every sector—and continue to thrive after 40+ years of change by operating with integrity. We know that true innovation starts when everyone is empowered to contribute. That’s why we’re committed to growing an inclusive workforce that promotes opportunities for all. Oracle careers open the door to global opportunities where work-life balance flourishes. We offer competitive benefits based on parity and consistency and support our people with flexible medical, life insurance, and retirement options. We also encourage employees to give back to their communities through our volunteer programs. We’re committed to including people with disabilities at all stages of the employment process. If you require accessibility assistance or accommodation for a disability at any point, let us know by emailing accommodation-request_mb@oracle.com or by calling +1 888 404 2494 in the United States. Oracle is an Equal Employment Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, sexual orientation, gender identity, disability and protected veterans’ status, or any other characteristic protected by law. Oracle will consider for employment qualified applicants with arrest and conviction records pursuant to applicable law.

Posted 2 days ago

Apply

4.0 years

0 Lacs

Hyderabad, Telangana, India

On-site


Job Description As a Technical Lead, you will be working on both offshore and onsite client projects involving Oracle BI Applications/FAW or OBIEE/OAC/ODI implementation. You will be interacting with clients to understand and gather requirements, and you will be responsible for technical design, development, and system/integration testing using Oracle methodologies. Desired Profile End-to-end ODI, OAC and Oracle BI Applications/FAW implementation experience. Expert knowledge of BI Applications/FAW including basic and advanced configurations with Oracle eBS suite/Fusion as the source system. Expert knowledge of OBIEE/OAC RPD design and reports design. Expert knowledge of ETL (ODI) design/OCI DI/OCI Dataflow. Mandatory to have one of these skills: PL/SQL, BI Publisher, or BI Apps. Good to have EDQ and PySpark skills. Architectural Solution Definition. Any industry-standard certifications will be a plus. Good knowledge of Oracle database and development, with experience in database applications. Creativity, Personal Drive, Influencing and Negotiating, Problem Solving, Building Effective Relationships, Customer Focus, Effective Communication, Coaching. Ready to travel as and when required by the project. Experience 8-12 yrs of data warehousing and business intelligence project experience. 4-6 years of project experience on BI Applications/FAW and OBIEE/OAC/ODI/OCI DI with at least 2 complete lifecycle implementations. 4-6 yrs of specialized BI Applications and OBIEE/OAC/ODI/OCI DI customization and solution architecture experience. Worked on Financial, SCM or HR Analytics recently in implementation and configuration. Career Level - IC3. About Us As a world leader in cloud solutions, Oracle uses tomorrow’s technology to tackle today’s challenges. We’ve partnered with industry-leaders in almost every sector—and continue to thrive after 40+ years of change by operating with integrity. We know that true innovation starts when everyone is empowered to contribute. That’s why we’re committed to growing an inclusive workforce that promotes opportunities for all. Oracle careers open the door to global opportunities where work-life balance flourishes. We offer competitive benefits based on parity and consistency and support our people with flexible medical, life insurance, and retirement options. We also encourage employees to give back to their communities through our volunteer programs. We’re committed to including people with disabilities at all stages of the employment process. If you require accessibility assistance or accommodation for a disability at any point, let us know by emailing accommodation-request_mb@oracle.com or by calling +1 888 404 2494 in the United States. Oracle is an Equal Employment Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, sexual orientation, gender identity, disability and protected veterans’ status, or any other characteristic protected by law. Oracle will consider for employment qualified applicants with arrest and conviction records pursuant to applicable law.

Posted 2 days ago

Apply


Featured Companies