4.0 years
0 Lacs
Gurugram, Haryana, India
On-site
Line of Service: Advisory
Industry/Sector: Not Applicable
Specialism: Data, Analytics & AI
Management Level: Senior Associate

Job Description & Summary
At PwC, our people in data and analytics engineering focus on leveraging advanced technologies and techniques to design and develop robust data solutions for clients. They play a crucial role in transforming raw data into actionable insights, enabling informed decision-making and driving business growth. In data engineering at PwC, you will focus on designing and building data infrastructure and systems to enable efficient data processing and analysis. You will be responsible for developing and implementing data pipelines, data integration, and data transformation solutions.

Why PwC
At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us.

At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm’s growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.

Job Description
We are seeking skilled and dynamic Cloud Data Engineers specializing in AWS, Azure, Databricks, and GCP. The ideal candidate will have a strong background in data engineering, with a focus on data ingestion, transformation, and warehousing. They should also possess excellent knowledge of PySpark or Spark, and a proven ability to optimize performance in Spark job executions.

Responsibilities
- Design, build, and maintain scalable data pipelines for a variety of cloud platforms including AWS, Azure, Databricks, and GCP.
- Implement data ingestion and transformation processes to facilitate efficient data warehousing.
- Utilize cloud services to enhance data processing capabilities:
  - AWS: Glue, Athena, Lambda, Redshift, Step Functions, DynamoDB, SNS.
  - Azure: Data Factory, Synapse Analytics, Functions, Cosmos DB, Event Grid, Logic Apps, Service Bus.
  - GCP: Dataflow, BigQuery, DataProc, Cloud Functions, Bigtable, Pub/Sub, Data Fusion.
- Optimize Spark job performance to ensure high efficiency and reliability.
- Stay proactive in learning and implementing new technologies to improve data processing frameworks.
- Collaborate with cross-functional teams to deliver robust data solutions.
- Work on Spark Streaming for real-time data processing as necessary.

Qualifications
- 4-7 years of experience in data engineering with a strong focus on cloud environments.
- Proficiency in PySpark or Spark is mandatory.
- Proven experience with data ingestion, transformation, and data warehousing.
- In-depth knowledge and hands-on experience with cloud services (AWS/Azure/GCP).
- Demonstrated ability in performance optimization of Spark jobs.
- Strong problem-solving skills and the ability to work independently as well as in a team.
- Cloud Certification (AWS, Azure, or GCP) is a plus.
- Familiarity with Spark Streaming is a bonus.

Mandatory Skill Sets: Python, PySpark, SQL with (AWS or Azure or GCP)
Preferred Skill Sets: Python, PySpark, SQL with (AWS or Azure or GCP)
Years of Experience Required: 4-7 years
Education Qualification: BE/BTech, ME/MTech, MBA, MCA
Degrees/Field of Study Required: Master of Engineering, Master of Business Administration, Bachelor of Engineering, Bachelor of Technology
Required Skills: Full Stack Development
Optional Skills: Accepting Feedback, Active Listening, Agile Scalability, Amazon Web Services (AWS), Analytical Thinking, Apache Hadoop, Azure Data Factory, Communication, Creativity, Data Anonymization, Database Administration, Database Management System (DBMS), Database Optimization, Database Security Best Practices, Data Engineering, Data Engineering Platforms, Data Infrastructure, Data Integration, Data Lake, Data Modeling, Data Pipeline, Data Quality, Data Transformation, Data Validation {+ 19 more}
Desired Languages: Not specified
Travel Requirements: Not Specified
Available for Work Visa Sponsorship? No
Government Clearance Required? No
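Since the posting stresses PySpark proficiency and Spark job optimization, here is a minimal PySpark sketch of two routine tuning moves, a broadcast join and caching a reused DataFrame; the tables and values are hypothetical:

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("spark-tuning-sketch").getOrCreate()

# Hypothetical inputs: a large fact table and a small dimension table.
orders = spark.createDataFrame(
    [(1, "IN", 120.0), (2, "US", 80.0), (3, "IN", 45.0)],
    ["order_id", "country_code", "amount"],
)
countries = spark.createDataFrame(
    [("IN", "India"), ("US", "United States")],
    ["country_code", "country_name"],
)

# Broadcasting the small table avoids a full shuffle-based join.
enriched = orders.join(F.broadcast(countries), "country_code")

# Cache a DataFrame that several downstream aggregations reuse.
enriched.cache()

totals = enriched.groupBy("country_name").agg(F.sum("amount").alias("total"))
counts = enriched.groupBy("country_name").count()

totals.show()
counts.show()
```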
Posted 1 month ago
0 years
0 Lacs
Pune, Maharashtra, India
On-site
Line of Service: Advisory
Industry/Sector: Not Applicable
Specialism: Data, Analytics & AI
Management Level: Senior Associate

Job Description & Summary
At PwC, our people in data and analytics focus on leveraging data to drive insights and make informed business decisions. They utilise advanced analytics techniques to help clients optimise their operations and achieve their strategic goals. In business intelligence at PwC, you will focus on leveraging data and analytics to provide strategic insights and drive informed decision-making for clients. You will develop and implement innovative solutions to optimise business performance and enhance competitive advantage.

Why PwC
At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us.

At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm’s growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.

Responsibilities
- Experience in data analysis, with a strong ability to interpret complex data sets.
- Knowledge of pharmaceutical commercial operations is highly desirable.
- Proficiency in data analysis tools and software.
- Strong analytical and problem-solving skills.
- Excellent communication skills to effectively convey insights to stakeholders.
- Ability to work independently as well as collaboratively in a team environment.

Mandatory Skill: Data Analyst
Preferred Skill Sets: Data Analyst
Years of Experience Required: 4-8
Education Qualification: BTech/MBA/MCA
Degrees/Field of Study Required: Master of Business Administration, Bachelor of Technology, Bachelor of Engineering
Required Skills: GCP Dataflow, Good Clinical Practice (GCP)
Optional Skills: Accepting Feedback, Active Listening, Analytical Thinking, Business Case Development, Business Data Analytics, Business Intelligence and Reporting Tools (BIRT), Business Intelligence Development Studio, Communication, Competitive Advantage, Continuous Process Improvement, Creativity, Data Analysis and Interpretation, Data Architecture, Database Management System (DBMS), Data Collection, Data Pipeline, Data Quality, Data Science, Data Visualization, Embracing Change, Emotional Regulation, Empathy, Inclusion, Industry Trend Analysis {+ 12 more}
Desired Languages: Not specified
Travel Requirements: Not Specified
Available for Work Visa Sponsorship? No
Government Clearance Required? No
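To make "proficiency in data analysis tools" slightly more concrete, here is a small pandas sketch over hypothetical commercial data, the kind of summary such a role produces for stakeholders:

```python
import pandas as pd

# Hypothetical commercial-operations data set.
df = pd.DataFrame({
    "region": ["North", "North", "South", "South", "West"],
    "product": ["A", "B", "A", "B", "A"],
    "units": [120, 80, 95, 60, 150],
    "revenue": [2400.0, 2000.0, 1900.0, 1500.0, 3000.0],
})

# Aggregate by region and derive an average selling price per unit.
summary = (
    df.groupby("region")[["units", "revenue"]]
      .sum()
      .assign(avg_price=lambda t: t["revenue"] / t["units"])
      .sort_values("revenue", ascending=False)
)
print(summary)
```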
Posted 1 month ago
5.0 - 8.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
We are seeking a skilled Senior Data Engineer to become an integral part of our team. In this role, you will focus on projects involving data integration and ETL for cloud-based platforms. Your tasks will include creating and executing sophisticated data solutions, ensuring the integrity, reliability, and accessibility of data.

Responsibilities
- Create and execute sophisticated data solutions for cloud-based platforms
- Construct ETL processes utilizing SQL, Python, and other applicable technologies
- Maintain data accuracy, reliability, and accessibility for all stakeholders
- Work with cross-functional teams to comprehend data integration needs and specifications
- Produce and uphold documentation, such as technical specifications, data flow diagrams, and data mappings
- Enhance data integration processes for performance and efficiency, upholding data accuracy and integrity

Requirements
- Bachelor’s degree in Computer Science, Electrical Engineering, or a related field
- 5-8 years of experience in data engineering
- Proficiency in cloud-native or Spark-based ETL tools like AWS Glue, Azure Data Factory, or GCP Dataflow
- Strong understanding of SQL for data querying and manipulation
- Background in Snowflake for data warehousing
- Familiarity with cloud platforms such as AWS, GCP, or Azure for data storage and processing
- Exceptional problem-solving abilities and meticulous attention to detail
- Strong verbal and written communication skills in English at a B2 level

Nice to have
- Background in ETL using Python
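Since the posting centers on ETL with SQL/Python and Snowflake warehousing, here is a minimal extract-transform-load sketch using the snowflake-connector-python package; every credential, table name, and value below is a placeholder, not a real configuration:

```python
import snowflake.connector  # pip install snowflake-connector-python

# Extract: in practice this might read from S3, an API, or another database;
# a small in-memory batch keeps the sketch self-contained.
raw_rows = [("2024-01-01", "in", 120.004), ("2024-01-01", "us", 80.0)]

# Transform: normalize country codes and round amounts.
clean_rows = [(d, c.upper(), round(a, 2)) for d, c, a in raw_rows]

# Load: all connection parameters are placeholders.
conn = snowflake.connector.connect(
    user="MY_USER", password="MY_PASSWORD", account="MY_ACCOUNT",
    warehouse="MY_WH", database="MY_DB", schema="PUBLIC",
)
try:
    cur = conn.cursor()
    cur.execute(
        "CREATE TABLE IF NOT EXISTS daily_sales "
        "(sale_date DATE, country STRING, amount NUMBER(10,2))"
    )
    cur.executemany(
        "INSERT INTO daily_sales (sale_date, country, amount) VALUES (%s, %s, %s)",
        clean_rows,
    )
finally:
    conn.close()
```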
Posted 1 month ago
0 years
0 Lacs
India
Remote
Company Description
Jobs by Divya Bhardwaj is a LinkedIn page dedicated to listing job opportunities across Europe, the USA, and India for various skill sets. By posting jobs, we assist our connections in finding suitable positions. Our platform helps candidates connect directly for further information. Follow us for the latest job updates.

Role Description
This is a contract remote role for an AI Architect CCAI. The AI Architect will be responsible for designing and implementing AI architectures, developing software solutions, integrating systems, and managing projects. The role includes collaboration with cross-functional teams to create architectural designs and ensure successful project delivery.

We're Hiring: AI Architect with Google CCAI Expertise
Location: PAN India (remote flexibility)

Join us on an exciting AI transformation journey for a leading retail client in the Philippines. This project focuses on modernizing contact center operations with Google Cloud CCAI and GCP technologies, driving smarter customer interactions, reduced wait times, and elevated agent productivity.

Key Skills We're Looking For:
- Google CCAI, Vertex AI, GPT-4, PaLM
- Dialogflow CX (Voice & Text)
- GenAI applications: summarization, Q&A, document intelligence
- Data engineering (BigQuery, Dataflow, Pub/Sub)
- MLOps (Vertex Pipelines, Kubeflow, CI/CD)
- LangChain, Pinecone, LlamaIndex
- Telephony & IVR integration, Voice AI
- STT, TTS, Agent Assist, real-time transcription
- AI governance (HIPAA, GDPR), ethical AI practices
- Dashboarding & reporting (Looker, Tableau)

Looking for experts who can lead architecture, collaborate cross-functionally, and build enterprise-scale conversational AI solutions. Interested, or know someone perfect for this role?
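Given the data engineering line above (BigQuery, Dataflow, Pub/Sub), here is a minimal Cloud Pub/Sub publishing sketch with the official Python client; the project, topic, and payload are placeholders:

```python
import json

from google.cloud import pubsub_v1  # pip install google-cloud-pubsub

# Placeholder identifiers; substitute a real project and topic.
project_id = "my-gcp-project"
topic_id = "contact-center-events"

publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path(project_id, topic_id)

event = {"session_id": "abc-123", "intent": "billing_question", "wait_seconds": 42}

# Messages are raw bytes; keyword arguments become message attributes.
future = publisher.publish(
    topic_path,
    json.dumps(event).encode("utf-8"),
    source="ivr",
)
print(f"Published message id: {future.result()}")
```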
Posted 1 month ago
5.0 - 10.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Role: Lead/Manager

Overview
We are seeking a visionary and dynamic individual to lead our AI initiatives and data-driven strategies. This role is crucial in shaping the future of our company by leveraging advanced technologies to drive innovation and growth. The ideal candidate will possess a deep understanding of AI, machine learning, and data analytics, along with a proven track record in leadership and strategic execution.

Key Responsibilities
- Self-Driven Initiative: Take ownership of projects and drive them to successful completion with minimal supervision, demonstrating a proactive and entrepreneurial mindset.
- Stakeholder Communication: Present insights, findings, and strategic recommendations to senior management and key stakeholders, fostering a data-driven decision-making culture.
- Executive Collaboration: Report directly to the founders and collaborate with other senior leaders to shape the company's direction and achieve our ambitious goals.
- Innovation & Problem-Solving: Foster a culture of innovative thinking and creative problem-solving to tackle complex challenges and drive continuous improvement.
- AI Research & Development: Oversee AI research and development initiatives, ensuring the integration of cutting-edge technologies and methodologies.
- Data Management: Ensure effective data collection, management, and analysis to support AI-driven decision-making and product development.

Required Skills and Qualifications
- Bachelor's degree from a Tier 1 institution or an MBA from a recognized institution.
- Proven experience in a managerial role, preferably in a startup environment.
- Strong leadership and team management skills.
- Excellent strategic thinking and problem-solving abilities.
- Exceptional communication and interpersonal skills.
- Ability to thrive in a fast-paced, dynamic environment.
- Entrepreneurial mindset with a passion for innovation and growth.
- Extensive experience with AI technologies, machine learning, and data analytics.
- Proficiency in programming languages such as Python, R, or similar.
- Familiarity with data visualization tools like Tableau, Power BI, or similar.
- Strong understanding of data governance, privacy, and security best practices.

Technical Skills
- Machine Learning Frameworks: Expertise in frameworks such as TensorFlow, PyTorch, or Scikit-learn.
- Data Processing: Proficiency in using tools like Apache Kafka, Apache Flink, or Apache Beam for real-time data processing.
- Database Management: Experience with SQL and NoSQL databases, including MySQL, PostgreSQL, MongoDB, or Cassandra.
- Big Data Technologies: Hands-on experience with Hadoop, Spark, Hive, or similar big data technologies.
- Cloud Computing: Strong knowledge of cloud services and infrastructure, including AWS (S3, EC2, SageMaker), Google Cloud (BigQuery, Dataflow), or Azure (Data Lake, Machine Learning).
- DevOps and MLOps: Familiarity with CI/CD pipelines, containerization (Docker, Kubernetes), and orchestration tools for deploying and managing machine learning models.
- Data Visualization: Advanced skills in data visualization tools such as Tableau, Power BI, or D3.js to create insightful and interactive dashboards.
- Natural Language Processing (NLP): Experience with NLP techniques and tools like NLTK, SpaCy, or BERT for text analysis and processing.
- Large Language Models (LLMs): Proficiency in working with LLMs such as GPT-3, GPT-4, or similar for natural language understanding and generation tasks.
- Computer Vision: Knowledge of computer vision technologies and libraries such as OpenCV, YOLO, or TensorFlow Object Detection API.

Preferred Experience
- 5-10 years of relevant experience
- Proven Track Record: Demonstrated success in scaling businesses or leading teams through significant growth phases, showcasing your ability to drive impactful results.
- AI Expertise: Deep familiarity with the latest AI tools and technologies, including Generative AI applications, with a passion for staying at the forefront of technological advancements.
- Startup Savvy: Hands-on experience in early-stage startups, with a proven ability to navigate the
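Given the NLP and scikit-learn items above, here is a compact text-classification sketch, with toy data and hypothetical labels, of the kind of pipeline such a role would oversee:

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Toy labeled data; a real project would load thousands of examples.
texts = [
    "refund my order", "where is my package",
    "great product, love it", "terrible quality, broke fast",
]
labels = ["support", "support", "review", "review"]

# TF-IDF features feeding a linear classifier, wrapped in one pipeline.
model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(texts, labels)

print(model.predict(["package never arrived"]))  # expected: ['support']
```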
Posted 1 month ago
7.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Responsibilities include:
• Design and implement scalable, secure, and cost-effective data architectures using GCP.
• Lead the design and development of data pipelines with BigQuery, Dataflow, and Cloud Storage.
• Architect and implement data lakes, data warehouses, and real-time data processing solutions on GCP.
• Ensure data architecture aligns with business goals, governance, and compliance requirements.
• Collaborate with stakeholders to define data strategy and roadmap.
• Design and deploy BigQuery solutions for optimized performance and cost efficiency.
• Build and maintain ETL/ELT pipelines for large-scale data processing.
• Leverage Cloud Pub/Sub, Dataflow, and Cloud Functions for real-time data integration.

Need 10+ years of experience.

Must have:
• 7+ years of experience in data architecture, with at least 3 years in GCP environments.
• Expertise in BigQuery, Cloud Dataflow, Cloud Pub/Sub, Cloud Storage, and related GCP services.
• Strong experience in data warehousing, data lakes, and real-time data pipelines.
• Proficiency in SQL, Python, or other data processing languages.
• Experience with cloud security, data governance, and compliance frameworks.
• Strong problem-solving skills and ability to architect solutions for complex data environments.
• Google Cloud Certification (Professional Data Engineer, Professional Cloud Architect) preferred.
• Leadership experience and ability to mentor technical teams.
• Excellent communication and collaboration skills.

Certifications:
• Google Cloud Certification is preferred.
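As a small illustration of the BigQuery work this role describes, here is a sketch that runs a parameterized query with the official Python client; the project, dataset, and table names are hypothetical:

```python
from google.cloud import bigquery  # pip install google-cloud-bigquery

client = bigquery.Client()  # uses Application Default Credentials

# Parameterized query against a hypothetical events table.
query = """
    SELECT country_code, COUNT(*) AS events
    FROM `my-project.analytics.events`
    WHERE event_date >= @start_date
    GROUP BY country_code
    ORDER BY events DESC
"""
job_config = bigquery.QueryJobConfig(
    query_parameters=[
        bigquery.ScalarQueryParameter("start_date", "DATE", "2024-01-01"),
    ]
)
for row in client.query(query, job_config=job_config).result():
    print(row.country_code, row.events)
```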
Posted 1 month ago
6.0 years
0 Lacs
India
On-site
Experience: 6+ years

Preferred Qualifications:
- Bachelor’s degree in computer science, Information Systems, or related field.
- 6-12 years of relevant experience in cloud engineering and architecture.
- Google Cloud Professional Cloud Architect certification.
- Experience with Kubernetes.
- Familiarity with DevOps methodologies.
- Strong problem-solving and analytical skills.
- Excellent communication skills.

Required Skills:
Google Cloud Platform (GCP) Services, Compute Engine, Google Kubernetes Engine (GKE), Cloud Storage, Cloud SQL, Cloud Load Balancing, Identity and Access Management (IAM), Google Workflows, Google Cloud Pub/Sub, App Engine, Cloud Functions, Cloud Run, API Gateway, Cloud Build, Cloud Source Repositories, Artifact Registry, Google Cloud Monitoring, Logging and Error Reporting, Python, Terraform, Google Cloud Firestore, GraphQL, MongoDB, Cassandra, Neo4j, ETL (Extract, Transform, Load) Paradigms, Google Cloud Dataflow, Apache Beam, BigQuery, Service Mesh, Content Delivery Network (CDN), Stackdriver, Google Cloud Trace
Posted 1 month ago
5.0 years
0 Lacs
Trivandrum, Kerala, India
On-site
Equifax is where you can power your possible. If you want to achieve your true potential, chart new paths, develop new skills, collaborate with bright minds, and make a meaningful impact, we want to hear from you.

What You’ll Do
- Design, develop, and operate high scale applications across the full engineering stack.
- Design, develop, test, deploy, maintain, and improve software.
- Apply modern software development practices (serverless computing, microservices architecture, CI/CD, infrastructure-as-code, etc.)
- Work across teams to integrate our systems with existing internal systems, Data Fabric, CSA Toolset.
- Participate in technology roadmap and architecture discussions to turn business requirements and vision into reality.
- Participate in a tight-knit, globally distributed engineering team.
- Triage product or system issues and debug/track/resolve by analyzing the sources of issues and the impact on network or service operations and quality.
- Research, create, and develop software applications to extend and improve on Equifax Solutions.
- Manage sole project priorities, deadlines, and deliverables.
- Collaborate on scalability issues involving access to data and information.
- Actively participate in Sprint planning, Sprint Retrospectives, and other team activities.

What Experience You Need
- Bachelor's degree or equivalent experience
- 5+ years of software engineering experience
- 5+ years experience writing, debugging, and troubleshooting code in mainstream Java, SpringBoot, TypeScript/JavaScript, HTML, CSS
- 5+ years experience with cloud technology: GCP, AWS, or Azure
- 5+ years experience designing and developing cloud-native solutions
- 5+ years experience designing and developing microservices using Java, SpringBoot, GCP SDKs, GKE/Kubernetes
- 5+ years experience deploying and releasing software using Jenkins CI/CD pipelines, with an understanding of infrastructure-as-code concepts, Helm Charts, and Terraform constructs

What Could Set You Apart
- Self-starter that identifies/responds to priority shifts with minimal supervision.
- Experience designing and developing big data processing solutions using Dataflow/Apache Beam, Bigtable, BigQuery, PubSub, GCS, Composer/Airflow, and others
- UI development (e.g. HTML, JavaScript, Angular and Bootstrap)
- Experience with backend technologies such as JAVA/J2EE, SpringBoot, SOA and Microservices
- Source code control management systems (e.g. SVN/Git, GitHub) and build tools like Maven & Gradle
- Agile environments (e.g. Scrum, XP)
- Relational databases (e.g. SQL Server, MySQL)
- Atlassian tooling (e.g. JIRA, Confluence, and GitHub)
- Developing with modern JDK (v1.7+)
- Automated Testing: JUnit, Selenium, LoadRunner, SoapUI
- Cloud Certification strongly preferred

We offer a hybrid work setting, comprehensive compensation and healthcare packages, attractive paid time off, and organizational growth potential through our online learning platform with guided career tracks. Are you ready to power your possible? Apply today, and get started on a path toward an exciting new career at Equifax, where you can make a difference!

Who is Equifax?
At Equifax, we believe knowledge drives progress. As a global data, analytics and technology company, we play an essential role in the global economy by helping employers, employees, financial institutions and government agencies make critical decisions with greater confidence. We work to help create seamless and positive experiences during life’s pivotal moments: applying for jobs or a mortgage, financing an education or buying a car.
Our impact is real and to accomplish our goals we focus on nurturing our people for career advancement and their learning and development, supporting our next generation of leaders, maintaining an inclusive and diverse work environment, and regularly engaging and recognizing our employees. Regardless of location or role, the individual and collective work of our employees makes a difference and we are looking for talented team players to join us as we help people live their financial best. Equifax is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, or status as a protected veteran.
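The "Dataflow/Apache Beam" item above lends itself to a sketch: a minimal Beam word-count pipeline in Python that runs locally on the DirectRunner and ports to Dataflow by changing pipeline options; the input strings are placeholders:

```python
import apache_beam as beam  # pip install apache-beam

# Runs on the local DirectRunner by default; on Dataflow you would pass
# --runner=DataflowRunner plus project/region/temp_location options.
with beam.Pipeline() as pipeline:
    (
        pipeline
        | "Create" >> beam.Create(["power your possible", "power of data"])
        | "Split" >> beam.FlatMap(str.split)
        | "PairWithOne" >> beam.Map(lambda word: (word, 1))
        | "CountPerWord" >> beam.CombinePerKey(sum)
        | "Print" >> beam.Map(print)
    )
```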
Posted 1 month ago
4.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Job Description
As a Technical Lead, you will be working on both offshore and onsite client projects. You will be working on projects involving Oracle BI Applications/FAW or OBIEE/OAC/ODI implementation. You will be interacting with clients to understand and gather requirements, and you will be responsible for technical design, development, and system/integration testing using Oracle methodologies.

Desired Profile
- End-to-end ODI, OAC and Oracle BI Applications/FAW implementation experience
- Expert knowledge of BI Applications/FAW including basic and advanced configurations with Oracle eBS suite/Fusion as the source system
- Expert knowledge of OBIEE/OAC RPD design and reports design
- Expert knowledge of ETL (ODI) design/OCI DI/OCI Dataflow
- Mandatory to have one of these skills: PL/SQL, BI Publisher, or BI Apps
- Good to have EDQ, PySpark skills
- Architectural solution definition
- Any industry-standard certifications will be a plus
- Good knowledge of Oracle database and development; experience in database applications
- Creativity, personal drive, influencing and negotiating, problem solving
- Building effective relationships, customer focus, effective communication, coaching
- Ready to travel as and when required by the project

Experience
- 8-12 years of data warehousing and business intelligence project experience
- 4-6 years of project experience on BI Applications/FAW and OBIEE/OAC/ODI/OCI DI, with at least 2 complete lifecycle implementations
- 4-6 years of specialized BI Applications and OBIEE/OAC/ODI/OCI DI customization and solution architecture experience
- Worked on Financial, SCM or HR Analytics recently in implementation and configuration

Career Level - IC3

About Us
As a world leader in cloud solutions, Oracle uses tomorrow’s technology to tackle today’s challenges.
We’ve partnered with industry leaders in almost every sector, and continue to thrive after 40+ years of change by operating with integrity. We know that true innovation starts when everyone is empowered to contribute. That’s why we’re committed to growing an inclusive workforce that promotes opportunities for all. Oracle careers open the door to global opportunities where work-life balance flourishes. We offer competitive benefits based on parity and consistency and support our people with flexible medical, life insurance, and retirement options. We also encourage employees to give back to their communities through our volunteer programs. We’re committed to including people with disabilities at all stages of the employment process. If you require accessibility assistance or accommodation for a disability at any point, let us know by emailing accommodation-request_mb@oracle.com or by calling +1 888 404 2494 in the United States. Oracle is an Equal Employment Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, sexual orientation, gender identity, disability and protected veterans’ status, or any other characteristic protected by law. Oracle will consider for employment qualified applicants with arrest and conviction records pursuant to applicable law.
Posted 1 month ago
15.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Minimum 15+ years of IT experience, with 3 years of development on GCP projects and 5 projects implemented.

- In-depth knowledge and hands-on development experience in building distributed data solutions, including ingestion, processing, and consumption (Must Have)
- Experience with developing winning themes and writing technical responses to bids (RFPs & RFIs)
- Strong development experience in at least one distributed big data processing (bulk) engine, preferably Spark on Dataproc or Dataflow (Must Have)
- Strong understanding and experience with cloud storage infrastructure and operationalizing GCP-based storage services and solutions, preferably GCP buckets or related (Must Have)
- Strong experience on one or more MPP data warehouse platforms, preferably BigQuery, Cloud SQL, Cloud Spanner, Datastore, Firestore or similar (Must Have)
- Strong development experience on at least one or more event-driven streaming platforms, preferably Pub/Sub, Kafka or related (Must Have)
- Strong development experience with networking on GCP (Must Have)
- Strong data orchestration experience using tools such as Cloud Functions, Dataflow, Cloud Composer, Apache Airflow or related (Must Have)
- Strong development experience building data pipelines using Kubernetes (Must Have)
- Strong development experience in IAM, KMS, Container Registry
- Assess use cases for various teams within the client company, evaluate pros and cons, and justify recommended tooling and component solution options using GCP services, 3rd party and open source solutions (Must Have)
- Strong technical communication skills and ability to engage a variety of business and technical audiences, explaining features and metrics of big data technologies based on experience with previous solutions (Must Have)
- Strong data cataloging experience, preferably using Data Catalog (Must Have)
- Strong understanding of and experience in logging and cloud monitoring solutions (Must Have)
- Strong understanding of at least one or more cluster managers (YARN, Hive, Pig, etc.) (Must Have)
- Strong knowledge and understanding of CI/CD processes and tools (Must Have)
- Interface with client project sponsors to gather, assess and interpret client needs and requirements
- Advise on database performance, altering the ETL process, providing SQL transformations, discussing API integration, and deriving business and technical KPIs
- Develop a data model around stated use cases to capture the client’s KPIs and data transformations
- Assess, document and translate goals, objectives, problem statements, etc. to our offshore team and onshore management
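For the orchestration requirement (Cloud Composer/Apache Airflow), here is a minimal Airflow 2.x DAG sketch; the task bodies and identifiers are illustrative only:

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    print("pull raw data from the source system")


def load():
    print("load curated data into the warehouse")


# A tiny two-step daily pipeline; ids and schedule are illustrative.
with DAG(
    dag_id="daily_ingestion_sketch",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)
    extract_task >> load_task
```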
Posted 1 month ago
5.0 - 7.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Job Description
We're seeking a highly skilled and experienced Full Stack Data Engineer to play a pivotal role in the development and maintenance of our Enterprise Data Platform. In this role, you'll be responsible for designing, building, and optimizing scalable data pipelines within our Google Cloud Platform (GCP) environment. You'll work with GCP-native technologies like BigQuery, Dataflow, and Pub/Sub, ensuring data governance, security, and optimal performance. This is a fantastic opportunity to leverage your full-stack expertise, collaborate with talented teams, and establish best practices for data engineering at Ford.

Responsibilities
- Data Pipeline Architect & Builder: Spearhead the design, development, and maintenance of scalable data ingestion and curation pipelines from diverse sources. Ensure data is standardized, high-quality, and optimized for analytical use. Leverage cutting-edge tools and technologies, including Python, SQL, and DBT/Dataform, to build robust and efficient data pipelines.
- End-to-End Integration Expert: Utilize your full-stack skills to contribute to seamless end-to-end development, ensuring smooth and reliable data flow from source to insight.
- GCP Data Solutions Leader: Leverage your deep expertise in GCP services (BigQuery, Dataflow, Pub/Sub, Cloud Functions, etc.) to build and manage data platforms that not only meet but exceed business needs and expectations.
- Data Governance & Security Champion: Implement and manage robust data governance policies, access controls, and security best practices, fully utilizing GCP's native security features to protect sensitive data.
- Data Workflow Orchestrator: Employ Astronomer and Terraform for efficient data workflow management and cloud infrastructure provisioning, championing best practices in Infrastructure as Code (IaC).
- Performance Optimization Driver: Continuously monitor and improve the performance, scalability, and efficiency of data pipelines and storage solutions, ensuring optimal resource utilization and cost-effectiveness.
- Collaborative Innovator: Collaborate effectively with data architects, application architects, service owners, and cross-functional teams to define and promote best practices, design patterns, and frameworks for cloud data engineering.
- Automation & Reliability Advocate: Proactively automate data platform processes to enhance reliability, improve data quality, minimize manual intervention, and drive operational efficiency.
- Effective Communicator: Clearly and transparently communicate complex technical decisions to both technical and non-technical stakeholders, fostering understanding and alignment.
- Continuous Learner: Stay ahead of the curve by continuously learning about industry trends and emerging technologies, proactively identifying opportunities to improve our data platform and enhance our capabilities.
- Business Impact Translator: Translate complex business requirements into optimized data asset designs and efficient code, ensuring that our data solutions directly contribute to business goals.
- Documentation & Knowledge Sharer: Develop comprehensive documentation for data engineering processes, promoting knowledge sharing, facilitating collaboration, and ensuring long-term system maintainability.

Qualifications
- Bachelor's degree in Computer Science, Information Technology, Information Systems, Data Analytics, or a related field (or equivalent combination of education and experience).
- 5-7 years of experience in Data Engineering or Software Engineering, with at least 2 years of hands-on experience building and deploying cloud-based data platforms (GCP preferred).
- Strong proficiency in SQL, Java, and Python, with practical experience in designing and deploying cloud-based data pipelines using GCP services like BigQuery, Dataflow, and DataProc.
- Solid understanding of Service-Oriented Architecture (SOA) and microservices, and their application within a cloud data platform.
- Experience with relational databases (e.g., PostgreSQL, MySQL), NoSQL databases, and columnar databases (e.g., BigQuery).
- Knowledge of data governance frameworks, data encryption, and data masking techniques in cloud environments.
- Familiarity with CI/CD pipelines, Infrastructure as Code (IaC) tools like Terraform and Tekton, and other automation frameworks.
- Excellent analytical and problem-solving skills, with the ability to troubleshoot complex data platform and microservices issues.
- Experience in monitoring and optimizing cost and compute resources for processes in GCP technologies (e.g., BigQuery, Dataflow, Cloud Run, DataProc).
- A passion for data, innovation, and continuous learning.
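To ground the ingestion theme, here is a short sketch that loads a CSV from Cloud Storage into BigQuery with the official Python client; the bucket, dataset, and table identifiers are placeholders:

```python
from google.cloud import bigquery  # pip install google-cloud-bigquery

client = bigquery.Client()

# Placeholder identifiers for the target table and source file.
table_id = "my-project.curated.daily_sales"
uri = "gs://my-bucket/exports/daily_sales.csv"

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    skip_leading_rows=1,   # header row
    autodetect=True,       # infer the schema from the file
    write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
)

load_job = client.load_table_from_uri(uri, table_id, job_config=job_config)
load_job.result()  # block until the load completes
print(f"Loaded {client.get_table(table_id).num_rows} rows total.")
```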
Posted 1 month ago
5.0 - 8.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Our company is looking for an experienced Senior Data Engineer to join our team. As a Senior Data Engineer, you will be working on a project that focuses on data integration and ETL for cloud-based platforms. You will be responsible for designing and implementing complex data solutions, ensuring that the data is accurate, reliable, and easily accessible.

Responsibilities
- Design and implement complex data solutions for cloud-based platforms
- Develop ETL processes using SQL, Python, and other relevant technologies
- Ensure that data is accurate, reliable, and easily accessible for all stakeholders
- Collaborate with cross-functional teams to understand data integration needs and requirements
- Develop and maintain documentation, including technical specifications, data flow diagrams, and data mappings
- Monitor and optimize data integration processes for performance and efficiency, ensuring data accuracy and integrity

Requirements
- Bachelor's degree in Computer Science, Electrical Engineering, or a related field
- 5-8 years of experience in data engineering
- Experience with cloud-native or Spark-based ETL tools such as AWS Glue, Azure Data Factory, or GCP Dataflow
- Strong knowledge of SQL for data querying and manipulation
- Experience with Snowflake for data warehousing
- Experience with cloud platforms such as AWS, GCP, or Azure for data storage and processing
- Excellent problem-solving skills and attention to detail
- Good verbal and written communication skills in English at a B2 level

Nice to have
- Experience with ETL using Python
Posted 1 month ago
5.0 - 8.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Our organization is in search of a seasoned Senior Data Engineer to enhance our team. In this role, you will focus on projects involving data integration and ETL for cloud environments. Your primary duties will include the design and execution of intricate data solutions, maintaining data accuracy, dependability, and accessibility.

Responsibilities
- Design and execute intricate data solutions for cloud environments
- Develop ETL processes utilizing SQL, Python, and other applicable technologies
- Maintain data accuracy, dependability, and accessibility for all stakeholders
- Work collaboratively with cross-functional teams to comprehend data integration necessities and specifications
- Create and uphold documentation such as technical specifications, data flow diagrams, and data mappings
- Optimize data integration processes for enhanced performance and efficiency while ensuring data accuracy and integrity

Requirements
- Bachelor's degree in Computer Science, Electrical Engineering, or related field
- 5-8 years of experience in data engineering
- Proficiency in cloud-native or Spark-based ETL tools like AWS Glue, Azure Data Factory, or GCP Dataflow
- Strong knowledge of SQL for data querying and manipulation
- Familiarity with Snowflake for data warehousing
- Background in cloud platforms such as AWS, GCP, or Azure for data storage and processing
- Excellent problem-solving skills and meticulous attention to detail
- Good verbal and written communication skills in English at a B2 level

Nice to have
- Background in ETL using Python
Posted 1 month ago
5.0 - 8.0 years
0 Lacs
Gurugram, Haryana, India
On-site
We are seeking a skilled Senior Data Engineer to become an integral part of our team. In this role, you will focus on projects involving data integration and ETL for cloud-based platforms. Your tasks will include creating and executing sophisticated data solutions, ensuring the integrity, reliability, and accessibility of data.

Responsibilities
- Create and execute sophisticated data solutions for cloud-based platforms
- Construct ETL processes utilizing SQL, Python, and other applicable technologies
- Maintain data accuracy, reliability, and accessibility for all stakeholders
- Work with cross-functional teams to comprehend data integration needs and specifications
- Produce and uphold documentation, such as technical specifications, data flow diagrams, and data mappings
- Enhance data integration processes for performance and efficiency, upholding data accuracy and integrity

Requirements
- Bachelor’s degree in Computer Science, Electrical Engineering, or a related field
- 5-8 years of experience in data engineering
- Proficiency in cloud-native or Spark-based ETL tools like AWS Glue, Azure Data Factory, or GCP Dataflow
- Strong understanding of SQL for data querying and manipulation
- Background in Snowflake for data warehousing
- Familiarity with cloud platforms such as AWS, GCP, or Azure for data storage and processing
- Exceptional problem-solving abilities and meticulous attention to detail
- Strong verbal and written communication skills in English at a B2 level

Nice to have
- Background in ETL using Python
Posted 1 month ago
3.0 years
0 Lacs
Trivandrum, Kerala, India
On-site
Role Description

Key Responsibilities:
- Develop and maintain web applications: Build dynamic, user-centric web applications using React, React Hooks, and modern web technologies like HTML5 and CSS3. Ensure that the application is scalable, maintainable, and easy to navigate for end-users.
- Collaborate with cross-functional teams: Work closely with designers and product teams to bring UI/UX designs to life, ensuring the design vision is executed effectively using HTML and CSS. Ensure the application is responsive, performing optimally across all devices and browsers.
- State management: Utilize Redux to manage and streamline complex application states, ensuring seamless data flow and smooth user interactions.
- Component development: Develop reusable, modular, and maintainable React components using React Hooks and CSS/SCSS to style components effectively. Build component libraries and implement best practices to ensure code maintainability and reusability.

Role Proficiency
This role requires proficiency in data pipeline development, including coding and testing data pipelines for ingesting, wrangling, transforming and joining data from various sources. Must be skilled in ETL tools such as Informatica, Glue, Databricks and DataProc, with coding expertise in Python, PySpark and SQL. Works independently and has a deep understanding of data warehousing solutions including Snowflake, BigQuery, Lakehouse and Delta Lake. Capable of calculating costs and understanding performance issues related to data solutions.

Outcomes
- Act creatively to develop pipelines and applications by selecting appropriate technical options, optimizing application development, maintenance and performance using design patterns and reusing proven solutions.
- Interpret requirements to create optimal architecture and design, developing solutions in accordance with specifications.
- Document and communicate milestones/stages for end-to-end delivery.
- Code adhering to best coding standards; debug and test solutions to deliver best-in-class quality.
- Perform performance tuning of code and align it with the appropriate infrastructure to optimize efficiency.
- Validate results with user representatives, integrating the overall solution seamlessly.
- Develop and manage data storage solutions including relational databases, NoSQL databases and data lakes.
- Stay updated on the latest trends and best practices in data engineering, cloud technologies and big data tools.
- Influence and improve customer satisfaction through effective data solutions.

Measures of Outcomes
- Adherence to engineering processes and standards
- Adherence to schedule/timelines
- Adherence to SLAs where applicable
- # of defects post delivery
- # of non-compliance issues
- Reduction of reoccurrence of known defects
- Quick turnaround of production bugs
- Completion of applicable technical/domain certifications
- Completion of all mandatory training requirements
- Efficiency improvements in data pipelines (e.g. reduced resource consumption, faster run times)
- Average time to detect, respond to and resolve pipeline failures or data issues
- Number of data security incidents or compliance breaches

Outputs Expected

Code Development:
Develop data processing code independently, ensuring it meets performance and scalability requirements. Define coding standards, templates and checklists. Review code for team members and peers.

Documentation:
Create and review templates, checklists, guidelines and standards for design processes and development.
Create and review deliverable documents, including design documents, architecture documents, infrastructure costing, business requirements, source-target mappings, test cases and results.

Configuration:
Define and govern the configuration management plan. Ensure compliance within the team.

Testing:
Review and create unit test cases, scenarios and execution plans. Review the test plan and test strategy developed by the testing team. Provide clarifications and support to the testing team as needed.

Domain Relevance:
Advise data engineers on the design and development of features and components, demonstrating a deeper understanding of business needs. Learn about customer domains to identify opportunities for value addition. Complete relevant domain certifications to enhance expertise.

Project Management:
Manage the delivery of modules effectively.

Defect Management:
Perform root cause analysis (RCA) and mitigation of defects. Identify defect trends and take proactive measures to improve quality.

Estimation:
Create and provide input for effort and size estimation for projects.

Knowledge Management:
Consume and contribute to project-related documents, SharePoint libraries and client universities. Review reusable documents created by the team.

Release Management:
Execute and monitor the release process to ensure smooth transitions.

Design Contribution:
Contribute to the creation of high-level design (HLD), low-level design (LLD) and system architecture for applications, business components and data models.

Customer Interface:
Clarify requirements and provide guidance to the development team. Present design options to customers and conduct product demonstrations.

Team Management:
Set FAST goals and provide constructive feedback. Understand team members' aspirations and provide guidance and opportunities for growth. Ensure team engagement in projects and initiatives.

Certifications:
Obtain relevant domain and technology certifications to stay competitive and informed.

Skill Examples
- Proficiency in SQL, Python or other programming languages used for data manipulation.
- Experience with ETL tools such as Apache Airflow, Talend, Informatica, AWS Glue, Dataproc and Azure ADF.
- Hands-on experience with cloud platforms like AWS, Azure or Google Cloud, particularly with data-related services (e.g. AWS Glue, BigQuery).
- Conduct tests on data pipelines and evaluate results against data quality and performance specifications.
- Experience in performance tuning of data processes.
- Expertise in designing and optimizing data warehouses for cost efficiency.
- Ability to apply and optimize data models for efficient storage, retrieval and processing of large datasets.
- Capacity to clearly explain and communicate design and development aspects to customers.
- Ability to estimate time and resource requirements for developing and debugging features or components.

Knowledge Examples
- Knowledge of various ETL services offered by cloud providers, including Apache PySpark, AWS Glue, GCP DataProc/DataFlow, Azure ADF and ADLF.
- Proficiency in SQL for analytics, including windowing functions.
- Understanding of data schemas and models relevant to various business contexts.
- Familiarity with domain-related data and its implications.
- Expertise in data warehousing optimization techniques.
- Knowledge of data security concepts and best practices.
- Familiarity with design patterns and frameworks in data engineering.
Additional Comments
Frontend developer. Required skills and experience:

React.js Proficiency:
- In-depth knowledge of React.js, JSX, React Hooks, and React Router.
- Experience with state management using Redux or similar libraries.
- Familiar with React performance optimization techniques, including lazy loading, memoization, and code splitting.
- Experience with tools like react-testing-library and NPM packages (Vite, Yup, Formik).

CSS Expertise:
- Strong proficiency in CSS, including the use of third-party frameworks like Material-UI (MUI) and Tailwind CSS for styling.
- Ability to create responsive, visually appealing layouts with modern CSS practices.

JavaScript/ES6+ Expertise:
- Strong command of modern JavaScript (ES6+), including async/await, destructuring, and class-based components.
- Familiarity with other JavaScript frameworks and libraries such as TypeScript is a bonus.

Version Control:
- Proficient with Git and platforms like GitHub or GitLab, including managing pull requests and version control workflows.

API Integration:
- Experienced in interacting with RESTful APIs.
- Understanding of authentication mechanisms like JWT.

Testing:
- Able to write unit, integration, and end-to-end tests using tools such as react-testing-library.

Basic Qualifications:
- At least 3 years of experience working with JavaScript frameworks, particularly React.js.
- Experience working in cloud environments (AWS, Azure, Google Cloud) is a plus.
- Basic understanding of backend technologies such as Python is advantageous.

Skills: Cloud Services, Backend Systems, CSS
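The windowing-functions item under Knowledge Examples above invites a short example: a PySpark sketch, with hypothetical sales data, that ranks rows within a partition, mirroring SQL's ROW_NUMBER() OVER (PARTITION BY ... ORDER BY ...):

```python
from pyspark.sql import SparkSession, Window
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("window-sketch").getOrCreate()

sales = spark.createDataFrame(
    [("North", "A", 300), ("North", "B", 500), ("South", "A", 400), ("South", "C", 250)],
    ["region", "product", "revenue"],
)

# Rank products by revenue within each region.
w = Window.partitionBy("region").orderBy(F.desc("revenue"))
ranked = sales.withColumn("rank", F.row_number().over(w))

ranked.show()
```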
Posted 1 month ago
5.0 years
0 Lacs
Pune, Maharashtra, India
On-site
We are seeking a highly skilled and motivated Lead GCP Data Engineer to join our team. The role is critical to the development of cutting-edge enterprise data products and solutions. The GCP Data Engineer will design, implement, and maintain scalable, reliable, and efficient data solutions on Google Cloud Platform (GCP). The role focuses on enabling data-driven decision-making by developing ETL/ELT pipelines, managing large-scale datasets, and optimizing data workflows. The ideal candidate is a proactive problem-solver with strong technical expertise in GCP, a passion for data engineering, and a commitment to delivering high-quality solutions aligned with business needs.

Job Description:

Key Responsibilities

Data Engineering & Development:
- Design, build, and maintain scalable ETL/ELT pipelines for ingesting, processing, and transforming structured and unstructured data.
- Implement enterprise-level data solutions using GCP services such as BigQuery, Cloud Storage, Dataflow, Cloud Functions, Cloud Pub/Sub, and Cloud Composer.
- Develop and optimize data architectures that support real-time and batch data processing.

Cloud Infrastructure Management:
- Manage and deploy GCP infrastructure components to enable seamless data workflows.
- Ensure data solutions are robust, scalable, and cost-effective, leveraging GCP best practices.

Collaboration and Stakeholder Engagement:
- Work closely with cross-functional teams, including data analysts, data scientists, DevOps, and business stakeholders, to deliver data projects aligned with business goals.
- Translate business requirements into scalable, technical solutions while collaborating with team members to ensure successful implementation.

Quality Assurance & Optimization:
- Implement best practices for data governance, security, and privacy, ensuring compliance with organizational policies and regulations.
- Conduct thorough quality assurance, including testing and validation, to ensure the accuracy and reliability of data pipelines.
- Monitor and optimize pipeline performance to meet SLAs and minimize operational costs.

Qualifications and Certifications
- Education: Bachelor’s or Master’s degree in Computer Science, Information Technology, Engineering, or a related field.
- Experience: Minimum of 5 years of experience in data engineering, with at least 3 years working on GCP cloud platforms. Proven experience designing and implementing data workflows using GCP services like BigQuery, Cloud Dataflow, Dataform, Cloud Pub/Sub, and Cloud Composer.
- Certifications: Google Cloud Professional Data Engineer certification preferred.

Key Skills

Mandatory Skills:
- Advanced proficiency in Python for data pipelines and automation.
- Strong SQL skills for querying, transforming, and analyzing large datasets.
- Expertise in GCP services such as BigQuery, Cloud Functions, DBT, Cloud Storage, Dataflow, and Kubernetes (GKE).
- Hands-on experience with CI/CD tools such as Jenkins, Git, or Bitbucket.
- Familiarity with workflow orchestration tools like Apache Airflow or Cloud Composer.

Nice-to-Have Skills:
- Experience with other cloud platforms like AWS or Azure.
- Knowledge of data visualization tools (e.g., Looker, Tableau).
- Understanding of machine learning workflows and their integration with data pipelines.

Soft Skills:
- Strong problem-solving and critical-thinking abilities.
- Excellent communication skills to collaborate with technical and non-technical stakeholders.
- Proactive attitude towards innovation and learning.
- Ability to work independently and as part of a collaborative team.
Location: Mumbai
Brand: Merkle
Time Type: Full time
Contract Type: Permanent
Posted 1 month ago
5.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Apply with updated CV to hr@bitstringit.com

Main Tasks:
- Maintain and develop data platforms based on Microsoft Fabric for business intelligence and Databricks for real-time data analytics
- Design, implement and maintain standardized production-grade data pipelines using modern data transformation processes and workflows for SAP, MS Dynamics, on-premise or cloud
- Develop an enterprise-scale cloud-based data lake for business intelligence solutions
- Translate business and customer needs into data collection, preparation and processing requirements
- Optimize the performance of algorithms developed by Data Scientists
- General administration and monitoring of the data platforms

Competencies:
- Working with structured and unstructured data
- Experienced in various database technologies (RDBMS, OLAP, time series, etc.)
- Solid programming skills (Python, SQL; Scala is a plus)
- Experience in Microsoft Fabric (incl. Warehouse, Lakehouse, Data Factory, DataFlow Gen2, Semantic Model) and/or Databricks (Spark)
- Proficient in Power BI
- Experienced working with APIs
- Proficient in security best practices
- Data-centric Azure know-how is a plus (Storage, Networking, Security, Billing)

Education / experience / language:
• Bachelor's or Master's degree in business informatics, computer science, or equivalent
• A background in software engineering (e.g., agile programming, project organization) and experience with human-centered design would be desirable
• Extensive experience in handling large data sets
• At least 5 years of experience working as a data engineer, preferably in an industrial company
• Analytical problem-solving skills and the ability to assimilate complex information
• Programming experience in modern data-oriented languages (SQL, Python)
• Experience with Apache Spark and DevOps
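For the real-time analytics task, here is a minimal Spark Structured Streaming sketch; it uses the built-in rate source so it runs anywhere PySpark is installed, where a production pipeline would read from Kafka or Event Hubs instead:

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("streaming-sketch").getOrCreate()

# The rate source emits (timestamp, value) rows; it stands in here for a
# real feed such as Kafka or Azure Event Hubs.
stream = spark.readStream.format("rate").option("rowsPerSecond", 5).load()

# Aggregate events into 10-second tumbling windows.
counts = stream.groupBy(F.window("timestamp", "10 seconds")).count()

query = (
    counts.writeStream
    .outputMode("complete")
    .format("console")
    .start()
)
query.awaitTermination()
```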
Posted 1 month ago
4.0 - 7.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Data Engineer (SaaS-Based)
Location: Noida (in-office/hybrid; client site if required)
Experience: 4-7 years
Type: Full-Time | Immediate joiners preferred
Shift: 3 PM to 12 AM IST
Client: Leading tech company
Good to have: GCP Certified Data Engineer

Overview of the Role
As a GCP Data Engineer, you'll focus on solving problems and creating value for the business by building solutions that are reliable and scalable to work with the size and scope of the company. You will be tasked with creating custom-built pipelines as well as migrating on-prem data pipelines to the GCP stack. You will be part of a team tackling intricate problems by designing and deploying reliable and scalable solutions tailored to the company's data landscape.

Required Skills
- 5+ years of industry experience in software development, data engineering, business intelligence, or a related field, with experience in manipulating, processing, and extracting value from datasets.
- Extensive experience in requirement discovery, analysis and data pipeline solution design.
- Design, build and deploy internal applications to support our technology life cycle, collaboration and spaces, service delivery management, data and business intelligence among others.
- Build modular code for reusable pipelines or any kind of complex ingestion framework used to ease the job of loading data into the data lake or data warehouse from multiple sources (see the sketch after this list).
- Work closely with analysts and business process owners to translate business requirements into technical solutions.
- Coding experience in scripting and languages (Python, SQL, PySpark).
- Expertise in Google Cloud Platform (GCP) technologies in the data warehousing space (BigQuery, GCP Workflows, Cloud Scheduler, Secret Manager, Batch, Cloud Logging, Cloud SDK, Google Cloud Storage, IAM).
- Exposure to Google Dataproc and Dataflow.
- Maintain the highest levels of development practice, including: technical design, solution development, systems configuration, test documentation/execution, issue identification and resolution, and writing clean, modular and self-sustaining code, with repeatable quality and predictability.
- Understanding of CI/CD processes using Pulumi, GitHub, Cloud Build, Cloud SDK, Docker.
- Experience with SAS/SQL Server/SSIS is an added advantage.

Qualifications
- Bachelor's degree in Computer Science or a related technical field, or equivalent practical experience.
- GCP Certified Data Engineer (preferred).
- Excellent verbal and written communication skills with the ability to effectively advocate technical solutions to other engineering teams and business audiences.
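The modular-pipeline requirement above suggests a pattern worth sketching: a tiny ingestion framework in plain Python with pluggable sources, transforms, and sinks; all names here are illustrative, not from any specific library:

```python
from dataclasses import dataclass
from typing import Callable, Iterable, List

Record = dict

@dataclass
class Pipeline:
    """A reusable ingestion pipeline: one source, a chain of transforms, one sink."""
    source: Callable[[], Iterable[Record]]
    transforms: List[Callable[[Record], Record]]
    sink: Callable[[Iterable[Record]], None]

    def run(self) -> None:
        rows = self.source()
        for transform in self.transforms:
            rows = (transform(r) for r in rows)
        self.sink(rows)

# Pluggable pieces; a real framework would swap in API readers,
# GCS writers, or BigQuery loaders behind the same interfaces.
def csv_like_source() -> Iterable[Record]:
    yield {"id": "1", "amount": "10.5"}
    yield {"id": "2", "amount": "7.25"}

def cast_amount(row: Record) -> Record:
    return {**row, "amount": float(row["amount"])}

def console_sink(rows: Iterable[Record]) -> None:
    for row in rows:
        print(row)

Pipeline(csv_like_source, [cast_amount], console_sink).run()
```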
Posted 1 month ago
0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Job Description
Technical Skills:
Proficient in Java, Angular, or any JavaScript technology, with experience in designing and deploying cloud-based data pipelines and microservices using GCP tools like BigQuery, Dataflow, and Dataproc.
Ability to leverage best-in-class data platform technologies (Apache Beam, Kafka, …) to deliver platform features, and to design and orchestrate platform services to deliver data platform capabilities (a minimal Beam sketch follows this listing).
Service-Oriented Architecture and Microservices: Strong understanding of SOA, microservices, and their application within a cloud data platform context. Develop robust, scalable services using Java Spring Boot, Python, Angular, and GCP technologies.
Full-Stack Development: Knowledge of front-end and back-end technologies, enabling collaboration on data access and visualization layers (e.g., React, Node.js). Design and develop RESTful APIs for seamless integration across platform services. Implement robust unit and functional tests to maintain high standards of test coverage and quality.
Database Management: Experience with relational (e.g., PostgreSQL, MySQL) and NoSQL databases, as well as columnar databases like BigQuery.
Data Governance and Security: Understanding of data governance frameworks and of implementing RBAC, encryption, and data masking in cloud environments.
CI/CD and Automation: Familiarity with CI/CD pipelines, Infrastructure as Code (IaC) tools like Terraform, and automation frameworks. Manage code changes with GitHub and troubleshoot and resolve application defects efficiently. Ensure adherence to SDLC best practices, independently managing feature design, coding, testing, and production releases.
Problem-Solving: Strong analytical skills with the ability to troubleshoot complex data platform and microservices issues.
Certifications (Preferred): GCP Data Engineer, GCP Professional Cloud
Responsibilities
Design and Build Data Pipelines: Architect, develop, and maintain scalable data pipelines and microservices that support real-time and batch processing on GCP.
Service-Oriented Architecture (SOA) and Microservices: Design and implement SOA and microservices-based architectures to ensure modular, flexible, and maintainable data solutions.
Full-Stack Integration: Leverage your full-stack expertise to contribute to the seamless integration of front-end and back-end components, ensuring robust data access and UI-driven data exploration.
Data Ingestion and Integration: Lead the ingestion and integration of data from various sources into the data platform, ensuring data is standardized and optimized for analytics.
GCP Data Solutions: Utilize GCP services (BigQuery, Dataflow, Pub/Sub, Cloud Functions, etc.) to build and manage data platforms that meet business needs.
Data Governance and Security: Implement and manage data governance, access controls, and security best practices while leveraging GCP’s native row- and column-level security features.
Performance Optimization: Continuously monitor and improve the performance, scalability, and efficiency of data pipelines and storage solutions.
Collaboration and Best Practices: Work closely with data architects, software engineers, and cross-functional teams to define best practices, design patterns, and frameworks for cloud data engineering.
Automation and Reliability: Automate data platform processes to enhance reliability, reduce manual intervention, and improve operational efficiency.
Qualifications
Bachelor’s degree in Computer Science, Data Engineering, Information Systems, or a related field.
Master’s degree or equivalent experience preferred.
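The listing above names Apache Beam among its platform technologies; as noted there, a minimal Beam pipeline sketch follows. It runs locally on the DirectRunner by default; the bucket paths are hypothetical, and deploying to Dataflow would additionally require the standard runner and project flags.

```python
# Minimal Apache Beam sketch of a batch pipeline that could run on Dataflow.
# Requires apache-beam[gcp] to read/write gs:// paths; the paths are hypothetical.
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

options = PipelineOptions()  # pass --runner=DataflowRunner plus GCP flags to deploy

with beam.Pipeline(options=options) as p:
    (
        p
        | "Read" >> beam.io.ReadFromText("gs://my-bucket/input/*.json")   # hypothetical input
        | "Strip" >> beam.Map(lambda line: line.strip())
        | "DropEmpty" >> beam.Filter(lambda line: line)                   # discard blank lines
        | "Write" >> beam.io.WriteToText("gs://my-bucket/output/cleaned") # hypothetical output
    )
```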
Posted 1 month ago
3.0 - 5.0 years
5 - 7 Lacs
Chennai, Tamil Nadu
Work from Office
Duration: 12 Months
Work Type: Onsite
Position Description:
We are seeking an experienced GCP Data Engineer who can build a cloud analytics platform to meet ever-expanding business requirements with speed and quality using lean Agile practices. You will analyze and manipulate large datasets supporting the enterprise, activating data assets to support Enabling Platforms and Analytics in the Google Cloud Platform (GCP). You will be responsible for designing the transformation and modernization on GCP, as well as landing data from source applications to GCP. Experience with large-scale solutioning and operationalization of data warehouses, data lakes, and analytics platforms on Google Cloud Platform or another cloud environment is a must. We are looking for candidates with a broad set of technology skills across these areas who can demonstrate an ability to design the right solutions, with the appropriate combination of GCP and third-party technologies, for deployment on Google Cloud Platform.
Skills Required:
Experience in working on an implementation team from concept to operations, providing deep technical subject matter expertise for successful deployment.
Implement methods for automation of all parts of the pipeline to minimize labor in development and production.
Experience in analyzing complex data, organizing raw data, and integrating massive datasets from multiple data sources to build subject areas and reusable data products.
Experience in working with architects to evaluate and productionize appropriate GCP tools for data ingestion, integration, presentation, and reporting.
Experience in working with all stakeholders to formulate business problems as technical data requirements, identifying and implementing technical solutions while ensuring key business drivers are captured in collaboration with product management.
Proficient in Machine Learning model architecture, data pipeline interaction, and metrics interpretation, including designing and deploying pipelines with automated data lineage.
Identify, develop, evaluate, and summarize proofs of concept to prove out solutions; test and compare competing solutions and report a point of view on the best solution.
Integration between GCP Data Catalog and Informatica EDC.
Design and build production data engineering solutions that deliver pipeline patterns using Google Cloud Platform (GCP) services: BigQuery, Dataflow, Pub/Sub, Bigtable, Data Fusion, Dataproc, Cloud Composer, Cloud SQL, Compute Engine, Cloud Functions, and App Engine (a minimal Pub/Sub sketch follows this listing).
Skills Preferred:
Strong drive for results and ability to multi-task and work independently.
Self-starter with proven innovation skills.
Ability to communicate and work with cross-functional teams and all levels of management.
Demonstrated commitment to quality and project timing.
Demonstrated ability to document complex systems.
Experience in creating and executing detailed test plans.
Experience Required: 3 to 5 Yrs
Education Required: BE or Equivalent
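Among the GCP services the listing above names, Pub/Sub is a common entry point for landing data from source applications. Here is a minimal publish sketch using the google-cloud-pubsub client; the project id, topic name, and payload are hypothetical.

```python
# Minimal Pub/Sub publish sketch; project and topic names are hypothetical.
from google.cloud import pubsub_v1

publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path("my-project", "ingest-events")  # hypothetical

# Publish a small JSON payload; the message data must be a bytestring.
future = publisher.publish(topic_path, b'{"order_id": 123, "status": "created"}')
print(f"Published message id: {future.result()}")
```

Downstream, such a topic is typically consumed by a Dataflow job or a push subscription that lands the events in BigQuery.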
Posted 1 month ago
4.0 years
0 Lacs
Greater Kolkata Area
On-site
Job Description
As a Technical Lead, you will work on both offshore and onsite client projects. You will work on projects involving Oracle BI Applications/FAW or OBIEE/OAC/ODI implementations. You will interact with clients to understand and gather requirements, and you will be responsible for technical design, development, and system/integration testing using Oracle methodologies.
Desired Profile
End-to-end ODI, OAC, and Oracle BI Applications/FAW implementation experience
Expert knowledge of BI Applications/FAW, including basic and advanced configurations with Oracle eBS Suite/Fusion as the source system
Expert knowledge of OBIEE/OAC RPD design and reports design
Expert knowledge of ETL (ODI) design, OCI DI, and OCI Dataflow
Mandatory to have at least one of these skills: PL/SQL, BI Publisher, or BI Apps
Good to have EDQ and PySpark skills
Architectural solution definition
Any industry-standard certifications will be a plus
Good knowledge of Oracle Database and development experience in database applications
Creativity, personal drive, influencing and negotiating, problem solving
Building effective relationships, customer focus, effective communication, coaching
Ready to travel as and when required by the project
Experience
8-12 yrs of data warehousing and business intelligence project experience
4-6 years of project experience on BI Applications/FAW and OBIEE/OAC/ODI/OCI DI, with at least 2 complete lifecycle implementations
4-6 yrs of specialized BI Applications and OBIEE/OAC/ODI/OCI DI customization and solution architecture experience
Worked recently on Financial, SCM, or HR Analytics implementation and configuration
Career Level - IC3
About Us
As a world leader in cloud solutions, Oracle uses tomorrow’s technology to tackle today’s challenges.
We’ve partnered with industry-leaders in almost every sector—and continue to thrive after 40+ years of change by operating with integrity. We know that true innovation starts when everyone is empowered to contribute. That’s why we’re committed to growing an inclusive workforce that promotes opportunities for all. Oracle careers open the door to global opportunities where work-life balance flourishes. We offer competitive benefits based on parity and consistency and support our people with flexible medical, life insurance, and retirement options. We also encourage employees to give back to their communities through our volunteer programs. We’re committed to including people with disabilities at all stages of the employment process. If you require accessibility assistance or accommodation for a disability at any point, let us know by emailing accommodation-request_mb@oracle.com or by calling +1 888 404 2494 in the United States. Oracle is an Equal Employment Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, sexual orientation, gender identity, disability and protected veterans’ status, or any other characteristic protected by law. Oracle will consider for employment qualified applicants with arrest and conviction records pursuant to applicable law.
Posted 1 month ago
4.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Job Description
As a Technical Lead, you will work on both offshore and onsite client projects. You will work on projects involving Oracle BI Applications/FAW or OBIEE/OAC/ODI implementations. You will interact with clients to understand and gather requirements, and you will be responsible for technical design, development, and system/integration testing using Oracle methodologies.
Desired Profile
End-to-end ODI, OAC, and Oracle BI Applications/FAW implementation experience
Expert knowledge of BI Applications/FAW, including basic and advanced configurations with Oracle eBS Suite/Fusion as the source system
Expert knowledge of OBIEE/OAC RPD design and reports design
Expert knowledge of ETL (ODI) design, OCI DI, and OCI Dataflow
Mandatory to have at least one of these skills: PL/SQL, BI Publisher, or BI Apps
Good to have EDQ and PySpark skills
Architectural solution definition
Any industry-standard certifications will be a plus
Good knowledge of Oracle Database and development experience in database applications
Creativity, personal drive, influencing and negotiating, problem solving
Building effective relationships, customer focus, effective communication, coaching
Ready to travel as and when required by the project
Experience
8-12 yrs of data warehousing and business intelligence project experience
4-6 years of project experience on BI Applications/FAW and OBIEE/OAC/ODI/OCI DI, with at least 2 complete lifecycle implementations
4-6 yrs of specialized BI Applications and OBIEE/OAC/ODI/OCI DI customization and solution architecture experience
Worked recently on Financial, SCM, or HR Analytics implementation and configuration
Career Level - IC3
About Us
As a world leader in cloud solutions, Oracle uses tomorrow’s technology to tackle today’s challenges.
We’ve partnered with industry-leaders in almost every sector—and continue to thrive after 40+ years of change by operating with integrity. We know that true innovation starts when everyone is empowered to contribute. That’s why we’re committed to growing an inclusive workforce that promotes opportunities for all. Oracle careers open the door to global opportunities where work-life balance flourishes. We offer competitive benefits based on parity and consistency and support our people with flexible medical, life insurance, and retirement options. We also encourage employees to give back to their communities through our volunteer programs. We’re committed to including people with disabilities at all stages of the employment process. If you require accessibility assistance or accommodation for a disability at any point, let us know by emailing accommodation-request_mb@oracle.com or by calling +1 888 404 2494 in the United States. Oracle is an Equal Employment Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, sexual orientation, gender identity, disability and protected veterans’ status, or any other characteristic protected by law. Oracle will consider for employment qualified applicants with arrest and conviction records pursuant to applicable law.
Posted 1 month ago
4.0 - 7.0 years
8 - 14 Lacs
Noida
Hybrid
Data Engineer (L3) || GCP Certified
Employment Type : Full-Time
Work Mode : In-office/Hybrid
Notice : Immediate joiners
As a Data Engineer, you will design, develop, and support data pipelines and related data products and platforms. Your primary responsibilities include designing and building data extraction, loading, and transformation pipelines across on-prem and cloud platforms. You will perform application impact assessments, requirements reviews, and work estimates. Additionally, you will develop test strategies and site reliability engineering measures for data products and solutions, participate in agile development "scrums" and solution reviews, mentor junior Data Engineering Specialists, lead the resolution of critical operations issues, and perform technical data stewardship tasks, including metadata management, security, and privacy by design.
Required Skills :
Design, develop, and support data pipelines and related data products and platforms.
Design and build data extraction, loading, and transformation pipelines and data products across on-prem and cloud platforms.
Perform application impact assessments, requirements reviews, and work estimates.
Develop test strategies and site reliability engineering measures for data products and solutions.
Participate in agile development "scrums" and solution reviews.
Mentor junior Data Engineers.
Lead the resolution of critical operations issues, including post-implementation reviews.
Perform technical data stewardship tasks, including metadata management, security, and privacy by design.
Design and build data extraction, loading, and transformation pipelines using Python and other GCP data technologies.
Demonstrate SQL and database proficiency across various data engineering tasks.
Automate data workflows by setting up DAGs in tools like Control-M, Apache Airflow, and Prefect (see the sketch after this listing).
Develop Unix scripts to support various data operations.
Model data to support business intelligence and analytics initiatives.
Utilize infrastructure-as-code tools such as Terraform, Puppet, and Ansible for deployment automation.
Expertise in GCP data warehousing technologies, including BigQuery, Cloud SQL, Dataflow, Data Catalog, Cloud Composer, Google Cloud Storage, IAM, Compute Engine, Cloud Data Fusion, and Dataproc (good to have).
Keywords : data pipelines, agile development, scrums, GCP data technologies, Python, DAGs, Control-M, Apache Airflow, data solution architecture
Qualifications :
Bachelor's degree in Software Engineering, Computer Science, Business, Mathematics, or a related field.
4+ years of data engineering experience.
2 years of data solution architecture and design experience.
GCP Certified Data Engineer (preferred).
Job Type : Full-time
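As referenced in the DAG bullet above, here is a minimal Apache Airflow DAG sketch for orchestrating a daily extract-and-load task. The DAG id, schedule, and callable are illustrative placeholders, not details from the posting.

```python
# Minimal Apache Airflow (2.x) DAG sketch for a daily extract/load task.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_and_load():
    # Placeholder for the actual extraction and load logic.
    print("extracting from source and loading to warehouse")


with DAG(
    dag_id="daily_extract_load",    # hypothetical DAG id
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,                  # do not backfill past runs
) as dag:
    PythonOperator(
        task_id="extract_and_load",
        python_callable=extract_and_load,
    )
```

The same structure transfers to Prefect flows or Control-M job definitions; only the scheduling wrapper changes.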
Posted 1 month ago
15.0 years
0 Lacs
India
Remote
Job Title: Lead Data Engineering Manager – GCP Cloud Migration
Location: [Remote]
Experience: 12–15 Years (5+ Years Leading Data Teams, 8+ Years in Data Engineering)
Employment Type: Full-Time
About the Role:
We are seeking an experienced Lead Data Engineering Manager to drive the end-to-end migration of enterprise data platforms, ETL pipelines, and data warehouses to the cloud, with a focus on Google Cloud Platform (GCP). This role will lead high-performing engineering teams and collaborate with cross-functional stakeholders to architect and execute scalable, secure, and modern data solutions using BigQuery, Dataform, Dataplex, and other cloud-native tools. A background in premium consulting or strategic technology advisory is highly preferred, as this role will engage with executive stakeholders and contribute to data transformation strategies at the enterprise level.
Key Responsibilities:
Lead and mentor Data Engineering teams across design, development, and deployment of modern cloud data architectures.
Drive cloud migration initiatives, including re-platforming legacy ETL workflows and on-prem DWHs to GCP-based solutions.
Architect and implement scalable data pipelines using BigQuery, Dataform, and orchestration tools.
Ensure robust data governance and cataloging practices, leveraging Dataplex and other GCP services.
Collaborate with data analysts, data scientists, and business stakeholders to enable advanced analytics and ML capabilities.
Establish and enforce engineering best practices, CI/CD pipelines, and monitoring strategies.
Provide technical leadership, project planning, and resource management to deliver projects on time and within scope.
Represent the data engineering function in client or leadership meetings, especially in a consulting or multi-client context.
Required Skills & Qualifications:
12–15 years of total experience, with 8+ years in data engineering and 5+ years in team leadership roles.
Proven expertise in cloud-based data platforms, especially GCP (BigQuery, Dataflow, Dataform, Dataplex, Cloud Composer, Pub/Sub).
Strong knowledge of modern ETL/ELT practices, data modeling, and pipeline orchestration.
Experience with data warehouse modernization and migration from platforms like Teradata, Oracle, or Hadoop to GCP.
Familiarity with data governance, metadata management, and data cataloging.
Background in consulting or strategic data advisory with Fortune 500 clients preferred.
Hands-on skills in SQL, Python, and cloud infrastructure-as-code (e.g., Terraform).
Strong communication, stakeholder engagement, and leadership presence.
Preferred Qualifications:
GCP Data Engineer or Architect Certification.
Experience with agile methodologies and DevOps practices.
Prior work with multi-cloud or hybrid environments is a plus.
Experience in regulated industries (finance, healthcare, etc.) is advantageous.
Posted 1 month ago
4.0 - 7.0 years
10 - 14 Lacs
Noida
Work from Office
Location: Noida (In-office/Hybrid; client site if required)
Type: Full-Time | Immediate Joiners Preferred
Must-Have Skills:
GCP (BigQuery, Dataflow, Dataproc, Cloud Storage)
PySpark / Spark
Distributed computing expertise
Apache Iceberg (preferred), Hudi, or Delta Lake
Role Overview:
Be part of a high-impact Data Engineering team focused on building scalable, cloud-native data pipelines. You'll support and enhance EMR platforms using DevOps principles, helping deliver real-time health alerts and diagnostics for platform performance.
Key Responsibilities:
Provide data engineering support to EMR platforms
Design and implement cloud-native, automated data solutions
Collaborate with internal teams to deliver scalable systems
Continuously improve infrastructure reliability and observability
Technical Environment:
Databases: Oracle, MySQL, MSSQL, MongoDB
Distributed Engines: Spark/PySpark, Presto, Flink/Beam
Cloud Infra: GCP (preferred), AWS (nice-to-have), Terraform
Big Data Formats: Iceberg, Hudi, Delta
Tools: SQL, Data Modeling, Palantir Foundry, Jenkins, Confluence
Bonus: Stats/math tools (NumPy, PyMC3), Linux scripting
Ideal for engineers with cloud-native, real-time data platform experience, especially those who have worked with EMR and modern lakehouse stacks. A minimal PySpark sketch of the kind of batch transform this stack implies follows below.
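As promised above, here is a minimal PySpark batch sketch: it aggregates hypothetical platform metrics and writes the result in a lakehouse table format. The paths, column names, and the choice of Delta are assumptions (Iceberg or Hudi would be analogous), and the delta-spark package is assumed to be available on the cluster.

```python
# Minimal PySpark sketch of a batch transform writing to a lakehouse format.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("platform-health-sketch").getOrCreate()

# Read raw platform metrics (hypothetical path and schema).
raw = spark.read.parquet("gs://my-bucket/raw/platform_metrics/")

# Aggregate error counts per service per day to feed health alerting.
daily_errors = (
    raw.filter(F.col("status") == "ERROR")
    .groupBy("service", F.to_date("event_ts").alias("event_date"))
    .count()
)

# Write as a Delta table; assumes delta-spark is installed on the cluster.
daily_errors.write.format("delta").mode("overwrite").save(
    "gs://my-bucket/gold/daily_errors/"
)
```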
Posted 1 month ago