About the Role: We are seeking a highly skilled and experienced Data Engineer to join our growing team. As a Data Engineer, you will play a critical role in designing, building, and scaling Google's massive data infrastructure and platforms. You will be a technical leader and mentor, driving innovation and ensuring the highest standards of data quality, reliability, and performance.

Responsibilities:

Design and Architecture:
- Design and implement scalable, reliable, and efficient data pipelines and architectures for various Google products and services.
- Develop and maintain data models, schemas, and ontologies to support diverse data sources and use cases.
- Evaluate and recommend new and emerging data technologies and tools to improve Google's data infrastructure.
- Collaborate with product managers, engineers, and researchers to define data requirements and translate them into technical solutions.

Data Processing and Pipelines:
- Build and optimize batch and real-time data pipelines using Google Cloud Platform (GCP) services such as Dataflow, Dataproc, Pub/Sub, and Cloud Functions.
- Develop and implement data quality checks and validation processes to ensure data accuracy and consistency.
- Design and implement data governance policies and procedures to ensure data security and compliance.

Data Storage and Management:
- Design and implement scalable data storage solutions using GCP services such as BigQuery, Cloud Storage, and Spanner.
- Optimize data storage and retrieval for performance and cost-effectiveness.
- Implement data lifecycle management policies and procedures.

Team Leadership and Mentorship:
- Provide technical leadership and guidance to data engineers and other team members.
- Mentor and coach junior engineers to develop their skills and expertise.
- Foster a culture of innovation and collaboration within the team.

Qualifications:
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
- 8+ years of experience in data engineering or a related field.
- Strong understanding of data warehousing, data modeling, and ETL processes.
- Expertise in designing and implementing large-scale data pipelines and architectures.
- Proficiency in SQL and at least one programming language such as Python or Java.
- Experience with Google Cloud Platform (GCP) services such as BigQuery, Dataflow, Dataproc, Pub/Sub, and Cloud Storage.
- Experience with open-source data processing frameworks such as Hadoop, Spark, and Kafka.
- Excellent communication, interpersonal, and collaboration skills.

Preferred Qualifications:
- Experience with data governance and data quality management.
- Experience with machine learning and data science.
- Experience with containerization and orchestration technologies such as Docker and Kubernetes.
- Contributions to open-source projects or communities.
- Google Cloud Professional Data Engineer certification.
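To make the "data quality checks and validation" duty above concrete, here is a minimal, framework-free sketch. The field names and rules are hypothetical; in practice, logic like this would run inside a Dataflow pipeline step or a SQL-based test suite rather than standalone Python.

```python
# Illustrative sketch only: the kind of data quality check this role describes.
# Field names and rules below are hypothetical examples.

def validate_record(record, required_fields, type_map):
    """Return a list of human-readable issues found in one record."""
    issues = []
    for field in required_fields:
        if record.get(field) in (None, ""):
            issues.append(f"missing required field: {field}")
    for field, expected in type_map.items():
        value = record.get(field)
        if value is not None and not isinstance(value, expected):
            issues.append(f"{field}: expected {expected.__name__}, got {type(value).__name__}")
    return issues

def run_quality_checks(records, required_fields, type_map):
    """Split records into clean rows and (row, issues) pairs for quarantine."""
    clean, quarantined = [], []
    for record in records:
        issues = validate_record(record, required_fields, type_map)
        if issues:
            quarantined.append((record, issues))
        else:
            clean.append(record)
    return clean, quarantined
```

Routing bad rows to a quarantine list (a "dead-letter" path) rather than dropping them silently is the usual design choice: it preserves evidence for debugging upstream sources.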
As a highly skilled and innovative Generative AI Engineer at Google, you will lead the design, development, and implementation of cutting-edge AI solutions. Your role will involve driving the adoption and advancement of generative AI across various Google products, services, and research initiatives. To excel in this position, you must possess a deep understanding of AI/ML principles, strong engineering skills, and a passion for pushing the boundaries of generative AI capabilities.

Your responsibilities will include providing technical leadership in the design, development, and deployment of generative AI solutions. You will need to stay updated on the latest research and advancements in generative AI, identify and evaluate new technologies and tools, and design scalable and reliable architectures for generative AI systems. Additionally, you will lead the development and training of state-of-the-art generative AI models, optimize model performance, and implement responsible AI practices such as fairness, bias detection, and explainability. Collaboration and mentorship are crucial aspects of this role: you will collaborate with researchers, engineers, and product managers, mentor other engineers on generative AI best practices, and contribute to Google's AI community and thought leadership.

To qualify for this position, you should have a PhD or Master's degree in Computer Science, Artificial Intelligence, or a related field; a strong understanding of deep learning, natural language processing, and/or computer vision; expertise in generative AI models and frameworks; proficiency in Python and relevant AI/ML libraries; experience with large-scale data processing and distributed systems; and excellent communication, interpersonal, and collaboration skills.
Preferred qualifications include experience with Google Cloud Platform (GCP) and its AI/ML services, publications or patents in generative AI, contributions to open-source AI projects, and experience with AI ethics and responsible AI development.
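The "fairness, bias detection" responsibility above can be illustrated with one simple metric. This is a sketch of demographic parity difference, not any specific Google tooling; the group labels and data are hypothetical.

```python
# Illustrative sketch only: demographic parity difference, a basic fairness
# metric a responsible-AI review might compute over model outputs.

def demographic_parity_difference(predictions, groups):
    """Largest gap in positive-prediction rate between any two groups.

    predictions: iterable of 0/1 model outputs.
    groups: iterable of group labels, aligned with predictions.
    """
    totals, positives = {}, {}
    for pred, group in zip(predictions, groups):
        totals[group] = totals.get(group, 0) + 1
        positives[group] = positives.get(group, 0) + pred
    rates = [positives[g] / totals[g] for g in totals]
    return max(rates) - min(rates)
```

A value of 0 means every group receives positive predictions at the same rate; larger values flag a disparity worth investigating, though no single metric captures fairness on its own.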
Experience: 7 to 10 years

Role Description: Evonence is looking for a Generative AI Practice Head to drive end-to-end AI solution development, from architecture and model training to responsible deployment and mentorship. The role balances research, engineering, and leadership, making it ideal for someone with a strong academic background and hands-on experience in production-level AI systems.

Key Responsibilities:
- Lead AI initiatives and drive innovation in generative models.
- Evaluate and adopt cutting-edge AI trends (e.g., from academic papers, NeurIPS, CVPR).
- Be a go-to authority for generative AI direction.
- Design production-grade AI systems using frameworks like transformers, GANs, or diffusion models.
- Focus on performance, scalability, and security, including deployment on distributed platforms.
- Build and fine-tune large-scale models.
- Embed Responsible AI practices: mitigate bias, improve transparency.
- Act as a bridge between research and product teams.
- Mentor junior engineers, encourage cross-team collaboration, and shape internal AI strategy.

Required Skills:
- Advanced degree (PhD or Master's) in AI/ML or related fields.
- Expertise in transformer architectures, GANs, and diffusion models.
- Experience with Python, TensorFlow, JAX, and distributed computing.
- Solid grasp of natural language processing, computer vision, or multi-modal AI.

Good to Have:
- Experience with Google Cloud Platform (GCP).
- Publications or patents in top-tier AI venues.
- Contributions to open-source tools like Hugging Face Transformers, TensorFlow, or Diffusers.
- Knowledge of AI ethics frameworks and tools such as Fairness Indicators and Explainable AI (XAI).
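For readers unfamiliar with the transformer architectures this role centers on, the core operation is scaled dot-product attention. The sketch below is pure Python for clarity; a production system would express this with TensorFlow or JAX tensors, and the vectors used are toy examples.

```python
import math

# Illustrative sketch only: scaled dot-product attention, the building block
# of transformer architectures. Toy vectors; real models use tensor libraries.

def softmax(xs):
    m = max(xs)  # subtract the max for numerical stability
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def attention(queries, keys, values):
    """Attend each query over all key/value pairs.

    queries, keys: lists of float vectors of dimension d_k.
    values: list of float vectors, one per key.
    Returns one output vector per query.
    """
    d_k = len(keys[0])
    outputs = []
    for q in queries:
        # similarity of this query to every key, scaled by sqrt(d_k)
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d_k) for k in keys]
        weights = softmax(scores)
        # weighted sum of value vectors
        out = [sum(w * v[i] for w, v in zip(weights, values)) for i in range(len(values[0]))]
        outputs.append(out)
    return outputs
```

The sqrt(d_k) scaling keeps dot products from saturating the softmax as dimensionality grows, which is why it appears in the standard formulation.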
Job Description

Company Description: Evonence is a Google Cloud partner company founded in 2014 and located in Pune. As one of the fastest-growing partners in the Google Cloud ecosystem, we provide Google Cloud solutions specifically tailored to the mid-market across North America. With a customer base of over 1000 clients, we have deep technical expertise in Google Workspace, Google Cloud data engineering, Google Cloud generative AI / Vertex AI solutions, Google Cloud infrastructure migrations, and Google Cloud app modernization.

Role Description: This is a full-time hybrid role as a GCP Data Engineering Lead. As the Data Engineering Lead, you will be responsible for leading and managing a team of engineers and overseeing all data engineering projects. Your day-to-day tasks will include team management, team leadership, engineering management, project management, and software development.

Qualifications:
- Team management and team leadership skills
- Engineering management and project management skills
- Experience with Google Cloud Platform (GCP) and GCP data engineering tools
- Strong analytical and problem-solving skills
- Excellent communication and collaboration abilities
- Bachelor's degree in Computer Science, Engineering, or a related field
- 5 years of experience automating infrastructure provisioning and DevOps
- Application development experience with GCP
- Platform services experience: BigQuery, Dialogflow, Looker, Data Studio, Pub/Sub, Dataproc
Project Manager (Experience Level: 4 to 5 years)

Key Responsibilities:
- Lead end-to-end cloud migration projects, including assessments, planning, resource migration, execution, and post-migration support.
- Leverage in-depth knowledge of cloud services and tools, particularly GCP, including Compute Engine, Cloud SQL, Cloud Run services/jobs/functions, GKE (Kubernetes), BigQuery, Cloud Storage, VPCs, IAM, and others.
- Define project scope, deliverables, timeline, and resources in collaboration with stakeholders.
- Manage project risks, issues, and changes, ensuring adherence to timelines and budgets.
- Ensure effective communication across all levels of the organization, from executive stakeholders to technical teams.
- Oversee the migration of workloads, applications, and databases from on-prem or other cloud providers to Google Cloud Platform.
- Coordinate the migration of server-based and serverless applications, including Docker-based containers, virtual machines, and microservices.
- Manage the replication, synchronization, and migration of databases (e.g., RDS, SQL, NoSQL, and cloud-native database services).
- Ensure the application of cloud best practices for scalability, high availability, and disaster recovery.
- Manage and lead a cross-functional technical team, including DevOps engineers, infrastructure specialists, and data engineers.
- Provide guidance and support to technical teams during cloud migration tasks, offering solutions to challenges and ensuring alignment with project goals.
- Conduct regular project meetings to monitor progress, manage workloads, and resolve roadblocks.
- Monitor and optimize the cloud infrastructure to ensure cost-efficiency and performance.
- Regularly update stakeholders on project status, milestones, risks, and resource allocation.
- Create detailed project documentation, including timelines, roadmaps, and migration strategies.
- Work closely with business and technical stakeholders to understand project goals and ensure alignment with broader organizational objectives.

Required Qualifications:
- Ability to work independently and collaboratively in a fast-paced, dynamic environment.
- Proven experience in managing large-scale cloud migration initiatives, particularly involving both server and serverless architectures.
- Experience with Docker, containerization, and microservices architecture.
- Working, hands-on knowledge of cloud platform (AWS, GCP) services and products, including compute, storage, networking, databases, and serverless technologies.
- Hands-on experience managing email, calendars, meeting invites, chat spaces, and shared drives, preferably in the Google Workspace ecosystem.
- Familiarity with at least 20-30 widely used AWS and GCP services, such as: GCP: Compute Engine, Cloud Functions, BigQuery, Cloud Storage, Cloud SQL, Pub/Sub, Cloud Run, etc.; AWS: EC2, S3, RDS, Lambda, VPC, CloudFormation, ECS, EKS, IAM, CloudWatch, CloudTrail, etc.
- Experience with database replication, data migration, and backup strategies in the cloud.
- Basic understanding of DevOps tools and practices (CI/CD, Jenkins, Terraform, Kubernetes, etc.).
- Strong proficiency in project management methodologies (Agile, Waterfall, or hybrid).
- Strong experience in managing and maintaining project charters, project plans, and status reports.
- Proficient in creating and managing milestones, epics, stories, and tasks in project management tools such as JIRA or ClickUp.
- Experience writing user stories and acceptance criteria and tracking start/end dates, estimates, and actual time spent in tools like ClickUp or JIRA.
- Experience in time estimation, sprint planning, and milestone tracking.
- Experience with project management tools (e.g., JIRA, ClickUp, Zoho).
- Excellent organizational and multitasking abilities, with the capacity to manage competing priorities.
- Excellent communication skills, both written and verbal.
- Strong interpersonal and leadership abilities to manage a diverse technical team and engage with stakeholders.
- Problem-solving mindset and ability to make data-driven decisions.
As an experienced Data Engineer with over 10 years of experience, your role will involve designing and implementing scalable, reliable, and efficient data pipelines and architectures across various Google products and services. You will be responsible for developing and maintaining data models, schemas, and ontologies to support different data sources and use cases. Your expertise will be crucial in evaluating and recommending new data technologies and tools to enhance infrastructure and capabilities.

Key Responsibilities:
- Collaborate with product managers, engineers, and researchers to define data requirements and provide robust technical solutions.
- Build and optimize batch and real-time data pipelines using Google Cloud Platform (GCP) services such as Dataflow, Dataproc, Pub/Sub, and Cloud Functions.
- Implement data quality checks and validation processes to ensure data consistency and accuracy.
- Design and enforce data governance policies to uphold data security and regulatory compliance.
- Manage scalable storage solutions with GCP services including BigQuery, Cloud Storage, and Spanner.
- Optimize data retrieval and storage strategies for enhanced performance and cost-efficiency.
- Implement data lifecycle management practices and archival strategies.

Qualifications Required:
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
- Minimum of 8 years of experience in data engineering or a related discipline.
- Strong knowledge of data warehousing, data modeling, and ETL development.
- Proven expertise in designing and implementing large-scale data architectures and pipelines.
- Proficiency in SQL and at least one programming language such as Python or Java.
- Hands-on experience with GCP services like BigQuery, Dataflow, Dataproc, Pub/Sub, and Cloud Storage.
- Familiarity with open-source data tools like Hadoop, Spark, and Kafka.
- Excellent communication and collaboration skills.
Experience with data governance, data quality frameworks, and compliance; exposure to machine learning and data science workflows; knowledge of containerization and orchestration technologies like Docker and Kubernetes; contributions to open-source projects or technical communities; and the Google Cloud Professional Data Engineer certification would all be considered a plus.
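The "data lifecycle management practices and archival strategies" responsibility above can be sketched as an age-based tiering policy. The tier names and cutoffs below are hypothetical; on GCP this would typically be configured declaratively as a Cloud Storage lifecycle rule rather than written as application code.

```python
from datetime import date, timedelta

# Illustrative sketch only: an age-based storage-tier policy of the kind a
# data lifecycle rule expresses. Tier names and cutoffs are hypothetical.

POLICY = [
    (timedelta(days=30), "standard"),      # hot data, frequent access
    (timedelta(days=365), "nearline"),     # cooler data, occasional access
    (timedelta(days=365 * 7), "archive"),  # long-term compliance retention
]

def storage_class_for(created, today):
    """Pick the storage tier for an object by age; None means delete."""
    age = today - created
    for cutoff, tier in POLICY:
        if age < cutoff:
            return tier
    return None  # past the retention window: eligible for deletion
```

Encoding the policy as ordered (cutoff, tier) pairs keeps the rules auditable and easy to change without touching the decision logic.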
As an experienced Data Engineer with 8+ years of experience, your role will involve designing and implementing scalable, reliable, and efficient data pipelines and architectures across Google products and services. You will be responsible for developing and maintaining data models, schemas, and ontologies to support various data sources and use cases. Additionally, you will evaluate and recommend new data technologies and tools to enhance infrastructure and capabilities.

Key Responsibilities:
- Collaborate with product managers, engineers, and researchers to define data requirements and deliver robust technical solutions.
- Build and optimize batch and real-time data pipelines using Google Cloud Platform (GCP) services like Dataflow, Dataproc, Pub/Sub, and Cloud Functions.
- Implement data quality checks and validation processes to ensure consistency and accuracy.
- Design and enforce data governance policies to maintain data security and regulatory compliance.
- Design and manage scalable storage solutions with GCP services including BigQuery, Cloud Storage, and Spanner.
- Optimize data retrieval and storage strategies for performance and cost-efficiency.
- Implement data lifecycle management practices and archival strategies.

Qualifications Required:
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
- Strong knowledge of data warehousing, data modeling, and ETL development.
- Proven expertise in designing and implementing large-scale data architectures and pipelines.
- Proficiency in SQL and at least one programming language such as Python or Java.
- Hands-on experience with GCP services like BigQuery, Dataflow, Dataproc, Pub/Sub, and Cloud Storage.
- Familiarity with open-source data tools like Hadoop, Spark, and Kafka.
- Excellent communication and collaboration skills.
In addition to the above responsibilities and qualifications, it is preferred that you have experience with data governance, data quality frameworks, and compliance. Exposure to machine learning and data science workflows, as well as familiarity with containerization and orchestration technologies such as Docker and Kubernetes, would be beneficial. Any contributions to open-source projects or technical communities, along with holding a Google Cloud Professional Data Engineer certification, would be regarded as a plus.
Evonence is seeking a Generative AI Practice Head. In this position, you will be responsible for overseeing the entire AI solution development process, ranging from architecture design and model training to the ethical deployment of the solutions. Your role will involve a combination of research, engineering, and leadership tasks, making it an excellent fit for individuals with a strong academic background and hands-on experience in implementing AI systems at scale.

**Key Responsibilities:**
- Lead and innovate AI projects focused on generative models.
- Stay abreast of the latest advancements in AI by evaluating and incorporating cutting-edge trends from academic research and conferences such as NeurIPS and CVPR.
- Serve as the primary expert on generative AI strategies within the organization.
- Design and implement production-ready AI systems utilizing frameworks like transformers, GANs, or diffusion models.
- Emphasize performance, scalability, and security considerations, with an eye towards deployment on distributed platforms.
- Develop and optimize large-scale AI models to meet business objectives.
- Implement Responsible AI practices to address bias and enhance transparency.
- Act as a liaison between the research and product teams.
- Mentor junior team members, foster collaboration across different departments, and help shape the internal AI strategy.

**Qualifications Required:**
- An advanced degree (PhD or Master's) in AI/ML or a related discipline.
- Proficiency in working with transformer architectures, GANs, and diffusion models.
- Hands-on experience with tools such as Python, TensorFlow, JAX, and distributed computing.
- Strong understanding of natural language processing, computer vision, or multi-modal AI applications.
Location: Pune
Experience: 6 to 8 years

About the Role: We are seeking a highly skilled and experienced Data Engineering Architect to join our growing team. As a Data Engineering Architect, you will play a critical role in designing, building, and scaling Google's massive data infrastructure and platforms. You will be a technical leader and mentor, driving innovation and ensuring the highest standards of data quality, reliability, and performance.

Responsibilities:

· Design and Architecture:
· Design and implement scalable, reliable, and efficient data pipelines and architectures for various Google products and services.
· Develop and maintain data models, schemas, and ontologies to support diverse data sources and use cases.
· Evaluate and recommend new and emerging data technologies and tools to improve Google's data infrastructure.
· Collaborate with product managers, engineers, and researchers to define data requirements and translate them into technical solutions.
· Data Processing and Pipelines:
· Build and optimize batch and real-time data pipelines using Google Cloud Platform (GCP) services such as Dataflow, Dataproc, Pub/Sub, and Cloud Functions.
· Develop and implement data quality checks and validation processes to ensure data accuracy and consistency.
· Design and implement data governance policies and procedures to ensure data security and compliance.
· Data Storage and Management:
· Design and implement scalable data storage solutions using GCP services such as BigQuery, Cloud Storage, and Spanner.
· Optimize data storage and retrieval for performance and cost-effectiveness.
· Implement data lifecycle management policies and procedures.
· Team Leadership and Mentorship:
· Provide technical leadership and guidance to data engineers and other team members.
· Mentor and coach junior engineers to develop their skills and expertise.
· Foster a culture of innovation and collaboration within the team.

Qualifications:
· 8+ years of experience in data engineering or a related field.
· Strong understanding of data warehousing, data modeling, and ETL processes.
· Expertise in designing and implementing large-scale data pipelines and architectures.
· Proficiency in SQL and at least one programming language such as Python or Java.
· Experience with Google Cloud Platform (GCP) services such as BigQuery, Dataflow, Dataproc, Pub/Sub, and Cloud Storage.
· Experience with open-source data processing frameworks such as Hadoop, Spark, and Kafka.
· Excellent communication, interpersonal, and collaboration skills.

Preferred Qualifications:
· Experience with data governance and data quality management.
· Experience with machine learning and data science.
· Experience with containerization and orchestration technologies such as Docker and Kubernetes.
· Contributions to open-source projects or communities.
· Google Cloud Professional Data Engineer certification.
Experience Level: 4 to 6 years
Location: Pune

Summary: We are looking for a highly motivated and experienced Technical Project Manager to join our team. The ideal candidate will have a proven track record of success in managing complex, cross-functional projects. In this role, you will be responsible for driving all aspects of project management, including planning, execution, and delivery. You will also be responsible for communicating project plans and risks to stakeholders, and partnering with leadership to identify and solve cross-functional challenges.

Responsibilities:
- Agile Project Management: Utilize Agile methodologies, including Kanban, sprints, and Scrum, to manage project teams effectively. Foster an Agile culture within the team and organization.
- Project Planning and Execution: Drive all aspects of project management, from initial planning and roadmap development to execution and delivery. Identify and manage project dependencies, milestones, and risks.
- Cross-Functional Collaboration: Influence and collaborate with cross-functional teams to build commitment, ensure alignment, and manage competing priorities. Allocate resources effectively to support project goals.
- Strategic Initiatives: Lead and drive strategic, cross-functional initiatives that contribute to the company's growth and success.
- Communication and Stakeholder Management: Clearly communicate project plans, progress, and risks to stakeholders at all levels. Partner with leadership to identify and resolve cross-functional challenges.
- Drive all aspects of project management, including planning, incorporating dependencies, roadmaps and key milestones, data collection and analysis, communication, risk management, status tracking, execution, and delivery.
- Influence and collaborate across teams to effectively build commitment, ensure alignment and progress, manage competing priorities, allocate resources, and keep improving processes and efficiencies.
- Drive strategic, cross-functional initiatives.
- Communicate project plans and risks to stakeholders.
- Partner with leadership to identify and solve cross-functional challenges.
As a Generative AI Engineer at Google, you will play a crucial role in designing, developing, and implementing cutting-edge AI solutions. Your deep understanding of AI/ML principles and strong engineering skills will drive the adoption and advancement of generative AI across various Google products and research initiatives.

Responsibilities:
- Provide technical leadership in designing, developing, and deploying generative AI solutions.
- Stay updated on the latest research and advancements in generative AI.
- Identify and evaluate new generative AI technologies and tools.
- Design scalable, reliable, and efficient architectures for generative AI systems.
- Select and integrate appropriate generative AI models and frameworks such as transformers, GANs, and diffusion models.
- Ensure performance, security, and scalability of generative AI solutions.
- Lead the development and training of state-of-the-art generative AI models.
- Optimize model performance and fine-tune models for specific tasks and domains.
- Implement responsible AI practices including fairness, bias detection, and explainability.
- Collaborate with researchers, engineers, and product managers.
- Mentor and guide other engineers on generative AI best practices.
- Contribute to Google's AI community and thought leadership.

Qualifications:
- PhD or Master's degree in Computer Science, Artificial Intelligence, or a related field.
- Strong understanding of deep learning, natural language processing, and/or computer vision.
- Expertise in generative AI models and frameworks like transformers, GANs, and diffusion models.
- Proficiency in Python and relevant AI/ML libraries such as TensorFlow and JAX.
- Experience with large-scale data processing and distributed systems.
- Excellent communication, interpersonal, and collaboration skills.

Preferred Qualifications:
- Experience with Google Cloud Platform (GCP) and its AI/ML services.
- Publications or patents in the field of generative AI.
- Contributions to open-source AI projects.
- Experience with AI ethics and responsible AI development.
As a Generative AI & AI/ML Engineer at Evonence, you will be responsible for driving end-to-end AI solution development, from architecture and model training to responsible deployment and mentorship. This role is a blend of research, engineering, and leadership, suitable for individuals with a solid academic background and hands-on experience in production-level AI systems.

Key Responsibilities:
- Lead AI initiatives and foster innovation in generative models.
- Evaluate and incorporate cutting-edge AI trends from academic papers, NeurIPS, CVPR, etc.
- Serve as a go-to authority for generative AI direction.
- Design production-grade AI systems utilizing frameworks like transformers, GANs, or diffusion models.
- Focus on enhancing performance, scalability, and security, proposing deployment on distributed platforms.
- Develop and fine-tune large-scale models.
- Implement Responsible AI practices to mitigate bias and enhance transparency.
- Act as a bridge between research and product teams.
- Mentor junior engineers, promote cross-team collaboration, and influence internal AI strategy.

Qualifications Required:
- Advanced degree (PhD or Master's) in AI/ML or related fields.
- Proficiency in transformer architectures, GANs, and diffusion models.
- Experience with Python, TensorFlow, JAX, and distributed computing.
- Strong understanding of natural language processing, computer vision, or multi-modal AI.
Job Description

Evonence is a Google Cloud partner company that specializes in providing Google Cloud solutions to mid-market businesses in North America. Established in 2014, Evonence is recognized as one of the fastest-growing partners in the Google Cloud ecosystem, boasting a customer base of over 1000. The company possesses extensive technical proficiency in Google Workspace, Google Cloud infrastructure migrations, and Google Cloud app modernization.

As a Google Cloud AI/ML Engineer at Evonence, you will engage in pattern recognition, computer science, neural networks, statistics, and algorithms on a regular basis. Your primary responsibility will involve the development and implementation of AI/ML solutions on the Google Cloud Platform. This full-time hybrid role is based in Pune, with the flexibility for remote work.

Key Responsibilities:
- Utilize pattern recognition, computer science, neural networks, statistics, and algorithms in daily tasks
- Develop and implement AI/ML solutions on the Google Cloud Platform

Qualifications:
- 5-7 years of relevant experience
- Strong background in pattern recognition, computer science, and neural networks
- Proficiency in statistics and algorithms
- Experience in developing AI/ML solutions on the Google Cloud Platform
- Excellent problem-solving and analytical skills
- Strong programming skills in languages such as Python or Java
- Knowledge of data preprocessing and feature engineering techniques
- Familiarity with machine learning frameworks and libraries
- Bachelor's degree in Computer Science, Data Science, or a related field
- Google Cloud Platform certification is a plus
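The "data preprocessing and feature engineering" qualification above can be illustrated with one of its most basic steps, z-score standardization. This is a pure-Python sketch for clarity; real pipelines would typically use a library scaler or a SQL transform, and the sample values are hypothetical.

```python
import math

# Illustrative sketch only: z-score standardization, a basic feature
# engineering step. Sample values below are hypothetical.

def standardize(values):
    """Rescale a numeric feature to zero mean and unit variance."""
    n = len(values)
    mean = sum(values) / n
    variance = sum((v - mean) ** 2 for v in values) / n
    std = math.sqrt(variance)
    if std == 0:  # a constant feature carries no signal; leave it centered
        return [0.0] * n
    return [(v - mean) / std for v in values]
```

Standardizing features keeps gradient-based models from being dominated by whichever raw column happens to have the largest scale.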
Experience Level - 5 to 7 Years Location - Pune About the Role: We are seeking a highly skilled and experienced Data Engineering to join our growing team. As a Data Engineering, you will play a critical role in designing, building, and scaling Googles massive data infrastructure and platforms. You will be a technical leader and mentor, driving innovation and ensuring the highest standards of data quality, reliability, and performance. Responsibilities: Design and Architecture: Design and implement scalable, reliable, and efficient data pipelines and architectures for various Google products and services. Develop and maintain data models, schemas, and ontologies to support diverse data sources and use cases. Evaluate and recommend new and emerging data technologies and tools to improve Googles data infrastructure. Collaborate with product managers, engineers, and researchers to define data requirements and translate them into technical solutions. Data Processing and Pipelines: Build and optimize batch and real-time data pipelines using Google Cloud Platform (GCP) services such as Dataflow, Dataproc, Pub/Sub, and Cloud Functions. Develop and implement data quality checks and validation processes to ensure data accuracy and consistency. Design and implement data governance policies and procedures to ensure data security and compliance. Data Storage and Management: Design and implement scalable data storage solutions using GCP services such as BigQuery, Cloud Storage, and Spanner. Optimize data storage and retrieval for performance and cost-effectiveness. Implement data lifecycle management policies and procedures. Team Leadership and Mentorship: Provide technical leadership and guidance to data engineers and other team members. Mentor and coach junior engineers to develop their skills and expertise. Foster a culture of innovation and collaboration within the team. Qualifications: Bachelors or Masters degree in Computer Science, Engineering, or a related field. 
- 5+ years of experience in data engineering or a related field.
- Strong understanding of data warehousing, data modeling, and ETL processes.
- Expertise in designing and implementing large-scale data pipelines and architectures.
- Proficiency in SQL and at least one programming language such as Python or Java.
- Experience with Google Cloud Platform (GCP) services such as BigQuery, Dataflow, Dataproc, Pub/Sub, and Cloud Storage.
- Experience with open-source data processing frameworks such as Hadoop, Spark, and Kafka.
- Excellent communication, interpersonal, and collaboration skills.

Preferred Qualifications:
- Experience with data governance and data quality management.
- Experience with machine learning and data science.
- Experience with containerization and orchestration technologies such as Docker and Kubernetes.
- Contributions to open-source projects or communities.
- Google Cloud Professional Data Engineer certification.
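The responsibilities above call for data quality checks and validation before records reach the warehouse. As a flavor of what that can mean in practice, here is a minimal, dependency-free sketch; the function names, field names, and rules are illustrative assumptions, not part of any specific Google or GCP pipeline API:

```python
# Minimal sketch of a data quality / validation step such as a pipeline
# might run before loading records into a warehouse. All names and rules
# here are illustrative assumptions, not a real API.

def validate_record(record, required_fields=("id", "event_time", "amount")):
    """Return a list of human-readable issues; an empty list means the record passes."""
    issues = []
    for field in required_fields:
        if record.get(field) is None:
            issues.append(f"missing field: {field}")
    amount = record.get("amount")
    if isinstance(amount, (int, float)) and amount < 0:
        issues.append("amount must be non-negative")
    return issues

def split_valid_invalid(records):
    """Partition records into (valid, invalid); invalid ones carry their issues."""
    valid, invalid = [], []
    for r in records:
        issues = validate_record(r)
        (valid if not issues else invalid).append(r if not issues else (r, issues))
    return valid, invalid

records = [
    {"id": 1, "event_time": "2024-01-01T00:00:00Z", "amount": 10.5},
    {"id": 2, "event_time": None, "amount": -3},
]
valid, invalid = split_valid_invalid(records)
```

In a real Dataflow or Spark job the same idea usually appears as a side output: valid records continue downstream while invalid ones are routed to a dead-letter destination with their issues attached.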
We are looking for a highly motivated and experienced Technical Project Manager to join our team. The ideal candidate will have a proven track record of success in managing complex, cross-functional projects. In this role, you will drive all aspects of project management, including planning, execution, and delivery; communicate project plans and risks to stakeholders; and partner with leadership to identify and solve cross-functional challenges.

Responsibilities:

Agile Project Management:
- Utilize Agile methodologies, including Kanban, sprints, and Scrum, to manage project teams effectively.
- Foster an Agile culture within the team and organization.

Project Planning and Execution:
- Drive all aspects of project management, from initial planning and roadmap development to execution and delivery, incorporating dependencies, key milestones, data collection and analysis, risk management, and status tracking.
- Identify and manage project dependencies, milestones, and risks.

Cross-Functional Collaboration:
- Influence and collaborate with cross-functional teams to build commitment, ensure alignment and progress, and manage competing priorities.
- Allocate resources effectively to support project goals and continuously improve processes and efficiencies.

Strategic Initiatives:
- Lead and drive strategic, cross-functional initiatives that contribute to the company's growth and success.

Communication and Stakeholder Management:
- Clearly communicate project plans, progress, and risks to stakeholders at all levels.
- Partner with leadership to identify and resolve cross-functional challenges.
Experience Level - 0 to 2 Years
Location - Pune

Role Summary:
The Entry-Level AI/ML Python Developer will assist the Senior Data Science and Engineering teams in building, training, and deploying robust machine learning models and data-driven solutions. This role is ideal for a recent graduate with strong foundational knowledge of Python programming and core ML algorithms, and a passion for data analysis and innovation.

Key Responsibilities:
- Data Pipeline Assistance: Help with data collection, cleaning, preprocessing, and feature engineering using libraries like Pandas and NumPy.
- Model Implementation: Write, test, and maintain clean and efficient Python code for implementing and experimenting with various machine learning models (e.g., regression, classification, clustering).
- Code Development: Contribute to the development of production-ready code, focusing on code quality, modularity, and adherence to company standards.
- Experimentation: Run and document model training experiments, analyze results, and generate performance metrics.
- Tooling: Utilize version control systems, primarily Git, to manage code repositories and collaborate effectively.
- Learning & Research: Proactively learn new algorithms, tools, and best practices in the AI/ML space to support team objectives.

Required Qualifications & Skills:
- Bachelor's or Master's degree in Computer Science, Data Science, Statistics, or a closely related quantitative field.

Technical Skills (Must-Have):
- Programming: Strong proficiency in Python and familiarity with Object-Oriented Programming (OOP) concepts.
- ML Fundamentals: Solid understanding of core Machine Learning concepts and algorithms (e.g., Linear/Logistic Regression, Decision Trees, K-Means).
- Libraries: Hands-on experience with fundamental Python libraries: NumPy, Pandas, Matplotlib, scikit-learn.
- Version Control: Experience using Git and GitHub/GitLab/Bitbucket.
- Mathematics: Foundational knowledge of Statistics, Linear Algebra, and Calculus relevant to machine learning.

Preferred Skills (Nice-to-Have):
- Familiarity with deep learning frameworks like TensorFlow or PyTorch.
- Basic understanding of cloud platforms (AWS, Azure, or GCP) and related services.
- Experience with query languages like SQL.
- Completed personal projects or successful participation in hackathons/Kaggle competitions demonstrating practical ML application.
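To give a flavor of the ML and mathematics fundamentals listed above, here is a dependency-free sketch of simple linear regression fit by ordinary least squares. In practice a candidate would reach for scikit-learn's `LinearRegression`; this hand-rolled version exists only to illustrate the underlying statistics (slope = cov(x, y) / var(x), intercept = mean(y) − slope · mean(x)):

```python
# Ordinary least squares for simple (one-feature) linear regression,
# written without external libraries purely to illustrate the concept.

def fit_simple_linear_regression(xs, ys):
    """Return (slope, intercept) minimizing squared error for y ~ slope*x + intercept."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov_xy = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var_x = sum((x - mean_x) ** 2 for x in xs)
    slope = cov_xy / var_x
    intercept = mean_y - slope * mean_x
    return slope, intercept

# Points lying exactly on y = 2x + 1 should recover slope 2 and intercept 1.
slope, intercept = fit_simple_linear_regression([0, 1, 2, 3], [1, 3, 5, 7])
```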
As an experienced Data Engineer with over 10 years of experience, you will be responsible for designing and implementing scalable, reliable, and efficient data pipelines and architectures across Google products and services. Your key responsibilities will include:
- Developing and maintaining data models, schemas, and ontologies to support a variety of data sources and use cases.
- Evaluating and recommending new data technologies and tools to improve infrastructure and capabilities.
- Collaborating with product managers, engineers, and researchers to define data requirements and deliver robust technical solutions.
- Building and optimizing batch and real-time data pipelines using Google Cloud Platform (GCP) services such as Dataflow, Dataproc, Pub/Sub, and Cloud Functions.
- Implementing data quality checks and validation processes to ensure consistency and accuracy.
- Designing and enforcing data governance policies to maintain data security and regulatory compliance.
- Designing and managing scalable storage solutions with GCP services including BigQuery, Cloud Storage, and Spanner.
- Optimizing data retrieval and storage strategies for performance and cost-efficiency.
- Implementing data lifecycle management practices and archival strategies.

Qualifications required for this role include:
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
- 8+ years of experience in data engineering or a related discipline.
- Strong knowledge of data warehousing, data modeling, and ETL development.
- Proven expertise in designing and implementing large-scale data architectures and pipelines.
- Proficiency in SQL and at least one programming language such as Python or Java.
- Hands-on experience with GCP services like BigQuery, Dataflow, Dataproc, Pub/Sub, and Cloud Storage.
- Familiarity with open-source data tools like Hadoop, Spark, and Kafka.
- Excellent communication and collaboration skills.
In addition to the above, the following qualifications are preferred:
- Experience with data governance, data quality frameworks, and compliance.
- Exposure to machine learning and data science workflows.
- Experience with containerization and orchestration technologies such as Docker and Kubernetes.
- Contributions to open-source projects or technical communities.
- Google Cloud Professional Data Engineer certification.
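The responsibilities above mention data lifecycle management and archival strategies. At its core, such a policy routes records between storage tiers by age; the sketch below illustrates the idea in plain Python. The field names and the 90-day retention window are assumptions made for the example, not an actual GCP lifecycle API (in BigQuery or Cloud Storage this is typically configured declaratively, e.g. via table partition expiration or bucket lifecycle rules):

```python
# Illustrative-only sketch of a data lifecycle policy: records older than
# a retention window are routed to an archive tier. Names and the 90-day
# threshold are assumptions for the example, not a real GCP policy API.
from datetime import date, timedelta

def route_by_age(records, today, retention_days=90):
    """Split records into (hot, archive) based on their 'created' date."""
    cutoff = today - timedelta(days=retention_days)
    hot = [r for r in records if r["created"] >= cutoff]
    archive = [r for r in records if r["created"] < cutoff]
    return hot, archive

today = date(2024, 6, 1)
records = [
    {"id": "a", "created": date(2024, 5, 20)},   # recent -> hot tier
    {"id": "b", "created": date(2023, 12, 1)},   # past retention -> archive
]
hot, archive = route_by_age(records, today)
```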
You will be responsible for designing and implementing scalable, reliable, and efficient data pipelines and architectures across Google products and services, collaborating closely with product managers, engineers, and researchers. Your day-to-day responsibilities will include:
- Design and implement scalable, reliable, and efficient data pipelines and architectures.
- Develop and maintain data models, schemas, and ontologies to support various data sources and use cases.
- Evaluate and recommend new data technologies and tools to enhance infrastructure and capabilities.
- Collaborate with cross-functional teams to define data requirements and deliver technical solutions.
- Build and optimize batch and real-time data pipelines using GCP services like Dataflow, Dataproc, Pub/Sub, and Cloud Functions.
- Implement data quality checks and validation processes to ensure consistency and accuracy.
- Design and enforce data governance policies for data security and regulatory compliance.
- Manage scalable storage solutions with GCP services such as BigQuery, Cloud Storage, and Spanner.
- Optimize data retrieval and storage strategies for performance and cost-efficiency.
- Implement data lifecycle management practices and archival strategies.
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
- 8+ years of experience in data engineering or a related discipline.
- Strong knowledge of data warehousing, data modeling, and ETL development.
- Proven expertise in designing and implementing large-scale data architectures and pipelines.
- Proficiency in SQL and at least one programming language such as Python or Java.
- Hands-on experience with GCP services like BigQuery, Dataflow, Dataproc, Pub/Sub, and Cloud Storage.
- Familiarity with open-source data tools like Hadoop, Spark, and Kafka.
- Excellent communication and collaboration skills.