3 - 8 years
5 - 10 Lacs
Bengaluru
Work from Office
Project Role: AI/ML Engineer
Project Role Description: Develops applications and systems that utilize AI to improve performance and efficiency, including but not limited to deep learning, neural networks, chatbots, and natural language processing.
Must have skills: Google Cloud Machine Learning Services
Good to have skills: GCP Dataflow, Google Dataproc, Google Pub/Sub
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years full time education
Key Responsibilities: (a) Implement and maintain data engineering solutions using BigQuery, Dataflow, Vertex AI, Dataproc, and Pub/Sub. (b) Collaborate with data scientists to deploy machine learning models. (c) Ensure the scalability and efficiency of data processing pipelines.
Technical Experience: (a) Expertise in BigQuery, Dataflow, Vertex AI, Dataproc, and Pub/Sub. (b) Hands-on experience with data engineering in a cloud environment.
Professional Attributes: (a) Strong problem-solving skills in optimizing data workflows. (b) Effective collaboration with data science and engineering teams.
Qualifications: 15 years full time education
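For a concrete sense of the day-to-day plumbing this role describes, here is a minimal, purely illustrative Python sketch that publishes an event to Pub/Sub and runs an aggregation in BigQuery using the standard Google Cloud client libraries; the project ID, topic, and table names are placeholders, not values from this posting.

```python
from google.cloud import bigquery, pubsub_v1

PROJECT = "my-project"      # placeholder project ID
TOPIC = "ingest-events"     # placeholder Pub/Sub topic

# Publish a small JSON payload to Pub/Sub (message data must be bytes).
publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path(PROJECT, TOPIC)
future = publisher.publish(topic_path, b'{"user_id": 42, "action": "login"}')
print("Published message ID:", future.result())

# Run an aggregation in BigQuery and iterate over the result rows.
bq = bigquery.Client(project=PROJECT)
query = """
    SELECT action, COUNT(*) AS events
    FROM `my-project.analytics.user_events`   -- placeholder table
    GROUP BY action
    ORDER BY events DESC
"""
for row in bq.query(query).result():
    print(row.action, row.events)
```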
Posted 1 month ago
16 - 25 years
18 - 27 Lacs
Bengaluru
Work from Office
Skill required: Tech for Operations - Artificial Intelligence (AI) Designation: AI/ML Computational Science Sr Manager Qualifications: Any Graduation Years of Experience: 16 to 25 years What would you do? You will be part of the Technology for Operations team that acts as a trusted advisor and partner to Accenture Operations. The Tech For Operations (TFO) team provides innovative and secure technologies to help clients build an intelligent operating model, driving exceptional results, and works closely with the sales, offering and delivery teams to identify and build innovative solutions. Major sub-deals include AHO (Application Hosting Operations), ISMT (Infrastructure Management) and Intelligent Automation. In Artificial Intelligence, you will enhance business results by using AI tools and techniques to perform tasks that require human intelligence, such as visual perception, speech recognition, decision-making, and translation between languages. What are we looking for? Machine Learning Process-orientation Thought leadership Commitment to quality Roles and Responsibilities: In this role you are required to identify and assess complex problems for your area(s) of responsibility. You should create solutions in situations in which analysis requires in-depth knowledge of organizational objectives. The role requires involvement in setting strategic direction to establish near-term goals for area(s) of responsibility. Interaction is with senior management levels at a client and/or within Accenture, involving negotiating or influencing on significant matters. You should have latitude in decision-making and determination of objectives and approaches to critical assignments. Your decisions have a lasting impact on your area of responsibility, with the potential to impact areas outside of your own responsibility. The individual manages large teams and/or work efforts (if in an individual contributor role) at a client or within Accenture. Please note that this role may require you to work in rotational shifts. Qualifications: Any Graduation
Posted 1 month ago
2 - 4 years
5 - 8 Lacs
Pune
Work from Office
We are seeking talented and motivated AI Engineers to join our dynamic team and contribute to the development of next-generation AI/GenAI based products and solutions. This role will provide you with the opportunity to work on cutting-edge SaaS technologies and impactful projects that are used by enterprises and users worldwide. As a Senior Software Engineer, you will be involved in the design, development, testing, deployment, and maintenance of software solutions. You will work in a collaborative environment, contributing to the technical foundation behind our flagship products and services. Responsibilities: Software Development: Write clean, maintainable, and efficient code for various software applications and systems. GenAI Product Development: Participate in the entire AI development lifecycle, including data collection, preprocessing, model training, evaluation, and deployment. Assist in researching and experimenting with state-of-the-art generative AI techniques to improve model performance and capabilities. Design and Architecture: Participate in design reviews with peers and stakeholders. Code Review: Review code developed by other developers, providing feedback and adhering to industry-standard best practices like coding guidelines. Testing: Build testable software, define tests, participate in the testing process, automate tests using tools (e.g., JUnit, Selenium) and design patterns, leveraging the test automation pyramid as the guide. Debugging and Troubleshooting: Triage defects or customer-reported issues, debug and resolve them in a timely and efficient manner. Service Health and Quality: Contribute to the health and quality of services and incidents, promptly identifying and escalating issues. Collaborate with the team in utilizing service health indicators and telemetry for action. Assist in conducting root cause analysis and implementing measures to prevent future recurrences. DevOps Model: Understanding of working in a DevOps model. Begin to take ownership of working with product management on requirements to design, develop, test, deploy and maintain the software in production. Documentation: Properly document new features, enhancements or fixes to the product, and also contribute to training materials. Basic Qualifications: Bachelor's degree in Computer Science, Engineering, or a related technical field, or equivalent practical experience. 2+ years of professional software development experience. Proficiency as a developer using Python, FastAPI, PyTest, Celery and other Python frameworks. Experience with software development practices and design patterns. Familiarity with version control systems like Git/GitHub and bug/work tracking systems like JIRA. Basic understanding of cloud technologies and DevOps principles. Strong analytical and problem-solving skills, with a proven track record of building and shipping successful software products and services. Preferred Qualifications: Experience with object-oriented programming, concurrency, design patterns, and REST APIs. Experience with CI/CD tooling such as Terraform and GitHub Actions. High-level familiarity with AI/ML, GenAI, and MLOps concepts. Familiarity with frameworks like LangChain and LangGraph. Experience with SQL and NoSQL databases such as MongoDB, MSSQL, or Postgres. Experience with testing tools such as PyTest, PyMock, xUnit, mocking frameworks, etc. Experience with GCP technologies such as Vertex AI, BigQuery, GKE, GCS, Dataflow, and Kubeflow. Experience with Docker and Kubernetes. Experience with Java and Scala a plus.
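As an illustration of the Python/FastAPI/PyTest stack called out above, the following sketch defines a hypothetical /predict endpoint and an in-process test; the endpoint name, request model, and "model" logic are invented for the example and do not reflect the team's actual codebase.

```python
from fastapi import FastAPI
from fastapi.testclient import TestClient
from pydantic import BaseModel

app = FastAPI(title="demo-genai-service")

class PredictRequest(BaseModel):
    prompt: str

@app.post("/predict")
def predict(req: PredictRequest) -> dict:
    # Placeholder "model": real code would call a trained/GenAI model here.
    return {"prompt": req.prompt, "completion": req.prompt[::-1]}

# PyTest-style test using FastAPI's built-in test client.
client = TestClient(app)

def test_predict_round_trip():
    resp = client.post("/predict", json={"prompt": "hello"})
    assert resp.status_code == 200
    assert resp.json()["completion"] == "olleh"
```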
Posted 1 month ago
12 - 17 years
14 - 19 Lacs
Bengaluru
Work from Office
Project Role: Application Developer Project Role Description: Design, build and configure applications to meet business process and application requirements. Must have skills: PySpark Good to have skills: Apache Spark, Python (Programming Language), Google BigQuery Minimum 12 year(s) of experience is required Educational Qualification: 15 years full time education Summary: As an Application Developer, you will be involved in designing, building, and configuring applications to meet business process and application requirements. Your typical day will revolve around creating innovative solutions to address various business needs and ensuring seamless application functionality. Roles & Responsibilities: Expected to be an SME. Collaborate with and manage the team to perform. Responsible for team decisions. Engage with multiple teams and contribute to key decisions. Expected to provide solutions to problems that apply across multiple teams. Lead the team in implementing PySpark solutions effectively. Conduct code reviews and ensure adherence to best practices. Provide technical guidance and mentorship to junior team members. Professional & Technical Skills: Must Have Skills: Proficiency in PySpark, Python (Programming Language), Apache Spark, Google BigQuery. Strong understanding of distributed computing and parallel processing. Experience in optimizing PySpark jobs for performance. Knowledge of data processing and transformation techniques. Familiarity with cloud platforms for deploying PySpark applications. Additional Information: The candidate should have a minimum of 12 years of experience in PySpark. This position is based at our Gurugram office. A 15 years full-time education is required. Qualifications: 15 years full time education
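To make the phrase "optimizing PySpark jobs for performance" more concrete, the sketch below shows a few common techniques (tuning shuffle partitions, broadcasting a small dimension table, caching a reused result, repartitioning before writing); the bucket paths and column names are placeholders chosen for illustration.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = (
    SparkSession.builder
    .appName("pyspark-optimization-sketch")
    .config("spark.sql.shuffle.partitions", "200")  # tune shuffle parallelism
    .getOrCreate()
)

# Placeholder inputs: a large fact table and a small dimension table.
orders = spark.read.parquet("gs://my-bucket/orders/")          # large
customers = spark.read.parquet("gs://my-bucket/customers/")    # small

# Broadcast the small side to avoid a full shuffle join.
joined = orders.join(F.broadcast(customers), on="customer_id", how="left")

# Cache a reused intermediate result and repartition before writing.
daily = (
    joined.groupBy("order_date")
    .agg(F.sum("amount").alias("revenue"))
    .cache()
)
daily.repartition(10).write.mode("overwrite").parquet("gs://my-bucket/daily_revenue/")
```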
Posted 1 month ago
3 - 8 years
5 - 10 Lacs
Hyderabad
Work from Office
Project Role: Application Developer Project Role Description: Design, build and configure applications to meet business process and application requirements. Must have skills: Google BigQuery Good to have skills: NA Minimum 3 year(s) of experience is required Educational Qualification: B.Tech Summary: As an Application Developer, you will be responsible for designing, building, and configuring applications to meet business process and application requirements. You will work closely with the team to ensure the successful delivery of high-quality solutions. Roles & Responsibilities: - Expected to perform independently and become an SME. - Required active participation/contribution in team discussions. - Contribute to providing solutions to work-related problems. - Collaborate with cross-functional teams to gather and analyze requirements. - Design, develop, and test applications based on business needs. - Troubleshoot and debug applications to ensure optimal performance. - Implement security and data protection measures. - Document technical specifications and user guides. - Stay up-to-date with emerging technologies and industry trends. Professional & Technical Skills: - Must Have Skills: Proficiency in Google BigQuery. - Strong understanding of SQL and database concepts. - Experience with data modeling and schema design. - Knowledge of ETL processes and data integration techniques. - Familiarity with cloud platforms such as Google Cloud Platform. - Good To Have Skills: Experience with data visualization tools such as Tableau or Power BI. Additional Information: - The candidate should have a minimum of 3 years of experience in Google BigQuery. - This position is based at our Hyderabad office. - A B.Tech degree is required. Qualifications: B.Tech
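As a hedged example of the BigQuery, SQL and schema-design skills listed above, the snippet below creates a date-partitioned, clustered table and runs a parameterized query with the google-cloud-bigquery client; the project, dataset, and table names are placeholders.

```python
from google.cloud import bigquery

client = bigquery.Client()  # uses application-default credentials

# Create a date-partitioned, clustered table (names are placeholders).
ddl = """
CREATE TABLE IF NOT EXISTS `my-project.sales.orders`
(
  order_id STRING,
  customer_id STRING,
  order_date DATE,
  amount NUMERIC
)
PARTITION BY order_date
CLUSTER BY customer_id
"""
client.query(ddl).result()

# Parameterized query: partition pruning keeps scanned bytes (and cost) down.
sql = """
SELECT customer_id, SUM(amount) AS total
FROM `my-project.sales.orders`
WHERE order_date BETWEEN @start AND @end
GROUP BY customer_id
"""
job_config = bigquery.QueryJobConfig(
    query_parameters=[
        bigquery.ScalarQueryParameter("start", "DATE", "2024-01-01"),
        bigquery.ScalarQueryParameter("end", "DATE", "2024-01-31"),
    ]
)
for row in client.query(sql, job_config=job_config).result():
    print(row.customer_id, row.total)
```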
Posted 1 month ago
4 - 9 years
16 - 31 Lacs
Hyderabad, Pune, Bengaluru
Hybrid
Role & responsibilities: Execute project-specific development activities in accordance with applicable standards and quality parameters. Develop and review code, and set up the right environment for the projects. Ensure delivery within schedule by adhering to the engineering and quality standards. Own and deliver end-to-end projects within GCP for the Payments Data Platform. Once a month, be available on the support rota for a week for GCP 24x7 on-call. Basic knowledge of Payments ISO standards, message types, etc. Able to work under pressure on deliverables (P1 violations, incidents). Should be fluent and clear in communications, written and verbal. Should be able to follow Agile ways of working. Must have hands-on experience in Java and GCP; shell script and Python knowledge is a plus. Should have in-depth knowledge of Java Spring Boot. Should have experience in GCP Dataflow, Bigtable, BigQuery, etc. Should have experience managing large databases. Should have worked on requirements, design and development of event-driven and near-real-time data patterns (ingress/egress).
Posted 1 month ago
7 - 12 years
13 - 17 Lacs
Gurugram
Work from Office
Skilled in multiple GCP services - GCS, BigQuery, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflow, Composer, Error Reporting, Log Explorer, etc. Must have Python and SQL work experience. Proactive, collaborative, and able to respond to critical situations. Ability to analyse data for functional business requirements and to front-face the customer. Required education: Bachelor's Degree. Preferred education: Master's Degree. Required technical and professional expertise: 5 to 7 years of relevant experience working as a technical analyst with BigQuery on the GCP platform. Skilled in multiple GCP services - GCS, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflow, Composer, Error Reporting, Log Explorer. You love collaborative environments that use agile methodologies to encourage creative design thinking and find innovative ways to develop with cutting-edge technologies. Ambitious individual who can work under their own direction towards agreed targets/goals and with a creative approach to work. Preferred technical and professional experience: Intuitive individual with an ability to manage change and proven time management. Proven interpersonal skills while contributing to team effort by accomplishing related results as needed. Keeps technical knowledge up to date by attending educational workshops and reviewing publications.
Posted 1 month ago
1 - 3 years
3 - 6 Lacs
Bengaluru
Work from Office
Skill required: Record To Report - Invoice Processing Designation: Record to Report Ops Associate Qualifications: BCom/MCom Years of Experience: 1 to 3 years Language Ability: English (Domestic) - Expert What would you do? You will be aligned with our Finance Operations vertical and will be helping us in determining financial outcomes by collecting operational data/reports, whilst conducting analysis and reconciling transactions. This includes posting journal entries, preparing balance sheet reconciliations, reviewing entries and reconciliations, preparing cash forecasting statements, supporting month-end closing, preparing reports and supporting audits. Invoice processing refers to the systematic handling and management of incoming invoices within a business or organization. It involves tasks such as verifying the accuracy of the invoice, matching it with purchase orders and delivery receipts, and initiating the payment process. Automated systems and software are often employed to streamline and expedite the invoice processing workflow, improving efficiency and reducing the likelihood of errors. What are we looking for? Google Cloud SQL Adaptable and flexible Ability to perform under pressure Problem-solving skills Agility for quick learning Commitment to quality Roles and Responsibilities: In this role you are required to solve routine problems, largely through precedent and referral to general guidelines. Your expected interactions are within your own team and with your direct supervisor. You will be provided detailed to moderate levels of instruction on daily work tasks and detailed instruction on new assignments. The decisions that you make would impact your own work. You will be an individual contributor as a part of a team, with a predetermined, focused scope of work. Please note that this role may require you to work in rotational shifts. Qualification: BCom, MCom
Posted 1 month ago
7 - 10 years
16 - 21 Lacs
Mumbai
Work from Office
Position Overview: The Google Cloud Data Engineering Lead role is ideal for an experienced Google Cloud Data Engineer who will drive the design, development, and optimization of data solutions on the Google Cloud Platform (GCP). The role requires the candidate to lead a team of data engineers and collaborate with data scientists, analysts, and business stakeholders to enable scalable, secure, and high-performance data pipelines and analytics platforms. Key Responsibilities: Lead and manage a team of data engineers delivering end-to-end data pipelines and platforms on GCP. Design and implement robust, scalable, and secure data architectures using services like BigQuery, Dataflow, Dataproc, Pub/Sub, and Cloud Storage. Develop and maintain batch and real-time ETL/ELT workflows using tools such as Apache Beam, Dataflow, or Composer (Airflow). Collaborate with data scientists, analysts, and application teams to gather requirements and ensure data availability and quality. Define and enforce data engineering best practices including version control, testing, code reviews, and documentation. Drive automation and infrastructure-as-code approaches using Terraform or Deployment Manager for provisioning GCP resources. Implement and monitor data quality, lineage, and governance frameworks across the data platform. Optimize query performance and storage strategies, particularly within BigQuery and other GCP analytics tools. Mentor team members and contribute to the growth of technical capabilities across the organization. Qualifications: Education : Bachelor’s or Master’s degree in Computer Science, Data Engineering, or related field. Experience : 7+ years of experience in data engineering, including 3+ years working with GCP data services. Proven leadership experience in managing and mentoring data engineering teams. Skills : Expert-level understanding of BigQuery, Dataflow (Apache Beam), Cloud Storage, and Pub/Sub. Strong SQL and Python skills for data processing and orchestration. Experience with workflow orchestration tools (Airflow/Composer). Hands-on experience with CI/CD, Git, and infrastructure-as-code tools (e.g., Terraform). Familiarity with data security, governance, and compliance practices in cloud environments. Certifications : GCP Professional Data Engineer certification.
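One way to picture the "batch and real-time ETL/ELT workflows using Apache Beam, Dataflow, or Composer" responsibility above is the minimal pipeline sketch below; it runs locally on the DirectRunner by default and can be pointed at Dataflow via pipeline options. The bucket paths and CSV schema are assumptions made for the example.

```python
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

def parse_csv(line: str) -> dict:
    # Placeholder schema: user_id,country,amount
    user_id, country, amount = line.split(",")
    return {"user_id": user_id, "country": country, "amount": float(amount)}

# DirectRunner by default; pass --runner=DataflowRunner plus project/region/
# temp_location options to execute the same code on Dataflow.
options = PipelineOptions()

with beam.Pipeline(options=options) as p:
    (
        p
        | "Read" >> beam.io.ReadFromText("gs://my-bucket/input/events.csv")
        | "Parse" >> beam.Map(parse_csv)
        | "KeyByCountry" >> beam.Map(lambda r: (r["country"], r["amount"]))
        | "SumPerCountry" >> beam.CombinePerKey(sum)
        | "Format" >> beam.MapTuple(lambda country, total: f"{country},{total}")
        | "Write" >> beam.io.WriteToText("gs://my-bucket/output/revenue_by_country")
    )
```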
Posted 1 month ago
4 - 9 years
10 - 14 Lacs
Pune
Hybrid
Job Description / Technical Skills: Top skills for this position are: Google Cloud Platform (Composer, BigQuery, Airflow, DataProc, Dataflow, GCS), Data Warehousing knowledge, and hands-on experience in the Python language and SQL databases. Analytical technical skills to be able to predict the consequences of configuration changes (impact analysis), to identify root causes that are not obvious, and to understand the business requirements. Excellent communication with different stakeholders (business, technical, project). Good understanding of the overall Big Data and Data Science ecosystem. Experience with building and deploying containers as services using Swarm/Kubernetes. Good understanding of container concepts like building lean and secure images. Understanding of modern DevOps pipelines. Experience with stream data pipelines using Kafka or Pub/Sub (mandatory for Kafka resources). Good to have: Professional Data Engineer or Associate Data Engineer Certification. Roles and Responsibilities: Design, build and manage big data ingestion and processing applications on Google Cloud using BigQuery, Dataflow, Composer, Cloud Storage and Dataproc. Performance tuning and analysis of Spark, Apache Beam (Dataflow) or similar distributed computing tools and applications on Google Cloud. Good understanding of Google Cloud concepts, environments and utilities to design cloud-optimal solutions for Machine Learning applications. Build systems to perform real-time data processing using Kafka, Pub/Sub, Spark Streaming or similar technologies. Manage the development life-cycle for agile software development projects. Convert a proof of concept into an industrialization for Machine Learning models (MLOps). Provide solutions to complex problems. Deliver customer-oriented solutions in a timely, collaborative manner. Proactive thinking, planning and understanding of dependencies. Develop and implement robust solutions in test and production environments.
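For the real-time processing requirement above (Kafka, Pub/Sub, Spark Streaming), here is an illustrative Spark Structured Streaming sketch that consumes a Kafka topic and counts events per one-minute window; the broker address, topic name, and checkpoint path are placeholders, and the spark-sql-kafka connector is assumed to be on the classpath.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = (
    SparkSession.builder
    .appName("kafka-streaming-sketch")
    .getOrCreate()
)

# Read a Kafka topic as an unbounded DataFrame (broker/topic are placeholders).
events = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")
    .option("subscribe", "payments")
    .load()
)

# Kafka values arrive as bytes; cast to string and aggregate per 1-minute window.
counts = (
    events.select(F.col("value").cast("string").alias("payload"),
                  F.col("timestamp"))
    .withWatermark("timestamp", "5 minutes")
    .groupBy(F.window("timestamp", "1 minute"))
    .count()
)

query = (
    counts.writeStream
    .outputMode("update")
    .format("console")            # swap for BigQuery/Pub/Sub sinks in production
    .option("checkpointLocation", "gs://my-bucket/checkpoints/payments/")
    .start()
)
query.awaitTermination()
```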
Posted 1 month ago
4 - 7 years
6 - 16 Lacs
Bengaluru
Work from Office
Senior Software Engineer Google Cloud Platform (GCP) Location: Bangalore, India Why Join Fossil Group? At Fossil Group, we are part of an international team that dares to dream, disrupt, and deliver innovative watches, jewelry, and leather goods to the world. We're committed to long-term value creation, driven by technology and our core values: Authenticity, Grit, Curiosity, Humor, and Impact. If you are a forward-thinker who thrives in a diverse, global setting, we want to hear from you. Make an Impact (Job Summary + Responsibilities) We are looking for a Senior Software Engineer – GCP to join our growing Cloud & Data Engineering team at Fossil Group . This role involves building scalable cloud-native data pipelines using Google Cloud Platform services, with a focus on Dataflow, Dataproc, BigQuery , and strong development skills in Java, Python, and SQL . You will collaborate with global architects and business teams to design and deploy innovative solutions, supporting data analytics, automation, and transformation needs. What you will do in this role: Design, develop, deploy, and maintain data pipelines and services using GCP technologies including Dataflow, Dataproc, BigQuery, Composer , and others. Translate blueprinting documents and business requirements into scalable and maintainable GCP configurations and solutions. Develop and enhance cloud-based batch/streaming jobs using Java or Python. Collaborate with global architects and cross-functional teams to define solutions and execute projects across development and testing environments. Perform unit testing, integration testing, and resolve issues arising during the QA lifecycle. Work closely with internal stakeholders to gather requirements, present technical solutions, and provide end-to-end delivery. Own and manage project timelines, priorities, and documentation. Continuously improve processes and stay current with GCP advancements and big data technologies. Who You Are (Requirements) Bachelor's degree in Computer Science or related field. 4-7 years of experience as a DB/SQL Developer or Java/Python Developer with strong SQL capabilities. Hands-on experience with GCP components such as Dataflow, Dataproc, BigQuery, Cloud Functions, Composer, Data Fusion . Excellent command of SQL with the ability to write complex queries and perform advanced data transformation. Strong programming skills in Java and/or Python , specifically for building cloud-native data pipelines. Experience with relational and NoSQL databases. Ability to understand and translate business requirements into functional specifications. Familiarity with BI dashboards and Google Data Studio is a plus. Strong problem-solving, communication, and collaboration skills. Self-directed, with a growth mindset and eagerness to upskill in emerging GCP technologies. Comfortable leading meetings, gathering requirements, and managing stakeholder communication across regions. What We Offer Comprehensive Benefits: Includes health and well-being services. Paid Parental Leave & Return to Work Program: Support for new parents and caregivers with paid leave and a flexible phase-back schedule. Generous Paid Time Off: Includes Sick Time, Personal Days, and Summer Flex Fridays. Employee Discounts: Save on Fossil merchandise. EEO Statement At Fossil, we believe our differences not only make us stronger as a team, but also help us create better products and a richer community. 
We are an Equal Employment Opportunity Employer dedicated to a policy of non-discrimination in all employment practices without regard to age, disability, gender identity or expression, marital status, pregnancy, race, religion, sexual orientation, or any other protected characteristic.
Posted 1 month ago
4 - 9 years
22 - 30 Lacs
Pune
Hybrid
Primary Skills: SQL (Data Analysis and Development) Alternate Skills: Python, SharePoint, AWS, ETL, Telecom (especially the Fixed Network domain). Location: Pune Working Persona: Hybrid Experience: 4 to 10 years Core competencies, knowledge and experience: Essential: Strong SQL experience - advanced level of SQL. Excellent data interpretation skills. Good knowledge of ETL and business intelligence, and a good understanding of a range of data manipulation and analysis techniques. Working knowledge of large information technology development projects using methodologies and standards. Excellent verbal, written and interpersonal communication skills, demonstrating the ability to communicate information technology concepts to non-technology personnel; should be able to interact with the business team and share ideas. Strong analytical, problem solving and decision-making skills, with an attitude to plan and organize work to deliver as agreed. Ability to work under pressure to tight deadlines. Hands-on experience working with large datasets. Able to manage different stakeholders.
Posted 1 month ago
10 - 20 years
25 - 40 Lacs
Hyderabad, Pune, Bengaluru
Work from Office
Hi, hope you are looking for a job change. We have an opening for a GCP Data Architect with an MNC (Pan India locations). I'm sharing the JD with you; please have a look and revert with the details below and your updated resume. Apply only if you can join within 10 days. It is a 5-day work-from-office role. We don't process candidates on long notice periods. Role: GCP Data Architect Experience: 10+ Years Mode: Permanent Work Location: Pan India Notice Period: Immediate to 10 Days Mandatory Skills: GCP, Architecture Experience, Big Data, Data Modelling, BigQuery Full Name (as per Aadhar card): Email ID: Mobile Number: Alternate No: Qualification: Graduation Year: Regular Course: Total Experience: Relevant Experience: Current Organization: Working as Permanent Employee: Payroll Company: Experience in GCP: Experience in Architecture: Experience in GCP Data Architecture: Experience in Big Data: Experience in BigQuery: Experience in Data Management: Official Notice Period: Serving Notice Period: Current Location: Preferred Location: Current CTC: Expected CTC: CTC Breakup: Pan Card Number: Date of Birth: Any Offer in Hand: Offered CTC: LWD: Can you join immediately: Ready to work from office for 5 days: Job Description: GCP Data Architect We are seeking a skilled Data Solution Architect to design solutions and lead the implementation on GCP. The ideal candidate will possess extensive experience in data architecting, solution design and data management practices. Responsibilities: Architect and design end-to-end data solutions on the cloud platform, focusing on data warehousing and big data platforms. Collaborate with clients, developers, and architecture teams to understand requirements and translate them into effective data solutions. Develop high-level and detailed data architecture and design documentation. Implement data management and data governance strategies, ensuring compliance with industry standards. Architect both batch and real-time data solutions, leveraging cloud-native services and technologies. Design and manage data pipeline processes for historic data migration and data integration. Collaborate with business analysts to understand domain data requirements and incorporate them into the design deliverables. Drive innovation in data analytics by leveraging cutting-edge technologies and methodologies. Demonstrate excellent verbal and written communication skills to communicate complex ideas and concepts effectively. Stay updated on the latest advancements in data analytics, data architecture, and data management techniques. Requirements: Minimum of 5 years of experience in a Data Architect role, supporting warehouse and cloud data platforms/environments. Experience with common GCP services such as BigQuery, Dataflow, GCS, Service Accounts and Cloud Functions. Extremely strong in BigQuery design and development. Extensive knowledge and implementation experience in data management, governance, and security frameworks. Proven experience in creating high-level and detailed data architecture and design documentation. Strong aptitude for business analysis to understand domain data requirements. Proficiency in data modelling using any modelling tool for conceptual, logical, and physical models is preferred. Hands-on experience with architecting end-to-end data solutions for both batch and real-time designs. Ability to collaborate effectively with clients, developers, and architecture teams to implement enterprise-level data solutions.
Familiarity with Data Fabric and Data Mesh architecture is a plus. Excellent verbal and written communication skills. Regards, Rejeesh S Email : rejeesh.s@jobworld.jobs Mobile : +91 - 9188336668
Posted 1 month ago
5 - 9 years
10 - 20 Lacs
Pune, Bengaluru, Delhi / NCR
Hybrid
Mandatory Skills: GCS Composer, BigQuery, Azure, Azure Databricks, ADLS. Experience: 5-10 years. Good to have skills: knowledge of CDP (customer data platform); airline domain knowledge. Design and develop data-ingestion frameworks, real-time processing solutions, and data processing and transformation frameworks leveraging open source tools and data processing frameworks. Hands-on with technologies such as Kafka, Apache Spark (SQL, Scala, Java), Python, the Hadoop platform, Hive and Airflow. Experience in GCP Cloud Composer, BigQuery and DataProc. Offer system support as part of a support rotation with other team members. Operationalize open source data-analytic tools for enterprise use. Ensure data governance policies are followed by implementing or validating data lineage, quality checks, and data classification. Understand and follow the company development lifecycle to develop, deploy and deliver the solutions. Minimum Qualifications: Bachelor's degree in Computer Science, CIS, or related field. 5-7 years of IT experience in software engineering or related field. Experience on project(s) involving the implementation of software development life cycles (SDLC).
Posted 1 month ago
6 - 10 years
16 - 25 Lacs
Hyderabad, Chennai, Bengaluru
Work from Office
Hiring!! Role: Sr Power BI Developer Client: MNC (Full-time, Permanent) Exp: 6-12 years Notice period: Immediate/Serving/15 days Location: Pan India Skills: Power BI dashboards, Power Query, GCP (BigQuery), SQL Server, Power Apps. Kindly fill in the details below and share your updated CV to mansoor@burgeonits.com Name as per Aadhar card Mobile no & Alternate no Email id & Alternate email Date of birth Pan card no (for client upload) - mandatory* Total Exp Relevant Exp Current company If any payroll (Name) Notice Period (if serving any NP, mention last working day) CCTC ECTC Any offers (Yes/No) If yes, how much is the offer & what is the joining date Current location & Preferred location Happy to relocate (Yes/No) Available interview time slots
Posted 1 month ago
11 - 20 years
45 - 75 Lacs
Gurugram
Work from Office
Role & responsibilities: Proven success architecting and scaling complex software solutions; familiarity with interface design. Experience and ability to drive a project/module independently from an execution standpoint. Prior experience with scalable architecture and distributed processing. Strong programming expertise in Python, SQL, Scala. Hands-on experience with any major big data solutions like Spark, Kafka, Hive. Strong data management skills with ETL, DWH, Data Quality and Data Governance. Hands-on experience with microservices architecture, Docker and Kubernetes as orchestration. Experience with cloud-based data stores like Redshift and BigQuery. Experience in cloud solution architecture. Experience with the architecture of running Spark jobs on k8s and optimization of Spark jobs. Experience in MLOps architecture/tools/orchestrators like Kubeflow, MLflow. Experience in logging, metrics and distributed tracing systems (e.g. Prometheus/Grafana/Kibana). Experience in CI/CD using Octopus/TeamCity/Jenkins. Interested candidates can share their updated resume at surinder.kaur@mounttalent.com
Posted 1 month ago
5 - 10 years
20 - 25 Lacs
Bengaluru
Work from Office
About The Role: Job Title: Transformation Principal Change Analyst Corporate Title: AVP Location: Bangalore, India Role Description: We are looking for an experienced Change Manager to lead a variety of regional/global change initiatives. Utilizing the tenets of PMI, you will lead cross-functional initiatives that transform the way we run our operations. If you like to solve complex problems, have a get-things-done attitude and are looking for a highly visible dynamic role where your voice is heard and your experience is appreciated, come talk to us. What we'll offer you: As part of our flexible scheme, here are just some of the benefits that you'll enjoy: Best in class leave policy. Gender neutral parental leaves. 100% reimbursement under childcare assistance benefit (gender neutral). Sponsorship for industry-relevant certifications and education. Employee Assistance Program for you and your family members. Comprehensive hospitalization insurance for you and your dependents. Accident and term life insurance. Complementary health screening for 35 yrs. and above. Your key responsibilities: Responsible for change management planning, execution and reporting, adhering to governance standards and ensuring transparency around progress status. Using data to tell the story, maintain risk management controls, and monitor and communicate initiative risks. Collaborate with other departments as required to execute on timelines to meet the strategic goals. As part of the larger team, accountable for the delivery and adoption of the global change portfolio, including but not limited to business case development/analysis, reporting, measurement and reporting of adoption success measures and continuous improvement. As required, using data to tell the story, participate in Working Group and Steering Committee meetings to achieve the right level of decision making and progress/transparency, establishing strong partnerships and collaborative relationships with various stakeholder groups to remove constraints to success and carry forward to future projects. As required, developing and documenting end-to-end roles and responsibilities, including process flow, operating procedures, required controls, gathering and documenting business requirements (user stories), including liaising with end-users and performing analysis of gathered data. Heavily involved in the product development journey. Your skills and experience: Overall experience of at least 7-10 years leading complex change programs/projects, communicating and driving transformation initiatives using the tenets of PMI in a highly matrixed environment. Banking/finance/regulated industry experience, of which at least 2 years should be in the change/transformation space or associated with change/transformation initiatives, is a plus. Knowledge of client lifecycle processes and procedures and experience with KYC data structures/data flows is preferred. Experience working with management reporting is preferred. Bachelor's degree. How we'll support you: Training and development to help you excel in your career. Coaching and support from experts in your team. A culture of continuous learning to aid progression. A range of flexible benefits that you can tailor to suit your needs. About us and our teams: Please visit our company website for further information: https://www.db.com/company/company.htm We strive for a culture in which we are empowered to excel together every day. This includes acting responsibly, thinking commercially, taking initiative and working collaboratively.
Together we share and celebrate the successes of our people. Together we are Deutsche Bank Group. We welcome applications from all people and promote a positive, fair and inclusive work environment.
Posted 1 month ago
1 - 5 years
6 - 11 Lacs
Pune
Work from Office
About The Role: Job Title: Data Engineer for Private Bank One Data Platform on Google Cloud Corporate Title: Associate Location: Pune, India Role Description: As part of one of the internationally staffed agile teams of the Private Bank One Data Platform, you are part of the "TDI PB Germany Enterprise & Data" division. The focus here is on the development, design, and provision of different solutions in the field of data warehousing, reporting and analytics for the Private Bank, to ensure that necessary data is provided for operational and analytical purposes. The PB One Data Platform is the new strategic data platform of the Private Bank and uses the Google Cloud Platform as its basis. With Google as a close partner, we are following Deutsche Bank's cloud strategy with the aim of transferring or rebuilding a significant share of today's on-prem applications to the Google Cloud Platform. What we'll offer you: As part of our flexible scheme, here are just some of the benefits that you'll enjoy: Best in class leave policy. Gender neutral parental leaves. 100% reimbursement under childcare assistance benefit (gender neutral). Sponsorship for industry-relevant certifications and education. Employee Assistance Program for you and your family members. Comprehensive hospitalization insurance for you and your dependents. Accident and term life insurance. Complementary health screening for 35 yrs. and above. Your key responsibilities: Work within software development applications as a Data Engineer to provide fast and reliable data solutions for warehousing, reporting, Customer and Business Intelligence solutions. Partner with Service/Backend Engineers to integrate data provided by legacy IT solutions into your designed databases and make it accessible to the services consuming those data. Focus on the design and setup of databases, data models, data transformations (ETL), and critical online banking business processes in the context of Customer Intelligence, Financial Reporting and performance controlling. Contribute to data harmonization as well as data cleansing. A passion for constantly learning and applying new technologies and programming languages in a constantly evolving environment. Build solutions that are highly scalable and can be operated flawlessly under high load scenarios. Together with your team, you will run and develop your application self-sufficiently. You'll collaborate with Product Owners as well as the team members regarding design and implementation of data analytics solutions and act as support during the conception of products and solutions. When you see a process running with high manual effort, you'll fix it to run automated, optimizing not only our operating model, but also giving yourself more time for development. Your skills and experience: Mandatory Skills: Hands-on development work building scalable data engineering pipelines and other data engineering/modelling work using Java/Python. Excellent knowledge of SQL and NoSQL databases. Experience working in a fast-paced and Agile work environment. Working knowledge of public cloud environments. Preferred Skills: Experience in Dataflow (Apache Beam)/Cloud Functions/Cloud Run. Knowledge of workflow management tools such as Apache Airflow/Composer. Demonstrated ability to write clear code that is well-documented and stored in a version control system (GitHub). Knowledge of GCS Buckets, Google Pub/Sub, BigQuery. Knowledge about ETL processes in the Data Warehouse environment/Data Lake and how to automate them.
Nice to have: Knowledge of provisioning cloud resources using Terraform. Knowledge of Shell Scripting. Experience with Git, CI/CD pipelines, Docker, and Kubernetes. Knowledge of Google Cloud Monitoring & Alerting. Knowledge of Cloud Run, Dataform, Cloud Spanner. Knowledge of the Data Warehouse solution Data Vault 2.0. Knowledge of New Relic. Excellent analytical and conceptual thinking. Excellent communication skills, strong independence and initiative, ability to work in agile delivery teams. Good communication and experience in working with distributed teams (especially Germany + India). How we'll support you: Training and development to help you excel in your career. Coaching and support from experts in your team. A culture of continuous learning to aid progression. A range of flexible benefits that you can tailor to suit your needs. About us and our teams: Please visit our company website for further information: https://www.db.com/company/company.htm We strive for a culture in which we are empowered to excel together every day. This includes acting responsibly, thinking commercially, taking initiative and working collaboratively. Together we share and celebrate the successes of our people. Together we are Deutsche Bank Group. We welcome applications from all people and promote a positive, fair and inclusive work environment.
Posted 1 month ago
3 - 7 years
10 - 14 Lacs
Pune
Work from Office
About The Role: Job Title: GCP Data Engineer, AS Location: Pune, India Role Description: An Engineer is responsible for designing and developing entire engineering solutions to accomplish business goals. Key responsibilities of this role include ensuring that solutions are well architected, with maintainability and ease of testing built in from the outset, and that they can be integrated successfully into the end-to-end business process flow. They will have gained significant experience through multiple implementations and have begun to develop both depth and breadth in several engineering competencies. They have extensive knowledge of design and architectural patterns. They will provide engineering thought leadership within their teams and will play a role in mentoring and coaching less experienced engineers. What we'll offer you: As part of our flexible scheme, here are just some of the benefits that you'll enjoy: Best in class leave policy. Gender neutral parental leaves. 100% reimbursement under childcare assistance benefit (gender neutral). Sponsorship for industry-relevant certifications and education. Employee Assistance Program for you and your family members. Comprehensive hospitalization insurance for you and your dependents. Accident and term life insurance. Complementary health screening for 35 yrs. and above. Your key responsibilities: Design, develop and maintain data pipelines using the Python and SQL programming languages on GCP. Experience in Agile methodologies, ETL, ELT, data movement and data processing skills. Work with Cloud Composer to manage and process batch data jobs efficiently. Develop and optimize complex SQL queries for data analysis, extraction, and transformation. Develop and deploy Google Cloud services using Terraform. Implement CI/CD pipelines using GitHub Actions. Consume and host REST APIs using Python. Monitor and troubleshoot data pipelines, resolving any issues in a timely manner. Ensure team collaboration using Jira, Confluence, and other tools. Ability to quickly learn new and existing technologies. Strong problem-solving skills. Write advanced SQL and Python scripts. Certification as a Professional Google Cloud Data Engineer will be an added advantage. Your skills and experience: 6+ years of IT experience as a hands-on technologist. Proficient in Python for data engineering. Proficient in SQL. Hands-on experience with GCP Cloud Composer, Dataflow, BigQuery, Cloud Functions and Cloud Run; good to have GKE. Hands-on experience in REST API hosting and consumption. Proficient in Terraform (HashiCorp). Experienced in GitHub and GitHub Actions. Experienced in CI/CD. Experience in automating ETL testing using Python and SQL. Good to have: Apigee. Good to have: Bitbucket. How we'll support you: Training and development to help you excel in your career. Coaching and support from experts in your team. A culture of continuous learning to aid progression. A range of flexible benefits that you can tailor to suit your needs.
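As a rough sketch of the Cloud Composer / batch-pipeline work described above, the DAG below schedules a daily BigQuery transformation with the Airflow Google provider; the project, dataset, table, and schedule are placeholders invented for the example.

```python
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import (
    BigQueryInsertJobOperator,
)

# Placeholder transformation: aggregate one day's events into a summary table.
SUMMARY_SQL = """
CREATE OR REPLACE TABLE `my-project.analytics.daily_summary_{{ ds_nodash }}` AS
SELECT action, COUNT(*) AS events
FROM `my-project.analytics.user_events`
WHERE DATE(event_ts) = '{{ ds }}'
GROUP BY action
"""

with DAG(
    dag_id="daily_events_summary",
    start_date=datetime(2024, 1, 1),
    schedule_interval="0 3 * * *",   # run daily at 03:00
    catchup=False,
    tags=["composer", "bigquery"],
) as dag:
    build_summary = BigQueryInsertJobOperator(
        task_id="build_summary",
        configuration={"query": {"query": SUMMARY_SQL, "useLegacySql": False}},
    )
```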
Posted 1 month ago
4 - 9 years
14 - 19 Lacs
Pune
Work from Office
About The Role: We are looking for a passionate and self-motivated Technology Leader to join our team in the Accounting domain. Being part of a diverse multi-disciplinary global team, you will collaborate with other disciplines to shape technology strategy, drive engineering excellence and deliver business outcomes. What we'll offer you: As part of our flexible scheme, here are just some of the benefits that you'll enjoy: * Best in class leave policy * Gender neutral parental leaves * 100% reimbursement under childcare assistance benefit (gender neutral) * Sponsorship for industry-relevant certifications and education * Employee Assistance Program for you and your family members * Comprehensive hospitalization insurance for you and your dependents * Accident and term life insurance * Complementary health screening for 35 yrs. and above This role is responsible for the design and implementation of high-quality technology solutions. The candidate should have demonstrated technical expertise and excellent problem-solving skills. The candidate is expected to: be a hands-on engineering lead involved in analysis, design, design/code reviews, coding and release activities; champion engineering best practices and guide/mentor the team to achieve high performance; work closely with Business stakeholders, Tribe Lead, Product Owner and Lead Architect to successfully deliver the business outcomes; acquire functional knowledge of the business capability being digitized/re-engineered; demonstrate ownership, inspire others, show innovative thinking and a growth mindset, and collaborate for success; focus on upskilling people, team building and career development; keep up-to-date with industry trends and developments. Your Skills & Experience: Minimum 15 years of IT industry experience in full stack development. Expert in Java, Spring Boot, NodeJS, SQL/PLSQL, ReactJS. Strong experience in big data processing: Apache Spark, Hadoop, BigQuery, DataProc, Dataflow, etc. Strong experience in Kubernetes and the OpenShift container platform. Experience with databases (Oracle, PostgreSQL, MongoDB, Redis/Hazelcast); should understand data modeling, normalization, and performance optimization. Experience in message queues (RabbitMQ/IBM MQ, JMS) and data streaming, i.e. Kafka, Pub/Sub, etc. Experience of working on public cloud (GCP preferred, AWS or Azure). Knowledge of various distributed/multi-tiered architecture styles: micro-services, data mesh, integration patterns, etc. Experience of modern software product delivery practices, processes and tooling, and BizDevOps skills such as CI/CD pipelines using Jenkins, GitHub Actions, etc. Experience of designing solutions based on DDD and implementing Clean/Hexagonal Architecture for efficient systems that can handle large-scale operation. Experience of leading teams and mentoring developers. Focus on quality: experience with TDD, BDD, stress and contract tests. Proficient in working with APIs (Application Programming Interfaces) and understanding of data formats like JSON, XML, YAML, Parquet, etc. Advantageous: * Having prior experience in the Banking/Finance domain * Having worked on hybrid cloud solutions, preferably using GCP * Having worked on product development How we'll support you: * Training and development to help you excel in your career * Coaching and support from experts in your team * A culture of continuous learning to aid progression * A range of flexible benefits that you can tailor to suit your needs About us and our teams: Please visit our company website for further information: https://www.db.com/company/company.htm We strive for a culture in which we are empowered to excel together every day. This includes acting responsibly, thinking commercially, taking initiative and working collaboratively. Together we share and celebrate the successes of our people. Together we are Deutsche Bank Group. We welcome applications from all people and promote a positive, fair and inclusive work environment.
Posted 1 month ago
4 - 9 years
16 - 20 Lacs
Pune
Work from Office
About The Role: Job Title: IT Application Owner, AS Location: Pune, India Role Description: Deutsche Bank's Strategy & Innovation Engineering team identifies, evaluates, and incubates cutting-edge technical innovation. It is part of the Chief Strategy Office of the bank's Technology, Data & Innovation (TDI) function and works globally with all business lines and infrastructure functions of the bank. A focus of the team is to create value for clients and the bank using Artificial Intelligence, Large Language Models (LLM) and other advanced data-driven technologies. As an ITAO, you will join the innovation engineering team and contribute to the support and management of new AI products and services for the entire Deutsche Bank Group. We require technical specialists to help research, design, and implement state-of-the-art AI services, with a particular focus on performing technology evaluations of AI products. You will make a real difference for senior stakeholders across core banking functions, where computational, complexity and efficiency challenges abound, through your own delivery and through the promotion of modern AI development best practices and techniques. Overview: We are seeking a talented and experienced AI Engineer to join our team. The ideal candidate will be hands-on and drive the design, development, and implementation of AI-based solutions for CB Tech. This role involves working with large datasets, conducting experiments, and staying updated with the latest advancements in AI and Machine Learning. This person is expected to innovate and support the Innovation team's tech efforts in modernizing the engineering landscape by identifying AI use cases and to provide local support by owning the ITAO role for the bank's AI Platform. If you have an engineering mindset and a passion for AI and want to be part of developing innovative products, then apply today. What we'll offer you: As part of our flexible scheme, here are just some of the benefits that you'll enjoy: Best in class leave policy. Gender neutral parental leaves. 100% reimbursement under childcare assistance benefit (gender neutral). Sponsorship for industry-relevant certifications and education. Employee Assistance Program for you and your family members. Comprehensive hospitalization insurance for you and your dependents. Accident and term life insurance. Complementary health screening for 35 yrs. and above. Your key responsibilities: The IT Application Owner (ITAO) is responsible for Application Management and has to ensure that the applications are enhanced and maintained in accordance with the Bank's IT Policy requirements on application lifecycle governance. Design, develop, and deploy solutions using Advanced Analytics, Machine Learning/AI and cloud technologies that fulfil Deutsche Bank's innovation strategy. Contribute to effective and efficient technical research and experiments through technology evaluations, publishing AI research reports, and building proof-of-concepts (POCs). Engage with business stakeholders to identify and evaluate opportunities to create value through innovative solutions. Foster adoption of AI and ML by collaborating with cross-functional teams and educating stakeholders on AI-driven solutions. Stay up-to-date with the latest advancements in AI and Data Science. Ongoing enhancement and maintenance of the application, including management of scope. Ensuring that the changes to the applications in scope are fully aligned with DB standards and regulations.
The main focus is to guarantee system stability and to ensure a smooth and successful transition to a production, steady-state environment. Conducting strategic planning for the application. Managing strategic capacity, consumption and performance management (forecast and management based on business plans). Ensuring policy compliance for the application. Facilitating and contributing to audit activities. Managing software licenses, security certificates and contracts with service providers. Ensuring documentation availability. Identifying and managing technical projects necessary to ensure required and established service levels are maintained. Working with the development center team to estimate work effort throughout different phases of the functional domain deliverables. Assisting with development of configuration/monitoring/packaging/deployment/automations of AI Platforms. Identifying, documenting and communicating risks and issues discovered during the delivery cycle. Your skills and experience: Excellent communication and presentation skills; highly organized and disciplined. Experienced in working with multiple stakeholders. Ability to create and naturally maintain good business relationships with all stakeholders. IT Service Management, IT Governance or IT Project Management background. Awareness of ITAO and TISO roles; Compliance, Risk and Governance concepts with respect to the financial industry. Comfortable working in VUCA (Volatility, Uncertainty, Complexity, Ambiguity) and highly dynamic environments. The ITAO will typically have rather limited hands-on technical involvement. A high-level understanding of the products/technologies below is welcomed: Google Cloud: GKE, Terraform, IAM, BigQuery, Cloud Shell, Cloud Storage. AI/ML: AI agents, AI/ML concepts, ML models, Vertex AI, AutoML, BigQuery ML. MLOps & CI/CD pipelines: Kubeflow, Vertex AI Pipelines. Proficiency in designing, deploying and managing AI agents, e.g. chatbots and virtual assistants. GCP networking: networking protocols, security concepts, VPC, load balancers. Unix servers: very basic administration. Python, shell scripting, SQL. Familiarity with fine-tuning and deploying large language models on GCP. Understanding of security best practices, including data governance, encryption, and compliance with AI-related regulations. GCP Cloud Logging, Cloud Monitoring and AI model performance tracking. 6+ years of work experience in IT (for AVP 6+, Associate 4+). Strong problem-solving skills and a passion for AI research. Good interpersonal skills with the ability to co-operate and collaborate with other teams. Educational Qualifications: B.E./B.Tech./Master's degree in computer science or equivalent. Added advantage: GCP certifications, Kubernetes certifications, AI/ML educational background, certifications or higher qualifications. How we'll support you: Training and development to help you excel in your career. Coaching and support from experts in your team. A culture of continuous learning to aid progression. A range of flexible benefits that you can tailor to suit your needs.
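To give a flavour of the Vertex AI technologies listed above (which this ITAO role oversees rather than builds hands-on), here is a small, purely illustrative Python call to an already-deployed Vertex AI endpoint; the project, region, endpoint ID, and feature names are hypothetical.

```python
from google.cloud import aiplatform

# Placeholder project/region: replace with real values before running.
aiplatform.init(project="my-project", location="us-central1")

# Call an already-deployed model endpoint for online prediction.
endpoint = aiplatform.Endpoint("1234567890")  # hypothetical endpoint ID
response = endpoint.predict(instances=[{"feature_a": 1.2, "feature_b": "web"}])
print(response.predictions)
```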
Posted 1 month ago
2 - 5 years
5 - 10 Lacs
Chennai
Work from Office
Req ID: 320304 NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking a Lead Developer to join our team in Chennai, Tamil Nadu (IN-TN), India (IN). Lead .NET Developer - Remote Who We Are: NTT DATA America strives to hire exceptional, innovative and passionate individuals who want to grow with us. Launch by NTT DATA is the culmination of the company's strategy to acquire and integrate the skills, experience, and technology of leading digital companies, backed by NTT DATA's core capabilities, global reach, and depth. How You'll Help Us: A Lead Application Developer is first and foremost a software developer who specializes in .NET C# development. You'll be part of a team focused on delivering quality software for our clients. How We Will Help You: Joining our Microsoft practice is not only a job, but a chance to grow your career. We will make sure to equip you with the skills you need to produce robust applications that you can be proud of. Whether it is providing you with training on a new programming language or helping you get certified in a new technology, we will help you grow your skills so you can continue to deliver increasingly valuable work. Once You Are Here, You Will: The Lead Application Developer provides leadership in full systems life cycle management (e.g., analyses, technical requirements, design, coding, testing, implementation of systems and applications software, etc.) to ensure delivery is on time and within budget. You will direct component and data architecture design, technology planning, and testing for Applications Development (AD) initiatives to meet business requirements and ensure compliance. This position develops and leads AD project activities and integrations. The Lead Applications Developer guides teams to ensure effective communication and achievement of objectives, in addition to researching and supporting the integration of emerging technologies. This position provides knowledge and support for applications' development, integration, and maintenance. The Lead Applications Developer will lead junior team members with project-related activities and tasks. Additionally, you will guide and influence the department and project teams, and facilitate collaboration with stakeholders. Basic Qualifications: 5+ years of experience with Angular/ExtJS. 8+ years of experience developing .NET applications in C#. 8+ years of experience in designing and developing RESTful web services leveraging micro-service design and implementation patterns. 8+ years of experience with SQL Server. 8+ years of experience with PL/SQL scripting. 8+ years of experience with DB reporting leveraging tools such as SSIS. 3+ years of experience as a tech lead, mentoring and coaching less senior resources. 3+ years of experience with software architectural design. Preferred: Experience with GCP in web services. Experience with GCP Big Data. Experience with GCP BigQuery. Experience with Power BI. Ideal Mindset: Lifelong Learner - You are always seeking to improve your technical and nontechnical skills. Team Player - You are someone who wants to see everyone on the team succeed and is willing to go the extra mile to help a teammate in need. Communicator - You know how to communicate your design ideas to both technical and nontechnical stakeholders, prioritizing critical information and leaving out extraneous details.
Please note: the shift timing requirement is 1:30 pm IST - 10:30 pm IST. #Launchjobs #LaunchEngineering
About NTT DATA
NTT DATA is a $30 billion trusted global innovator of business and technology services. We serve 75% of the Fortune Global 100 and are committed to helping clients innovate, optimize and transform for long-term success. As a Global Top Employer, we have diverse experts in more than 50 countries and a robust partner ecosystem of established and start-up companies. Our services include business and technology consulting, data and artificial intelligence, industry solutions, as well as the development, implementation and management of applications, infrastructure and connectivity. We are one of the leading providers of digital and AI infrastructure in the world. NTT DATA is a part of NTT Group, which invests over $3.6 billion each year in R&D to help organizations and society move confidently and sustainably into the digital future. Visit us at us.nttdata.com.
NTT DATA endeavors to make https://us.nttdata.com accessible to any and all users. If you would like to contact us regarding the accessibility of our website or need assistance completing the application process, please contact us at https://us.nttdata.com/en/contact-us. This contact information is for accommodation requests only and cannot be used to inquire about the status of applications. NTT DATA is an equal opportunity employer. Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or protected veteran status. For our EEO Policy Statement, please click here. If you'd like more information on your EEO rights under the law, please click here. For Pay Transparency information, please click here.
Job Segment: .NET, Application Developer, Testing, Developer, Information Technology, Technology
Posted 1 month ago
2 - 5 years
6 - 10 Lacs
Hyderabad
Work from Office
Req ID: 319692
NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking a Business Intelligence Specialist to join our team in Hyderabad, Telangana (IN-TG), India (IN).
Senior Developer
Mandatory Skills: GCP, BigQuery, Linux Shell Script, SQL Server
Desired Skills: ETL/ELT, Python, Agile, MS Office
JD: Senior Developers with solid Linux Shell Scripting and SQL experience, with knowledge of ETL/ELT application design.
Job Segment: Business Intelligence, Database, SQL, Linux, Consulting, Technology
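For context on the kind of ETL/ELT work this posting describes, below is a minimal, hypothetical sketch of an ELT step into BigQuery: stage a CSV from Cloud Storage into a raw table, then transform it with SQL inside the warehouse. It is shown in Python (google-cloud-bigquery client) rather than shell purely to keep all examples in one language; the project, bucket, dataset, and table names are placeholders, not from the posting.

    from google.cloud import bigquery

    client = bigquery.Client(project="etl-demo-project")  # hypothetical project id

    # Extract/Load: append a CSV export from Cloud Storage into a raw staging table.
    load_config = bigquery.LoadJobConfig(
        source_format=bigquery.SourceFormat.CSV,
        skip_leading_rows=1,
        autodetect=True,
    )
    load_job = client.load_table_from_uri(
        "gs://etl-demo-bucket/exports/orders_2024-01-01.csv",   # hypothetical path
        "etl-demo-project.staging.orders_raw",                  # hypothetical table
        job_config=load_config,
    )
    load_job.result()  # wait for the load job to finish

    # Transform: the "T" of ELT runs as SQL inside BigQuery itself.
    transform_sql = """
        CREATE OR REPLACE TABLE `etl-demo-project.warehouse.daily_orders` AS
        SELECT DATE(order_ts) AS order_date, customer_id, SUM(order_value) AS total_value
        FROM `etl-demo-project.staging.orders_raw`
        GROUP BY order_date, customer_id
    """
    client.query(transform_sql).result()

In a production setting a step like this would usually be scheduled (for example from a shell wrapper or an orchestrator) and made idempotent, but the load-then-transform shape is the core of the ELT pattern the posting asks for.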
Posted 1 month ago
5 - 10 years
9 - 19 Lacs
Bengaluru
Work from Office
Key Responsibilities:
Design, develop, and optimize interactive dashboards using Looker and LookML.
Work with BigQuery to create efficient data models and queries for visualization.
Develop LookML models, Explores, and derived tables to support business intelligence needs.
Optimize dashboard performance by implementing best practices in data aggregation and visualization.
Collaborate with data engineers, analysts, and business teams to understand requirements and translate them into actionable insights.
Implement security and governance policies within Looker to ensure data integrity and controlled access.
Leverage Google Cloud Platform (GCP) services to build scalable and reliable data solutions.
Maintain documentation and provide training to stakeholders on using Looker dashboards effectively.
Troubleshoot and resolve issues related to dashboard performance, data accuracy, and visualization constraints.
Maintain and optimize existing Looker dashboards and reports to ensure continuity and alignment with business KPIs.
Understand, audit, and enhance existing LookML models to ensure data integrity and performance.
Build new dashboards and data visualizations based on business requirements and stakeholder input.
Collaborate with data engineers to define and validate data pipelines required for dashboard development and ensure the timely availability of clean, structured data.
Document existing and new Looker assets and processes to support knowledge transfer, scalability, and maintenance.
Support the transition/handover process by acquiring detailed knowledge of legacy implementations and ensuring a smooth takeover.
Required Skills & Experience:
5-8 years of experience in data visualization and business intelligence using Looker and LookML.
Strong proficiency in writing and optimizing SQL queries, especially for BigQuery.
Experience with Google Cloud Platform (GCP), particularly BigQuery and related data services.
Solid understanding of data modeling, ETL processes, and database structures.
Familiarity with data governance, security, and access controls in Looker.
Strong analytical skills and the ability to translate business requirements into technical solutions.
Excellent communication and collaboration skills.
Expertise in Looker and LookML, including Explore creation, Views, and derived tables.
Strong SQL skills for data exploration, transformation, and validation.
Experience in BI solution lifecycle management (build, test, deploy, maintain).
Excellent documentation and stakeholder communication skills for handovers and ongoing alignment.
Strong data visualization and storytelling abilities, focusing on user-centric design and clarity.
Preferred Qualifications:
Experience working in the media industry (OTT, DTH, Web) and handling large-scale media datasets.
Knowledge of other BI tools such as Tableau, Power BI, or Data Studio is a plus.
Experience with Python or scripting languages for automation and data processing.
Understanding of machine learning or predictive analytics is an advantage.
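To illustrate the BigQuery query-optimization side of this role, here is a minimal, hypothetical sketch (not part of the posting) that uses the google-cloud-bigquery Python client to dry-run a dashboard query and estimate how much data it would scan before it is wired into a Looker Explore or derived table. The project, dataset, and table names are invented for illustration.

    from google.cloud import bigquery

    client = bigquery.Client(project="media-analytics-demo")  # hypothetical project id

    dashboard_query = """
        SELECT DATE(event_time) AS day,
               platform,
               COUNT(DISTINCT user_id) AS daily_active_users
        FROM `media-analytics-demo.ott.playback_events`
        WHERE event_time >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 30 DAY)
        GROUP BY day, platform
    """

    # A dry run validates the SQL and reports the bytes it would scan, without running or billing it.
    job_config = bigquery.QueryJobConfig(dry_run=True, use_query_cache=False)
    job = client.query(dashboard_query, job_config=job_config)
    print(f"Estimated bytes processed: {job.total_bytes_processed:,}")

In practice, a query like this would typically back a derived or aggregate table, with partition and cluster filters keeping the scanned bytes (and dashboard latency) low.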
Posted 1 month ago
3 - 7 years
20 - 25 Lacs
Pune
Remote
1. Extract and transform data from Google BigQuery and other relevant data sources.
2. Utilize Python and libraries such as Pandas and NumPy to manipulate, clean, and analyze large datasets.
3. Develop and implement Python scripts to automate data extraction, processing, and analysis for comparison reports.
4. Design and execute queries in BigQuery to retrieve specific data sets required for comparison analysis.
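The responsibilities above amount to a small BigQuery-to-pandas pipeline. A minimal sketch of that pattern follows, assuming the google-cloud-bigquery and pandas libraries and a hypothetical transactions table; it illustrates the workflow and is not code from the posting.

    from google.cloud import bigquery
    import pandas as pd

    client = bigquery.Client(project="reporting-demo-project")  # hypothetical project id

    query = """
        SELECT report_date, source_system, SUM(amount) AS total_amount
        FROM `reporting-demo-project.finance.transactions`
        GROUP BY report_date, source_system
    """

    # Pull the result set into a DataFrame (needs the BigQuery client's pandas extras installed).
    df = client.query(query).to_dataframe()

    # Build a simple comparison report: one column per source system, plus the difference.
    report = df.pivot(index="report_date", columns="source_system", values="total_amount")
    report["difference"] = report.max(axis=1) - report.min(axis=1)
    report.to_csv("comparison_report.csv")

A script like this can then be scheduled so the comparison report regenerates automatically whenever fresh data lands in BigQuery.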
Posted 1 month ago
Demand for BigQuery, Google Cloud's managed data warehouse, is high in the Indian job market. Companies increasingly rely on BigQuery to analyze and manage large datasets, driving the need for skilled professionals in this area.
The average salary range for BigQuery professionals in India varies based on experience level. Entry-level positions may start at around INR 4-6 lakhs per annum, while experienced professionals can earn upwards of INR 15-20 lakhs per annum.
In the field of BigQuery, a typical career progression runs through roles such as Junior Developer, Developer, Senior Developer, and Tech Lead, eventually moving into senior positions such as Data Architect or Data Engineering Manager.
Alongside BigQuery, professionals in this field often benefit from having skills in SQL, data modeling, data visualization tools like Tableau or Power BI, and cloud platforms like Google Cloud Platform or AWS.
As you explore opportunities in the BigQuery job market in India, remember to continuously upskill and stay updated with the latest trends in data analytics and cloud computing. Prepare thoroughly for interviews by practicing common BigQuery concepts, and showcase your hands-on experience with the platform. With dedication and perseverance, you can excel in this dynamic field and secure rewarding career opportunities. Good luck!