Jobs
Interviews

1356 BigQuery Jobs - Page 15

Set up a Job Alert
JobPe aggregates results for easy access, but you apply directly on the employer's job portal.

1.0 - 5.0 years

0 Lacs

Karnataka

On-site

Loyalytics is a rapidly growing analytics consulting and product organization headquartered in Bangalore. The company specializes in helping large retail clients worldwide capitalize on their data assets through consulting projects and product accelerators. With a team of over 100 analytics practitioners, Loyalytics is at the forefront of utilizing cutting-edge tools and technologies in the industry. Its technical team of data scientists, data engineers, and business analysts handles over 1 million data points daily. The company operates in a massive multi-billion-dollar global market opportunity and boasts a leadership team with a combined experience of over 40 years. Loyalytics has built a strong reputation in the market through word-of-mouth and referral-driven marketing, attracting prestigious retail brands in the GCC region such as Lulu and GMG. A key distinguishing factor is its 10-year history as a bootstrapped company that continues to expand its workforce, currently employing over 100 individuals.

Loyalytics is now seeking a passionate and detail-oriented BI Consultant (Tableau) with 1-2 years of experience to join its analytics team. The ideal candidate should have a solid foundation in SQL and hands-on expertise in developing dashboards using Tableau. Responsibilities include designing, developing, and maintaining interactive dashboards and reports, writing efficient SQL queries, collaborating with cross-functional teams, ensuring data accuracy, and optimizing dashboard performance. Strong analytical and problem-solving skills, along with good communication and documentation abilities, are essential for success in this position.
Required skills and qualifications for the BI Consultant (Tableau) role at Loyalytics include: 1-2 years of professional experience in BI/Data Analytics roles; proficiency in writing complex SQL queries; hands-on experience with Tableau Desktop; an understanding of data modeling concepts and ETL workflows; familiarity with other BI tools such as Power BI and Qlik; exposure to Tableau Server or Tableau Cloud; and knowledge of cloud platforms or databases such as AWS, GCP, Azure, Snowflake, or BigQuery. This is an exciting opportunity to join a dynamic and innovative team at Loyalytics and contribute to transforming data into valuable insights for clients in the retail industry.

Posted 2 weeks ago

Apply

3.0 - 7.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

As a Data Engineer, your primary responsibility will be to design and develop robust ETL pipelines using Python, PySpark, and various Google Cloud Platform (GCP) services. You will build and optimize data models and queries in BigQuery to support analytics and reporting needs, and you will ingest, transform, and load structured and semi-structured data from diverse sources. Collaboration with data analysts, scientists, and business teams is essential to understand and address data requirements effectively.

Ensuring data quality, integrity, and security across cloud-based data platforms will be a key part of your role. You will also monitor and troubleshoot data workflows and performance issues. Automating data validation and transformation processes using scripting and orchestration tools will be a significant part of your day-to-day tasks.

Hands-on experience with Google Cloud Platform (GCP), particularly BigQuery, is crucial. Proficiency in Python and/or PySpark, along with experience designing and implementing ETL workflows and data pipelines, is required. A strong command of SQL and data modeling for analytics is essential. Familiarity with GCP services like Cloud Storage, Dataflow, Pub/Sub, and Composer will be beneficial. An understanding of data governance, security, and compliance in cloud environments is also expected, and experience with version control using Git and agile development practices will be advantageous.

Posted 2 weeks ago

Apply

5.0 - 9.0 years

0 Lacs

Pune, Maharashtra

On-site

Zimetrics is a technology services and solutions provider specializing in Data, AI, and Digital. We help enterprises leverage the economic potential and business value of data from systems, machines, connected devices, and human-generated content. Our core principles are Integrity, Intellect, and Ingenuity, guiding our value system, engineering expertise, and organizational behavior. We are problem solvers and innovators who challenge conventional wisdom and believe in possibilities.

You will be responsible for designing scalable and secure cloud-based data architecture solutions and for leading data modeling, integration, and migration strategies across platforms. It will be essential to engage directly with clients to understand their business needs and translate them into technical solutions. You will also support sales/pre-sales teams with solution architecture, technical presentations, and proposals, and collaborate with cross-functional teams including engineering, BI, and product. Ensuring best practices in data governance, security, and performance optimization is a key responsibility.

To be successful in this role, you must have strong experience with cloud platforms such as AWS, Azure, or GCP. A deep understanding of data warehousing concepts and tools like Snowflake, Redshift, and BigQuery is essential, as is proven expertise in conceptual, logical, and physical data modeling. Excellent communication and client engagement skills are a must. Previous experience in pre-sales or solution consulting will be advantageous, as will the ability to present complex technical concepts to non-technical stakeholders effectively.

Posted 2 weeks ago

Apply

7.0 - 12.0 years

22 - 30 Lacs

Hyderabad, Chennai, Bengaluru

Work from Office

Role & responsibilities:
- BigQuery for building and optimizing data warehouses.
- Implement both batch and real-time (streaming) data processing solutions using Java.
- Cloud Composer (Airflow) for workflow orchestration and pipeline management.
- Dataproc for managing Apache Spark jobs in the cloud.
- Google Cloud Storage (GCS) for data storage and management.
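The batch-versus-streaming distinction this role centers on can be sketched in a few lines. The example below is illustrative only, in Python rather than the Java the posting calls for, and all names and data are hypothetical: batch mode counts events after seeing them all, while the streaming version assigns each event to a tumbling time window as it arrives.

```python
from collections import defaultdict

def batch_count(events):
    """Batch mode: the whole dataset is available, count per key."""
    counts = defaultdict(int)
    for key, _ts in events:
        counts[key] += 1
    return dict(counts)

def streaming_window_counts(events, window_s=60):
    """Streaming mode: incrementally bucket each (key, timestamp)
    event into a tumbling window of `window_s` seconds."""
    counts = defaultdict(int)
    for key, ts in events:
        window_start = ts - (ts % window_s)  # floor to window boundary
        counts[(window_start, key)] += 1
    return dict(counts)

# Hypothetical events: (event type, seconds since start)
events = [("click", 5), ("click", 62), ("view", 10), ("click", 119)]
```

In production, frameworks like Apache Beam (on Dataflow) or Spark Structured Streaming (on Dataproc) provide this windowing, plus watermarks and late-data handling that this toy sketch omits.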

Posted 2 weeks ago

Apply

3.0 - 6.0 years

3 - 7 Lacs

Gurugram

Work from Office

Grade Specific Key Responsibilities:
- Work across all technologies and sub-technologies within that specific architecture.
- Work with the relevant BU and Strategic Partner stakeholders to build and refresh the Learning Maps (LMs).
- Create and evaluate quizzes.
- Work with the lab team to build the relevant labs and demos required for the partner enablement Learning Maps.

Desired technical and interpersonal skills include, but are not limited to:
1. BE with hands-on experience in Cisco technologies (NOC/Deployment/Troubleshoot/Design & Implement): Cisco Meraki, SD-WAN, ACI, Nexus
2. CCNA and/or CCNP Routing & Switching certifications (preferred)
3. Strong communication skills
4. Very good understanding of Cisco architectures (EN/Sec/SP) and solutions
5. Desire and ability to learn new technology and solutions

Posted 2 weeks ago

Apply

7.0 - 12.0 years

20 - 30 Lacs

Hyderabad, Pune, Bengaluru

Work from Office

Work Location: Bangalore/Pune/Hyderabad/NCR
Experience: 5-12 yrs

Required Skills:
- Proven experience as a Data Engineer with expertise in GCP.
- Strong understanding of data warehousing concepts and ETL processes.
- Experience with BigQuery, Dataflow, and other GCP data services.

Responsibilities:
- Design, develop, and maintain data pipelines on GCP.
- Implement data storage solutions and optimize data processing workflows.
- Ensure data quality and integrity throughout the data lifecycle.
- Collaborate with data scientists and analysts to understand data requirements.
- Monitor and maintain the health of the data infrastructure.
- Troubleshoot and resolve data-related issues.

Thanks & Regards,
Suganya R
Suganya@spstaffing.in

Posted 2 weeks ago

Apply

8.0 - 12.0 years

20 - 35 Lacs

Pune

Work from Office

Work Experience:
- 10-15 years in Telecommunications or a relevant IT industry
- 5 years' experience working in Cloud services

Technical / Professional Skills:
- Knowledge and understanding of Public Cloud pricing schemes.
- Experience with native and 3rd-party Public Cloud cost management & advisory tools (Ternary, GCP Billing, BigQuery, GCP Recommender, AWS Cost Explorer, AWS Trusted Advisor)
- Experience with data analysis and data visualisation tools (Data Studio, BigQuery, Power BI, Tableau, etc.)
- Good experience in GCP and/or AWS and/or Azure.
- Able to consolidate data and deliver aggregate views/reports.
- High proficiency with MS Excel, PowerPoint, and Power BI
- Knowledge and practical experience with SQL, Python

Posted 2 weeks ago

Apply

4.0 - 9.0 years

6 - 11 Lacs

Gurugram

Work from Office

About the Role: Grade Level (for internal use): 09

Department overview:
AutomotiveMastermind provides U.S. automotive dealers with AI/behavior-prediction analytics software and marketing solutions that improve the vehicle purchase process and results. The company's cloud-based technology helps dealers precisely predict automobile-buying behavior and automates the creation of micro-targeted customer communications, leading to proven higher sales and more consistent customer retention.

Responsibilities:
- Work closely with Product Management and Data Strategy leadership to understand short- and long-term roadmaps and the overall Data product strategy
- Drive the backlog-grooming agile sprint ceremony, acting as a bridge between business needs and technical implementation
- Present on behalf of agile teams in sprint review, reiterating the business value delivered with each completed work increment
- Develop expertise on the existing aM ecosystem of integrations and the data available within the system
- Collaborate with data analysts, data management, data science, and engineering teams to develop short- and long-term solutions that meet business needs and solve distinct problems
- Apply deep, creative, rigorous thinking to solve broad, platform-wide technical and/or business problems
- Identify key value drivers and key opportunities for/sources of error across products and processes
- Develop short-term preventive or detective measures, and lead medium/long-term product improvement initiatives arrived at via close collaboration with engineering, QA, and data support
- Coordinate with data engineers as appropriate to design and enable repeatable processes and generate deliverables that answer routine business questions

What We're Looking For:

Basic Required Qualifications:
- Minimum 4 years' working experience as a Product Owner or Product Manager in an Agile scrum framework
- Experience using data and analytical processes to drive decision making, with the ability to explain how an analysis was done to an executive audience
- Strong knowledge of the Agile development framework, with practical experience to support flexible application of its principles
- Strong conceptual understanding of data integration technologies and standards
- Working familiarity with road-mapping and issue-tracking software applications (Aha!, MS Azure DevOps, Salesforce)
- Familiarity with Microsoft Excel, SQL, BigQuery, MongoDB, and Postman preferred
- An advocate for the importance of leveraging data, a supporter of the use of data analysis in decision-making, and a fierce promoter of data and engineering best practices throughout the organization; passionate about empirical research
- A team player who is comfortable working with a globally distributed team across time zones
- A solid communicator, both with technology teams and with non-technical stakeholders

Preferred: Experience with, or awareness of and interest in, dimensional data modeling concepts

B.Tech/M.Tech qualified.
Grade: 9
Location: Gurgaon
Hybrid Mode: twice a week work from office
Shift Time: 12 pm to 9 pm IST

About automotiveMastermind

Who we are: Founded in 2012, automotiveMastermind is a leading provider of predictive analytics and marketing automation solutions for the automotive industry and believes that technology can transform data, revealing key customer insights to accurately predict automotive sales. Through its proprietary automated sales and marketing platform, Mastermind, the company empowers dealers to close more deals by predicting future buyers and consistently marketing to them. automotiveMastermind is headquartered in New York City. For more information, visit automotivemastermind.com.

At automotiveMastermind, we thrive on high energy at high speed. We're an organization in hyper-growth mode and have a fast-paced culture to match. Our highly engaged teams feel passionately about both our product and our people. This passion is what continues to motivate and challenge our teams to be best-in-class.
Our cultural values of Drive and Help have been at the core of what we do and how we have built our culture through the years. This cultural framework inspires a passion for success while collaborating to win.

What we do: Through our proprietary automated sales and marketing platform, Mastermind, we empower dealers to close more deals by predicting future buyers and consistently marketing to them. In short, we help automotive dealerships generate success in their loyalty, service, and conquest portfolios through a combination of turnkey predictive analytics, proactive marketing, and dedicated consultative services.

What's In It For You

Our Purpose: Progress is not a self-starter. It requires a catalyst to be set in motion. Information, imagination, people, technology: the right combination can unlock possibility and change the world. Our world is in transition and getting more complex by the day. We push past expected observations and seek out new levels of understanding so that we can help companies, governments, and individuals make an impact on tomorrow. At S&P Global we transform data into Essential Intelligence, pinpointing risks and opening possibilities. We Accelerate Progress.

Our People: Our Values: Integrity, Discovery, Partnership. At S&P Global, we focus on Powering Global Markets. Throughout our history, the world's leading organizations have relied on us for the Essential Intelligence they need to make confident decisions about the road ahead. We start with a foundation of integrity in all we do, bring a spirit of discovery to our work, and collaborate in close partnership with each other and our customers to achieve shared goals.

Benefits: We take care of you, so you can take care of business. We care about our people. That's why we provide everything you and your career need to thrive at S&P Global.
- Health & Wellness: Health care coverage designed for the mind and body.
- Continuous Learning: Access a wealth of resources to grow your career and learn valuable new skills.
- Invest in Your Future: Secure your financial future through competitive pay, retirement planning, a continuing-education program with a company-matched student loan contribution, and financial wellness programs.
- Family Friendly Perks: It's not just about you. S&P Global has perks for your partners and little ones, too, with some best-in-class benefits for families.
- Beyond the Basics: From retail discounts to referral incentive awards, small perks can make a big difference.
For more information on benefits by country visit https://spgbenefits.com/benefit-summaries

Global Hiring and Opportunity at S&P Global: At S&P Global, we are committed to fostering a connected and engaged workplace where all individuals have access to opportunities based on their skills, experience, and contributions. Our hiring practices emphasize fairness, transparency, and merit, ensuring that we attract and retain top talent. By valuing different perspectives and promoting a culture of respect and collaboration, we drive innovation and power global markets.

----

Equal Opportunity Employer: S&P Global is an equal opportunity employer and all qualified candidates will receive consideration for employment without regard to race/ethnicity, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, marital status, military veteran status, unemployment status, or any other status protected by law. Only electronic job submissions will be considered for employment. If you need an accommodation during the application process due to a disability, please send an email to EEO.Compliance@spglobal.com and your request will be forwarded to the appropriate person.

US Candidates Only: The EEO is the Law Poster (http://www.dol.gov/ofccp/regs/compliance/posters/pdf/eeopost.pdf) describes discrimination protections under federal law. Pay Transparency Nondiscrimination Provision: https://www.dol.gov/sites/dolgov/files/ofccp/pdf/pay-transp_%20English_formattedESQA508c.pdf

----

20 - Professional (EEO-2 Job Categories-United States of America), PDMGDV202.1 - Middle Professional Tier I (EEO Job Group), SWP Priority Ratings - (Strategic Workforce Planning)

Posted 2 weeks ago

Apply

10.0 - 12.0 years

13 - 17 Lacs

Hyderabad

Work from Office

Overview: PepsiCo Data BI & Integration Platforms is seeking an experienced Cloud Platform technology leader, responsible for overseeing the design, deployment, and maintenance of the Enterprise Data Foundation cloud infrastructure initiative on Azure/AWS. The ideal candidate will have hands-on experience with AWS/GCP services, Infrastructure as Code (IaC), platform provisioning & administration, cloud network design, cloud security principles, and automation.

Responsibilities:
- Provide guidance and support for application migration, modernization, and transformation projects, leveraging cloud-native technologies and methodologies.
- Implement cloud infrastructure policies, standards, and best practices, ensuring the cloud environment's adherence to security and regulatory requirements.
- Design, deploy, and optimize cloud-based infrastructure using AWS/GCP services that meet the performance, availability, scalability, and reliability needs of our applications and services.
- Drive troubleshooting of cloud infrastructure issues, ensuring timely resolution and root cause analysis by partnering with the global cloud center of excellence, enterprise application teams, and PepsiCo premium cloud partners (AWS, GCP).
- Establish and maintain effective communication and collaboration with internal and external stakeholders, including business leaders, developers, customers, and vendors.
- Develop Infrastructure as Code (IaC) to automate provisioning and management of cloud resources.
- Write and maintain scripts for automation and deployment using PowerShell, Python, or the GCP/AWS CLI.
- Work with stakeholders to document architectures, configurations, and best practices.
- Apply knowledge of cloud security principles around data protection, identity and access management (IAM), compliance and regulatory requirements, threat detection and prevention, and disaster recovery and business continuity.
- Performance Tuning: Monitor performance, identify bottlenecks, and implement optimizations.
- Capacity Planning: Plan and manage cloud resources to ensure scalability and availability.
- Database Design and Development: Design, develop, and implement databases in Azure/AWS.
- Manage cloud platform operations with a focus on FinOps support, optimizing resource utilization, cost visibility, and governance across multi-cloud environments.

Qualifications:
- Bachelor's degree in computer science.
- At least 10 to 12 years of experience in IT cloud infrastructure, architecture, and operations, including security, with at least 8 years in a technical leadership role.
- Strong knowledge of cloud architecture, design, and deployment principles and practices, including microservices, serverless, containers, and DevOps.
- Deep expertise in AWS/GCP big data & analytics technologies, including Databricks, real-time data ingestion, data warehouses, serverless ETL, NoSQL databases, DevOps, Kubernetes, virtual machines, web/function apps, and monitoring and security tools.
- Strong understanding of cloud cost management, with hands-on experience in usage analytics, budgeting, and cost optimization strategies across multi-cloud platforms.
- Proficiency and hands-on experience with Google Cloud integration tools, the GCP platform, Workspace administration, Apigee integration management, security SaaS tools, BigQuery, and other GA-related tools.
- Deep expertise in AWS/GCP networking and security fundamentals, including network endpoints & network security groups, firewalls, external/internal DNS, load balancers, virtual networks, and subnets.
- Proficient in scripting and automation tools such as PowerShell, Python, Terraform, and Ansible.
- Excellent problem-solving, analytical, and communication skills, with the ability to explain complex technical concepts to non-technical audiences.
- Certifications in AWS/GCP platform administration, networking, and security are preferred.
- Strong self-organization, time management, and prioritization skills.
- A high level of attention to detail, excellent follow-through, and reliability.
- Strong collaboration, teamwork, and relationship-building skills across multiple levels and functions in the organization.
- Ability to listen, establish rapport, and build credibility as a strategic partner vertically within the business unit or function, as well as with leadership and functional teams.
- Strategic thinker focused on business-value results that utilize technical solutions.
- Strong communication skills in writing, speaking, and presenting.
- Capable of working effectively in a multi-tasking environment.
- Fluent in English.

Posted 2 weeks ago

Apply

4.0 - 6.0 years

8 - 12 Lacs

Hyderabad

Work from Office

Overview: FOBO businesses in Europe, AMESA, and APAC have migrated their planning capability from XLS to MOSAIC, an integrated and digital planning tool, in a step forward towards reaching the Financial Planning 2025 Vision. However, the underlying FOBO operating model limits our ability to capture benefits, given the high attrition and lack of process standardization. To become more capable, agile, and efficient, a fundamental change in the way we do FOBO Financial Planning is required, which will be addressed by establishing the FOBO Planning Central (FPC). FPC evolves the GBS approach, pivoting from a geography focus to a process focus, and allows BUs to concentrate their attention on the Bottlers. Planning services will be provided by a single team, based in HBS, led by a single leader to serve FOBO globally.

The central planning team will be organized around key processes under 3 roles to drive efficiency and standardization:
- Navigators: Single point of contact for the BU, responsible for overall planning and analysis activities
- Integrators: Work with Navigators to support business closing activities, reporting & planning
- Ecosystem Admin: Owns TM1 data quality and overall system administration

This new operating model will provide a better and faster response to BUs. In addition, it will reduce overall people cost, as some positions will be eliminated due to process standardization and simplification, while other positions will migrate from BUs (RetainCo) to the FPC (at HBS).
Responsibilities: Ensure excellent TM1 data quality and timely overall system administration for the EUROPE/AMESA/APAC FOBO businesses, which includes the following activities:
- TM1 Admin: TM1 scenario management (e.g. create/officialise scenarios, copy actuals into the forecast scenario, etc.)
- Execute TM1 cube flows and export data to SPOT-Cockpit on a daily basis
- Perform systems reconciliation to ensure 100% financial data alignment between ERP, HFM, TM1, and Cockpit
- Master Data: Perform daily data-quality checks/corrections/reconciliations (before/during closing and planning cycles)
- Work closely with Navigators to keep mappings/allocations in TM1 updated (aligning any changes with business FP&A leads)
- Maintain master data (e.g. profit centres, creation of new NPDs, etc.)

Qualifications:
- 4-6 years' experience in a Finance position (experience in FOBO business a plus)
- BA required (Business/Finance or IT)
- TM1 experience a MUST
- Comfortable dealing with big/complex data
- Detail-oriented, with strong analytical skills (quick understanding of E2E process/data-flow analysis)
- Tech savvy/passionate about systems and digital tools
- Excellent communication, interpersonal, and stakeholder management skills
- 100% fluent in English
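The systems-reconciliation activity above boils down to comparing the same financial figures across systems and flagging disagreements. A minimal, illustrative Python sketch of that idea follows; the account numbers, balances, and tolerance are hypothetical, and a real TM1/ERP reconciliation would of course pull from the actual systems rather than dictionaries.

```python
def reconcile(source_a: dict, source_b: dict, tolerance: float = 0.01) -> list:
    """Compare per-account balances from two systems (e.g. ERP vs TM1)
    and return the accounts whose values disagree beyond a tolerance."""
    mismatches = []
    for account in sorted(set(source_a) | set(source_b)):
        a = source_a.get(account)
        b = source_b.get(account)
        # An account missing on either side is always a mismatch.
        if a is None or b is None or abs(a - b) > tolerance:
            mismatches.append((account, a, b))
    return mismatches

# Hypothetical balances keyed by account code
erp = {"4000": 1250.00, "4100": 310.55, "5000": -75.00}
tm1 = {"4000": 1250.00, "4100": 310.50, "6000": 12.00}
diffs = reconcile(erp, tm1)
```

The tolerance parameter absorbs harmless rounding differences between systems; anything beyond it goes on the exception list for the closing team to investigate.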

Posted 2 weeks ago

Apply

7.0 - 12.0 years

6 - 12 Lacs

Mohali

Remote

Role: GCP Data Engineer
Experience: 7+ years
Location: Remote
Shift: 3 PM - 12 AM

Posted 2 weeks ago

Apply

3.0 - 5.0 years

2 - 3 Lacs

Kolkata

Work from Office

Qualification: BCA; MCA preferable

Required Skill Set:
- 5+ years in Data Engineering, with at least 2 years on GCP/BigQuery
- Strong Python and SQL expertise (Airflow, dbt, or similar)
- Deep understanding of ETL patterns, change-data-capture, and data-quality frameworks
- Experience with IoT or time-series data pipelines a plus
- Excellent communication skills and a track record of leading cross-functional teams

Job Description / Responsibilities:
- Design, build, and maintain scalable ETL/ELT pipelines in Airflow and BigQuery
- Define and enforce data-modeling standards, naming conventions, and testing frameworks
- Develop and review core transformations: IoT enrichment (batch-ID assignment, stage tagging), transactional ETL (ERPNext/MariaDB to BigQuery), and finance automation pipelines (e.g., bank reconciliation)
- Create and manage schema definitions for staging, enriched_events, and erp_batch_overview tables
- Implement data-quality tests (using dbt or custom Airflow operators) and oversee QA handoff
- Collaborate closely with DevOps to ensure CI/CD, monitoring, and cost-efficient operations
- Drive documentation, runbooks, and knowledge-transfer sessions
- Mentor and coordinate with freelance data engineers and analytics team members

Desired profile:
- Proficiency in Python and SQL, including working with Airflow and dbt or similar tools
- Strong understanding of ETL/ELT design patterns, CDC (Change Data Capture), and data governance best practices
- Excellent communication skills and the ability to translate technical requirements into business outcomes
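The batch-ID assignment mentioned under IoT enrichment is often implemented by sessionizing a device's time-series on gaps between readings: a new batch starts whenever the device has been silent for longer than a threshold. The sketch below illustrates that approach only; it is an assumption about the technique, not this employer's actual pipeline, and the device names, gap threshold, and ID format are hypothetical.

```python
def assign_batch_ids(events, gap_s=300):
    """Assign a batch ID to each (device_id, timestamp) event: a new
    batch starts whenever a device is silent for longer than `gap_s`
    seconds. `events` must be sorted by (device_id, timestamp).
    """
    enriched = []
    last_seen = {}  # device_id -> (last timestamp, current batch number)
    for device, ts in events:
        prev = last_seen.get(device)
        if prev is None or ts - prev[0] > gap_s:
            batch = (prev[1] + 1) if prev else 1  # gap exceeded: new batch
        else:
            batch = prev[1]                       # same batch continues
        last_seen[device] = (ts, batch)
        enriched.append({"device": device, "ts": ts,
                         "batch_id": f"{device}-{batch}"})
    return enriched

# Hypothetical sensor readings: (device, seconds since epoch)
events = [("pump-1", 0), ("pump-1", 120), ("pump-1", 1000), ("pump-2", 50)]
rows = assign_batch_ids(events)
```

In a pipeline of the kind described, this logic would typically live in an Airflow task or a BigQuery SQL window function (`LAG` over timestamps partitioned by device) writing into the enriched-events table.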

Posted 2 weeks ago

Apply

5.0 - 8.0 years

15 - 20 Lacs

Hyderabad, Pune

Hybrid

Warm Greetings from SP Staffing!!
Role: GCP Data Engineer
Experience Required: 5 to 12 yrs
Work Location: Pune/Hyderabad
Required Skills: GCP + PySpark / GCP + BigQuery SQL
Interested candidates can send resumes to nandhini.spstaffing@gmail.com

Posted 2 weeks ago

Apply

3.0 - 6.0 years

13 - 18 Lacs

Bengaluru

Work from Office

We are looking to hire a Data Engineer for the Platform Engineering team. It is a collection of highly skilled individuals, ranging from development to operations, with a security-first mindset, who strive to push the boundaries of technology. We champion a DevSecOps culture and raise the bar on how and when we deploy applications to production. Our core principles are centered around automation, testing, quality, and immutability, all via code. The role is responsible for building self-service capabilities that improve our security posture and productivity and reduce time to market, with automation at the core of these objectives. The individual collaborates with teams across the organization to ensure applications are designed for Continuous Delivery (CD) and are well-architected for their targeted platform, which can be on-premises or the cloud. If you are passionate about developer productivity, cloud-native applications, and container orchestration, this job is for you!

Principal Accountabilities: The incumbent is mentored by senior individuals on the team to capture the flow and bottlenecks in the holistic IT delivery process and define future tool sets.

Skills and Software Requirements:
- Experience with a language such as Python, Go, SQL, Java, or Scala
- GCP data services (BigQuery, Dataflow, Dataproc, Cloud Composer, Pub/Sub, Google Cloud Storage, IAM)
- Experience with Jenkins, Maven, Git, Ansible, or Chef
- Experience working with containers, orchestration tools (Kubernetes, Mesos, Docker Swarm, etc.), and container registries (GCE, Docker Hub, etc.)
- Experience with [SPI]aaS: Software-as-a-Service, Platform-as-a-Service, or Infrastructure-as-a-Service
- Acquire, cleanse, and ingest structured and unstructured data on the cloud
- Combine data from disparate sources into a single, unified, authoritative view of data (e.g., a Data Lake)
- Enable and support data movement from one system service to another
- Experience implementing or supporting automated solutions to technical problems
- Experience working in a team environment, proactively executing tasks while meeting agreed delivery timelines
- Ability to contribute to effective and timely solutions
- Excellent oral and written communication skills

Posted 2 weeks ago

Apply

10.0 - 14.0 years

25 - 30 Lacs

Pune

Work from Office

We are seeking a highly experienced Principal Solution Architect to lead the design, development, and implementation of sophisticated cloud-based data solutions for our key clients. The ideal candidate will possess deep technical expertise across multiple cloud platforms (AWS, Azure, GCP), data architecture paradigms, and modern data technologies. You will be instrumental in shaping data strategies, driving innovation through areas like GenAI and LLMs, and ensuring the successful delivery of complex data projects across various industries.

Key Responsibilities:
- Solution Design & Architecture: Lead the architecture and design of robust, scalable, and secure enterprise-grade data solutions, including data lakes, data warehouses, data mesh, and real-time data pipelines on AWS, Azure, and GCP.
- Client Engagement & Pre-Sales: Collaborate closely with clients to understand their business challenges, translate requirements into technical solutions, and present compelling data strategies. Support pre-sales activities, including proposal development and solution demonstrations.
- Data Strategy & Modernization: Drive data and analytics modernization initiatives, leveraging cloud-native services, Big Data technologies, GenAI, and LLMs to deliver transformative business value.
- Industry Expertise: Apply data architecture best practices across various industries (e.g., BFSI, Retail, Supply Chain, Manufacturing).

Required Qualifications & Skills:
- Experience: 10+ years of experience in IT, with a significant focus on data architecture, solution architecture, and data engineering. Proven experience in a principal-level or lead architect role.
- Cloud Expertise: Deep, hands-on experience with major cloud platforms. Azure: Microsoft Fabric, Data Lake, Power BI, Data Factory, Azure Purview; good understanding of Azure Service Foundry, Agentic AI, Copilot. GCP: BigQuery, Vertex AI, Gemini.
- Data Science Leadership: Understanding and experience in integrating AI/ML capabilities, including GenAI and LLMs, into data solutions.
- Leadership & Communication: Exceptional communication, presentation, and interpersonal skills. Proven ability to lead technical teams and manage client relationships.
- Problem-Solving: Strong analytical and problem-solving abilities with a strategic mindset.
- Education: Bachelor's or master's degree in Computer Science, Engineering, Information Technology, or a related field.

Preferred Qualifications:
- Relevant certifications in AWS, Azure, GCP, Snowflake, or Databricks
- Experience with Agentic AI and hyper-intelligent automation

Posted 2 weeks ago

Apply

4.0 - 8.0 years

18 - 32 Lacs

Chennai

Work from Office

Designation: Senior Software Engineer
Experience in Years: 4 to 7 years
Job Location: Chennai (Hybrid)

Role & Responsibilities:
- Develop, enhance, modify, and maintain applications in the Global Markets environment.
- Design, code, test, debug, and document programs, while also supporting activities for the corporate systems architecture.
- Partner with business teams to define requirements for system applications.
- Create clear and comprehensive technical specifications and documentation.
- Maintain in-depth knowledge of current development tools, languages, and frameworks.
- Supervise and mentor a small team of associates, providing coaching and performance management input.
- Stay up to date on new technology trends and research best practices to achieve optimal results.
- Perform additional technical duties as required.

Skills - Required:
- Languages: Python
- Frontend frameworks: Angular / React
- Experience with CI/CD tools and pipelines, e.g. Tekton, GitHub Actions, Cloud Build, etc.
- Infrastructure provisioning and maintenance through Terraform
- Cloud Run / CaaS
- BigQuery
- Management of Dev, QA, and Production environments
- Follow best practices and coding standards
- Maintain GitHub and JIRA
- Java, API microservices - good to have

Posted 2 weeks ago

Apply

2.0 - 6.0 years

0 Lacs

gwalior, madhya pradesh

On-site

As a Data Engineer at Synram Software Services Pvt. Ltd., a subsidiary of FG International GmbH, you will be an integral part of our team dedicated to providing innovative IT solutions in ERP systems, E-commerce platforms, Mobile Applications, and Digital Marketing. We are committed to delivering customized solutions that drive success across various industries. In this role, you will be responsible for designing, building, and maintaining scalable data pipelines and infrastructure. Working closely with data analysts, data scientists, and software engineers, you will facilitate data-driven decision-making throughout the organization. Your key responsibilities will include developing, testing, and maintaining data architectures, designing and implementing ETL processes, optimizing data systems, collaborating with cross-functional teams to understand data requirements, ensuring data quality, integrity, and security, automating repetitive data tasks, monitoring and troubleshooting production data pipelines, and documenting systems, processes, and best practices. To excel in this role, you should possess a Bachelor's/Master's degree in Computer Science, Information Technology, or a related field, along with at least 2 years of experience as a Data Engineer or in a similar role. Proficiency in SQL, Python, or Scala is essential, as well as experience with data pipeline tools like Apache Airflow and familiarity with big data tools such as Hadoop and Spark. Hands-on experience with cloud platforms like AWS, GCP, or Azure is preferred, along with knowledge of data warehouse solutions like Snowflake, Redshift, or BigQuery. Preferred qualifications include knowledge of CI/CD for data applications, experience with containerization tools like Docker and Kubernetes, and exposure to data governance and compliance standards. If you are ready to be part of a data-driven transformation journey, apply now to join our team at Synram Software Pvt Ltd. 
For inquiries, contact us at career@synram.co or +91-9111381555. Benefits of this full-time, permanent role include a flexible schedule, internet reimbursement, leave encashment, day shift with fixed hours and weekend availability, joining bonus, and performance bonus. The ability to commute/relocate to Gwalior, Madhya Pradesh, is preferred. Don't miss the opportunity to contribute your expertise to our dynamic team. The application deadline is 20/07/2025, and the expected start date is 12/07/2025. We look forward to welcoming you aboard for a rewarding and challenging career in data engineering.
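The core responsibility above, designing ETL processes that feed a warehouse such as Snowflake, Redshift, or BigQuery, can be sketched in plain Python. This is a minimal illustration only, using the standard library's sqlite3 as a stand-in for the warehouse; the table and column names are hypothetical, not from the posting:

```python
import csv
import io
import sqlite3

# Extract: parse raw CSV (an inline string standing in for a real source file)
raw = "order_id,amount\n1,250.0\n2,\n3,99.5\n"
rows = list(csv.DictReader(io.StringIO(raw)))

# Transform: apply a basic data-quality filter and cast types
clean = [
    {"order_id": int(r["order_id"]), "amount": float(r["amount"])}
    for r in rows
    if r["amount"]  # drop records with a missing amount
]

# Load: write into the target table (sqlite3 stands in for the warehouse)
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (order_id INTEGER, amount REAL)")
conn.executemany("INSERT INTO orders VALUES (:order_id, :amount)", clean)
total = conn.execute("SELECT COUNT(*), SUM(amount) FROM orders").fetchone()
print(total)  # (2, 349.5)
```

In a production pipeline each stage would typically become a task in an orchestrator like Apache Airflow, with the quality filter logged rather than silently dropping rows.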

Posted 2 weeks ago

Apply

2.0 - 6.0 years

0 Lacs

maharashtra

On-site

Are you eager to thrive in one of the most competitive online marketing environments for one of the world's leading e-commerce companies? As the Marketing Analytics Manager at Priceline, you'll be responsible for driving our best deals to customers by managing one or more key third-party distribution channels (Google, Bing, Kayak, TripAdvisor, or Trivago). You'll take ownership of channel strategy, develop innovative ideas, and execute them to completion. This role is eligible for our hybrid work model: two days in-office. Given the critical role Meta Search channels play in our business, this is a high-visibility position, with regular exposure to senior management. You'll have the opportunity to work with cutting-edge tools and methodologies in operations management, machine learning, and data analytics. What sets Priceline apart is the freedom to take a great idea and run with it. You'll have the chance to collaborate across teams including tech, product, and leadership to bring your ideas to life. We cultivate an agile, dynamic environment where anyone can drive meaningful change, as long as you bring the vision, energy, and determination to succeed. In this role you will get to: - Strategy: Define and implement growth strategies by leveraging deep data analysis and building strong relationships with channel partners to drive increased bookings and profitability. - Project Execution: Lead and deliver strategic improvements by collaborating with product, data, and marketing teams. - Experimentation: Design, implement, and analyze A/B testing to optimize performance and deploy channel improvements. - Communication: Present your vision and outcomes to key stakeholders, including senior marketing leaders and C-suite executives. Who you are: - Bachelor's degree or higher in a quantitative field (e.g., Mathematics, Economics) - 5+ years of experience in data analytics, demonstrating strong statistical analysis and numerical reasoning abilities.
- 2+ years of experience in digital marketing - Excellent interpersonal skills with the ability to simplify complex concepts and effectively communicate with senior management - A proactive, impact-driven mindset focused on customer success - Strong statistical knowledge and analytical skills: ability to apply statistical methods and techniques to analyze complex data sets, interpret results, and inform decision-making - Positive attitude towards change and adaptability in a fast-paced environment - A collaborative, team-player mentality with a willingness to take ownership of channel performance - Fluent in English (both verbal and written) Must have experience with both: Technical: R or Python, SQL, BigQuery, Oracle, Tableau, Excel. Non-Technical: ownership of business performance, proactive decision making, strategy forming and problem solving, management of partners and relationships. A demonstrated history of living the values central to Priceline: Customer, Innovation, Team, Accountability, and Trust. The Right Results, the Right Way is not just a motto at Priceline; it's a way of life. Unquestionable integrity and ethics are essential. #LI-AR1 If you want to be part of something truly special, check us out!
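The A/B testing responsibility above comes down to deciding whether the difference in conversion rate between two variants is statistically significant. A minimal two-proportion z-test in plain Python (standard library only; the counts below are made-up illustration data, not Priceline figures) might look like:

```python
import math

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test for a difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Variant A: 480 conversions from 10,000 sessions; variant B: 560 from 10,000
z, p = two_proportion_z(conv_a=480, n_a=10_000, conv_b=560, n_b=10_000)
print(p < 0.05)  # True: the lift is significant at the 5% level
```

In practice an analyst would also fix the sample size in advance and guard against peeking, but the significance calculation itself is this small.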

Posted 2 weeks ago

Apply

5.0 - 9.0 years

0 Lacs

haryana

On-site

The role of Senior Java Developer with Big Data based in Gurugram (onsite) is a full-time position that requires a highly skilled individual with expertise in Java development, particularly in Spring Boot and SQL. The primary responsibility of the ideal candidate will be to design, develop, and maintain robust backend systems. Additionally, experience with cloud platforms and big data technologies would be advantageous for this role. As a Senior Java Developer, you will be tasked with designing, developing, and maintaining backend services using Java and Spring Boot. Your role will involve writing efficient SQL queries and collaborating with cross-functional teams to deliver new features. Ensuring code quality through unit testing, code reviews, and troubleshooting production issues are also key aspects of this position. It is essential to document technical designs and processes for effective communication within the team. The required skills for this role include strong experience in Java development (version 8 or above), a solid understanding of Spring Boot, and proficiency in SQL and relational databases such as PostgreSQL, MySQL, or Oracle. Familiarity with RESTful API design and implementation is also necessary. Nice to have skills for this position include experience with cloud platforms like Google Cloud Platform (GCP), AWS, or Azure, exposure to Big Data technologies such as Hadoop, Spark, or BigQuery, familiarity with Adobe Workfront and Adobe Personalization Products, and an understanding of CI/CD pipelines and containerization using tools like Docker and Kubernetes. To qualify for this role, candidates should possess a Bachelor's degree in Computer Science, Engineering, or a related field along with at least 5 years of relevant development experience.,

Posted 2 weeks ago

Apply

5.0 - 9.0 years

0 Lacs

pune, maharashtra

On-site

As a technology services and solutions provider specializing in Data, AI, and Digital, Zimetrics is dedicated to assisting enterprises in harnessing the economic potential and business value of data from various sources. Our core principles of Integrity, Intellect, and Ingenuity influence our value system, engineering expertise, and organizational behavior, making us problem solvers and innovators who challenge conventional wisdom and believe in endless possibilities. You will be responsible for designing scalable and secure cloud-based data architecture solutions, leading data modeling, integration, and migration strategies across platforms, and engaging directly with clients to comprehend business needs and translate them into technical solutions. Additionally, you will support sales and pre-sales teams with solution architecture, technical presentations, and proposals, collaborate with cross-functional teams including engineering, BI, and product, and ensure adherence to best practices in data governance, security, and performance optimization. To excel in this role, you must possess strong experience with Cloud platforms such as AWS, Azure, or GCP, a deep understanding of Data Warehousing concepts and tools like Snowflake, Redshift, and BigQuery, proven expertise in data modeling encompassing conceptual, logical, and physical aspects, excellent communication and client engagement skills, and experience in pre-sales or solution consulting, which is considered a strong advantage. Furthermore, the ability to articulate complex technical concepts to non-technical stakeholders will be vital for success in this position.,

Posted 2 weeks ago

Apply

3.0 - 7.0 years

0 Lacs

chennai, tamil nadu

On-site

As a Data Engineer, you will be responsible for designing and developing robust ETL pipelines using Python, PySpark, and Google Cloud Platform (GCP) services. Your role will involve building and optimizing data models and queries in BigQuery for analytics and reporting purposes. You will also be responsible for ingesting, transforming, and loading structured and semi-structured data from various sources. Collaboration with data analysts, scientists, and business teams to comprehend data requirements will be a key aspect of your job. Ensuring data quality, integrity, and security across cloud-based data platforms is crucial. Monitoring and troubleshooting data workflows and performance issues will also be part of your responsibilities. Automation of data validation and transformation processes using scripting and orchestration tools will be an essential aspect of your role. You are required to have hands-on experience with Google Cloud Platform (GCP), especially BigQuery. Strong programming skills in Python and/or PySpark are necessary for this position. Your experience in designing and implementing ETL workflows and data pipelines will be valuable. Proficiency in SQL and data modeling for analytics is required. Familiarity with GCP services such as Cloud Storage, Dataflow, Pub/Sub, and Composer is preferred. Understanding data governance, security, and compliance in cloud environments is essential. Experience with version control tools like Git and agile development practices will be beneficial for this role. If you are looking for a challenging opportunity to work on cutting-edge data engineering projects, this position is ideal for you.,
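A recurring task in the role above is transforming semi-structured records into the flat column layout a BigQuery table expects. The sketch below shows the idea in plain Python rather than PySpark or Dataflow, and every field name in it is hypothetical:

```python
import json

# Semi-structured source records (an inline stand-in for a Pub/Sub or GCS feed)
raw_events = [
    '{"user": {"id": 1, "region": "IN"}, "event": "click", "ts": 1700000000}',
    '{"user": {"id": 2}, "event": "view", "ts": 1700000005}',
]

def flatten(record: dict) -> dict:
    """Flatten nested JSON into one row per event for a warehouse table."""
    return {
        "user_id": record["user"]["id"],
        "region": record["user"].get("region"),  # tolerate missing optional fields
        "event": record["event"],
        "ts": record["ts"],
    }

rows = [flatten(json.loads(line)) for line in raw_events]
print(rows[1]["region"])  # None: the missing optional field loads as NULL
```

The same shape of transform would typically run as a mapping step in a PySpark job or Dataflow pipeline before the load into BigQuery.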

Posted 2 weeks ago

Apply

5.0 - 9.0 years

0 Lacs

noida, uttar pradesh

On-site

As an Informatica IDMC Developer at Coforge, your primary responsibility will be to design, develop, and maintain robust ETL pipelines using Informatica Intelligent Data Management Cloud (IDMC/IICS). You will collaborate with data architects, analysts, and business stakeholders to gather and comprehend data requirements. Your role will involve integrating data from various sources including databases, APIs, and flat files, and optimizing data workflows for enhanced performance, scalability, and reliability. Monitoring and troubleshooting ETL jobs to address data quality issues will be a part of your daily tasks. Implementing data governance and security best practices will also be crucial, along with maintaining detailed documentation of data flows, transformations, and architecture. Your contribution to code reviews and continuous improvement initiatives will be valued. The ideal candidate for this position should possess strong hands-on experience with Informatica IDMC (IICS) and cloud-based ETL tools. Proficiency in SQL and prior experience working with relational databases like Oracle, SQL Server, and PostgreSQL is essential. Additionally, familiarity with cloud platforms such as AWS, Azure, or GCP, and knowledge of data warehousing concepts and tools like Snowflake, Redshift, or BigQuery are required. Excellent problem-solving skills and effective communication abilities are highly desirable qualities for this role. Preferred qualifications for this position include experience with CI/CD pipelines and version control systems, as well as knowledge of data modeling and metadata management. Holding certifications in Informatica or cloud platforms will be considered a plus. If you have 5-8 years of relevant experience and possess the mentioned skill set, we encourage you to apply for this position by sending your CV to Gaurav.2.Kumar@coforge.com. This position is based in Greater Noida.,

Posted 2 weeks ago

Apply

2.0 - 6.0 years

0 Lacs

karnataka

On-site

As a Full-Stack AI App Developer at EMO Energy, you will play a key role in reimagining urban mobility, energy, and fleet operations through our AI-driven super app. You will have the opportunity to take full ownership of building and deploying a cutting-edge energy infrastructure startup in India. Your responsibilities will include architecting and developing a full-stack AI-enabled application, designing modular frontend views using React.js or React Native, creating intelligent agent interfaces, building secure backend APIs for managing energy and fleet operations, integrating real-time data workflows, implementing fleet tracking dashboards, and optimizing performance across various platforms. Collaboration with the founding team, ops team, and hardware teams will be essential to iterate fast and solve real-world logistics problems. The ideal candidate for this role should have a strong command of front-end frameworks such as React.js, experience with back-end technologies like FastAPI, Node.js, or Django, proficiency in TypeScript or Python, familiarity with GCP services, Docker, GitHub Actions, and experience with mobile integrations and AI APIs. End-to-end ownership of previous applications, strong UI/UX product sensibility, and experience in building dashboards or internal tools will be valuable assets. Additionally, the ability to adapt to ambiguity, communicate technical decisions to non-engineers, and a passion for clean code and impactful work are crucial for success in this role. If you are a highly motivated individual with a passion for AI-driven applications and a desire to lead the development of a cutting-edge fleet/energy platform, then this role at EMO Energy is the perfect opportunity for you. Join us in revolutionizing the future of urban mobility and energy infrastructure in India.,

Posted 2 weeks ago

Apply

2.0 - 6.0 years

0 Lacs

maharashtra

On-site

As a Data Governance and Management Developer at Assent, you will play a crucial role in ensuring the quality and reliability of critical data across systems and domains. Your responsibilities will include defining and implementing data quality standards, developing monitoring pipelines to detect data issues, conducting data profiling assessments, and designing data quality dashboards. You will collaborate with cross-functional teams to resolve data anomalies and drive continuous improvement in data quality. Key Requirements & Responsibilities: - Define and implement data quality rules, validation checks, and metrics for critical business domains. - Develop Data Quality (DQ) monitoring pipelines and alerts to proactively detect data issues. - Conduct regular data profiling and quality assessments to identify gaps, inconsistencies, duplicates, and anomalies. - Design and maintain data quality dashboards and reports for visibility into trends and issues. - Utilize generative AI to automate workflows, enhance data quality, and support responsible prompt usage. - Collaborate with data owners, stewards, and technical teams to resolve data quality issues. - Develop and document standard operating procedures (SOPs) for issue management and escalation workflows. - Support root cause analysis (RCA) for recurring or high-impact data quality problems. - Define and monitor key data quality KPIs and drive continuous improvement through insights and analysis. - Evaluate and recommend data quality tools that scale with the enterprise. - Provide recommendations for enhancing data processes, governance practices, and quality standards. - Ensure compliance with internal data governance policies, privacy standards, and audit requirements. - Adhere to corporate security policies and procedures set by Assent. Qualifications: - 2-5 years of experience in a data quality, data analyst, or similar role. - Degree in Computer Science, Information Systems, Data Science, or related field. 
- Strong understanding of data quality principles. - Proficiency in SQL, GitHub, R, Python, SQL Server, and BI tools like Tableau, Power BI, or Sigma. - Experience with cloud data platforms (e.g., Snowflake, BigQuery) and data transformation tools (e.g., dbt). - Exposure to graph databases and GenAI tools. - Ability to interpret dashboards and communicate data quality findings effectively. - Understanding of data governance frameworks and regulatory considerations. - Strong problem-solving skills, attention to detail, and familiarity with agile work environments. - Excellent verbal and written communication skills. Join Assent and be part of a dynamic team that values wellness, financial benefits, lifelong learning, and diversity, equity, and inclusion. Make a difference in supply chain sustainability and contribute to meaningful work that impacts the world. Contact talent@assent.com for assistance or accommodation during the interview process.
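The "data quality rules, validation checks, and metrics" described above typically reduce to small, composable checks such as null-rate and uniqueness rules. A minimal plain-Python sketch (the records, columns, and 30% threshold are illustrative assumptions, not Assent's actual rules):

```python
from collections import Counter

def null_rate(rows, column):
    """Fraction of records where the column is missing or None."""
    missing = sum(1 for r in rows if r.get(column) is None)
    return missing / len(rows)

def duplicate_keys(rows, key):
    """Key values that occur more than once (a uniqueness-rule violation)."""
    counts = Counter(r[key] for r in rows)
    return [k for k, n in counts.items() if n > 1]

records = [
    {"id": 1, "email": "a@x.com"},
    {"id": 2, "email": None},
    {"id": 2, "email": "b@x.com"},
    {"id": 3, "email": "c@x.com"},
]

# Example rules: email null rate must stay under 30%, id must be unique
violations = []
if null_rate(records, "email") > 0.30:
    violations.append("email null rate too high")
if duplicate_keys(records, "id"):
    violations.append("duplicate ids: %s" % duplicate_keys(records, "id"))
print(violations)  # ['duplicate ids: [2]']
```

At enterprise scale the same rules would run as SQL or dbt tests against the warehouse, with failures feeding the monitoring dashboards and escalation workflows the posting mentions.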

Posted 2 weeks ago

Apply

0.0 - 3.0 years

0 Lacs

karnataka

On-site

You should have 6 months to 3 years of IT experience. You must have knowledge of BigQuery, SQL, or similar tools. It is essential to be aware of ETL and data warehouse concepts. Your oral and written communication skills should be good. Being a great team player who can work efficiently with minimal supervision is crucial. You should also have good knowledge of Java or Python to conduct data cleansing. Preferred qualifications include good communication and problem-solving skills. Experience with Spring Boot would be an added advantage. Experience as an Apache Beam developer with Google Cloud Bigtable and Google BigQuery is desirable, as is experience with Google Cloud Platform (GCP). Skills in writing batch and stream processing jobs using the Apache Beam framework (Dataflow) are a plus. Knowledge of microservices, Pub/Sub, Cloud Run, and Cloud Functions would be beneficial.
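The batch and stream processing skills mentioned above center on windowed aggregation, the core idea behind a Beam/Dataflow streaming job. The sketch below illustrates only the concept in plain Python, not the Beam API itself, and the events are made up:

```python
from collections import defaultdict

# (timestamp_seconds, event_type) pairs, as a stand-in for a Pub/Sub stream
events = [(1, "click"), (4, "click"), (65, "view"), (66, "click")]
WINDOW = 60  # one-minute fixed windows, as in Beam's FixedWindows

# Assign each event to the window containing its timestamp, then count per window
counts = defaultdict(int)
for ts, _kind in events:
    window_start = (ts // WINDOW) * WINDOW
    counts[window_start] += 1

print(dict(counts))  # {0: 2, 60: 2}
```

In actual Beam code the same logic would be a `WindowInto(FixedWindows(60))` transform followed by a count aggregation, with the runner handling late data and watermarks.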

Posted 2 weeks ago

Apply