
1356 BigQuery Jobs - Page 7

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

4.0 - 9.0 years

5 - 14 Lacs

Pune, Chennai, Bengaluru

Work from Office

Dear Candidate,

This is with reference to your profile on the job portal. Deloitte India Consulting has an immediate requirement for the following role.

Job Summary: We are looking for a skilled GCP Data Engineer to design, build, and maintain scalable data pipelines and solutions on Google Cloud Platform. The ideal candidate will have hands-on experience with GCP services, data warehousing, ETL processes, and big data technologies.

Key Responsibilities:
- Design and implement scalable data pipelines using Cloud Dataflow, Apache Beam, and Cloud Composer (see the sketch below).
- Develop and maintain data models and data marts in BigQuery.
- Build ETL/ELT workflows to ingest, transform, and load data from various sources.
- Optimize data storage and query performance in BigQuery and other GCP services.
- Collaborate with data scientists, analysts, and business stakeholders to understand data requirements.
- Ensure data quality, integrity, and security across all data solutions.
- Monitor and troubleshoot data pipeline issues and implement improvements.

Required Skills & Qualifications:
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
- 3+ years of experience in data engineering, with at least 1-2 years on Google Cloud Platform.
- Proficiency in SQL, Python, and Apache Beam.
- Hands-on experience with GCP services such as BigQuery, Cloud Storage, Cloud Pub/Sub, Cloud Dataflow, and Cloud Composer.
- Experience with data modeling, data warehousing, and ETL/ELT processes.
- Familiarity with CI/CD pipelines, Terraform, and Git.
- Strong problem-solving and communication skills.

Nice to Have: GCP certifications (e.g., Professional Data Engineer).

In case you are interested, please share your updated resume to smouni@deloitte.com along with the following mandatory details: Candidate Name, Mobile No., Email ID, Skill, Total Experience, Education Details, Current Location, Requested Location, Current Firm, Current CTC, Expected CTC, Notice Period/LWD, Feedback.
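For illustration only, a minimal sketch of the kind of Dataflow pipeline this role describes: an Apache Beam (Python) job that reads CSV events from Cloud Storage and loads them into BigQuery. The bucket, project, dataset, and schema names are hypothetical placeholders, not details from the posting.

```python
# Minimal illustrative Beam pipeline: GCS CSV -> parse -> BigQuery.
# All resource names below are hypothetical placeholders.
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def parse_csv(line: str) -> dict:
    # Turn one raw CSV line into a BigQuery-ready row dict.
    user_id, event, ts = line.split(",")
    return {"user_id": user_id, "event": event, "event_ts": ts}


options = PipelineOptions(
    runner="DirectRunner",  # swap for DataflowRunner (plus project/region/temp_location) on GCP
)

with beam.Pipeline(options=options) as p:
    (
        p
        | "Read" >> beam.io.ReadFromText("gs://example-bucket/events/*.csv")
        | "Parse" >> beam.Map(parse_csv)
        | "Load" >> beam.io.WriteToBigQuery(
            "example-project:analytics.events",
            schema="user_id:STRING,event:STRING,event_ts:TIMESTAMP",
            create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
        )
    )
```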

Posted 1 week ago

3.0 - 5.0 years

0 - 3 Lacs

Hyderabad, Pune, Chennai

Work from Office

Role & responsibilities:
- 3-5 years of experience in digital/web analytics, preferably in high-traffic environments.
- Proficiency with BigQuery and advanced SQL for data querying and transformation.
- Hands-on experience with Google Cloud Platform (GCP) tools and services.
- Strong understanding of digital metrics, user journeys, attribution models, and funnel analysis.
- Ability to collaborate with cross-functional stakeholders and translate data into business recommendations.

Posted 1 week ago

5.0 - 10.0 years

8 - 13 Lacs

Gurugram

Work from Office

The Marketing Operations Analyst II is responsible for lead and pipeline reporting, ensuring accurate data tracking, performance measurement, and insights to drive marketing and sales alignment. This role focuses on lead management, pipeline analysis, and revenue reporting, while also managing Google Analytics, Google Ads, and other digital marketing performance metrics. The ideal candidate has strong analytical skills, experience with BigQuery, and expertise in data visualization, attribution modeling, and marketing analytics.

Key Responsibilities:

Lead & Pipeline Reporting
- Manage and maintain lead flow tracking and reporting from marketing campaigns into HubSpot and CRM (Salesforce preferred), ensuring data accuracy and integrity.
- Track, analyze, and report on pipeline performance, lead conversion rates, and marketing-influenced revenue to measure campaign impact (a query sketch follows below).
- Build and maintain dashboards in CRM, Looker Studio, Tableau, and Power BI to visualize lead lifecycle stages, funnel progression, and sales handoff efficiency.
- Develop and refine marketing attribution models to assess the effectiveness of marketing channels and campaigns in pipeline generation.
- Work closely with Sales Operations and Demand Generation teams to ensure seamless lead handoff and alignment on pipeline metrics.

Google & Digital Marketing Reporting
- Monitor and analyze Google Analytics and Google Ads performance, identifying trends and opportunities for optimization.
- Track digital campaign performance across paid media, SEO, and website engagement, providing recommendations for improvement.
- Implement and maintain Google Tag Manager configurations to optimize conversion tracking and event measurement.
- Conduct A/B testing and conversion rate analysis to optimize digital campaigns and landing page effectiveness.
- Ensure marketing data is structured correctly in BigQuery for advanced analytics and reporting.

Marketing Data Management & Compliance
- Maintain data hygiene, segmentation, and enrichment to improve targeting and reporting accuracy.
- Ensure compliance with data privacy regulations (GDPR, CCPA) in all lead tracking and reporting activities.
- Collaborate with marketing and IT teams to improve data governance and system integrations.

Qualifications & Experience
- 5+ years of experience in marketing analytics, lead operations, or data analysis.
- Strong expertise in lead management, pipeline tracking, and revenue reporting.
- Proficiency in BigQuery for querying and analyzing large-scale marketing and sales data.
- Experience with data visualization tools (Google Looker Studio, Tableau, Power BI).
- Google Analytics and Google Ads experience, with a strong understanding of paid search, SEO, and digital reporting.
- Knowledge of marketing attribution modeling, multi-touch attribution, and conversion tracking.
- Familiarity with CRM systems (Salesforce preferred) and HubSpot reporting.
- Ability to interpret complex data sets and present insights to marketing, sales, and leadership teams.

Preferred Skills:
- Experience with SQL, Python, or R for advanced data analysis and reporting.
- Familiarity with Google Tag Manager for event tracking and campaign measurement.
- Certifications in HubSpot, Google Analytics, or BigQuery are a plus.
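For illustration only, a hedged sketch of the kind of BigQuery lead-funnel reporting this role covers, run through the google-cloud-bigquery Python client. The project, dataset, table, and column names are hypothetical placeholders, not a real CRM schema.

```python
# Illustrative lead-funnel query via the google-cloud-bigquery client.
# Dataset, table, and column names are hypothetical placeholders.
from google.cloud import bigquery

client = bigquery.Client(project="example-marketing-project")

QUERY = """
SELECT
  channel,
  COUNT(*) AS leads,
  COUNTIF(stage = 'opportunity') AS opportunities,
  SAFE_DIVIDE(COUNTIF(stage = 'closed_won'), COUNT(*)) AS win_rate
FROM `example-marketing-project.crm.leads`
WHERE created_at >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 90 DAY)
GROUP BY channel
ORDER BY leads DESC
"""

for row in client.query(QUERY).result():
    print(f"{row.channel}: {row.leads} leads, win rate {row.win_rate:.1%}")
```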

Posted 1 week ago

12.0 - 20.0 years

25 - 40 Lacs

Kolkata, Hyderabad, Pune

Work from Office

GCP Data Architect

Posted 1 week ago

5.0 - 10.0 years

12 - 22 Lacs

Kolkata, Hyderabad, Pune

Work from Office

GCP Engineer, Lead GCP Engineer

Posted 1 week ago

3.0 - 8.0 years

12 - 24 Lacs

Hyderabad

Work from Office

SQL/PL-SQL, Teradata or BigQuery (mandatory), dynamic SQL, cloud database migration, SQL code conversion, debugging, production deployment, scheduler handling (Composer DAGs/AutoSys), issue resolution, and end-to-end migration support.
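For illustration only, a minimal sketch of the scheduler handling this listing mentions: a Cloud Composer (Airflow) DAG that runs one converted BigQuery step nightly. The DAG ID, project, and stored-procedure name are hypothetical placeholders.

```python
# Illustrative Cloud Composer (Airflow) DAG scheduling one converted BigQuery step.
# DAG ID, project, and routine names are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

with DAG(
    dag_id="teradata_to_bq_nightly",
    start_date=datetime(2025, 1, 1),
    schedule_interval="0 2 * * *",  # nightly at 02:00
    catchup=False,
) as dag:
    load_daily_sales = BigQueryInsertJobOperator(
        task_id="load_daily_sales",
        configuration={
            "query": {
                # A stored procedure produced by the Teradata-to-BigQuery SQL conversion.
                "query": "CALL `example-project.migration.load_daily_sales`()",
                "useLegacySql": False,
            }
        },
    )
```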

Posted 1 week ago

5.0 - 8.0 years

15 - 25 Lacs

Pune, Bengaluru, Mumbai (All Areas)

Hybrid

Overview of 66degrees

66degrees is a leading consulting and professional services company specializing in developing AI-focused, data-led solutions leveraging the latest advancements in cloud technology. With our unmatched engineering capabilities and vast industry experience, we help the world's leading brands transform their business challenges into opportunities and shape the future of work. At 66degrees, we believe in embracing the challenge and winning together. These values not only guide us in achieving our goals as a company but also for our people. We are dedicated to creating a significant impact for our employees by fostering a culture that sparks innovation and supports professional and personal growth along the way.

Overview of Role

We are looking for a highly motivated and experienced Data Analytics Engineer to join our team. As a Data Analytics Engineer, you will be responsible for building and implementing robust data analytics solutions using Microsoft Power BI. You will collaborate with cross-functional teams to gather requirements, design scalable architectures, and deliver high-quality solutions that meet business needs. A key aspect of this role involves ensuring data literacy within the organization, empowering users to understand and validate data effectively.

Responsibilities
- Work with clients to enable them on the Power BI platform, teaching them how to construct an analytics ecosystem from the ground up.
- Advise clients on how to develop their analytics centers of excellence, defining and designing processes to promote a scalable, governed analytics ecosystem.
- Utilize Microsoft Power BI to design and develop interactive and visually appealing dashboards and reports for end users.
- Write clean, efficient, and scalable DAX and M code (Power Query).
- Conduct performance tuning and optimization of data analytics solutions to ensure efficient processing and query performance.
- Stay up to date with the latest trends and best practices in cloud data analytics, big data technologies, and data visualization tools.
- Collaborate with other teams to ensure seamless integration of data analytics solutions with existing systems and processes.
- Provide technical guidance and mentorship to junior team members, sharing knowledge and promoting best practices.
- Promote data literacy and data validation best practices across teams.

Qualifications
- 5+ years in Power BI designing interactive dashboards and reports.
- Previous experience with the Looker platform is a plus.
- Experience working with Google BigQuery and the Google Cloud Platform is a plus.
- Comprehension of Power BI's security capabilities and supporting multiple personas in the platform.
- Strong problem-solving skills and the ability to translate business requirements into technical solutions.
- Excellent communication and collaboration skills, with the ability to work effectively in a cross-functional team environment.
- Demonstrated experience in promoting data literacy and enabling users to understand and validate data.
- An understanding of conversational analytics and experience working with Power BI Copilot is a plus.
- Microsoft Power BI certifications are a plus.
- Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.

66degrees is an Equal Opportunity employer.
All qualified applicants will receive consideration for employment without regard to actual or perceived race, color, religion, sex, gender, gender identity, national origin, age, weight, height, marital status, sexual orientation, veteran status, disability status or other legally protected class.

Posted 1 week ago

12.0 - 17.0 years

27 - 35 Lacs

Madurai, Chennai

Hybrid

Dear Candidate,

Greetings of the day! I am Kantha, and I'm reaching out to you regarding an exciting opportunity with TechMango. You can connect with me on LinkedIn (https://www.linkedin.com/in/kantha-m-ashwin-186ba3244/) or by email: kanthasanmugam.m@techmango.net

Techmango Technology Services is a full-scale software development services company founded in 2014 with a strong focus on emerging technologies, with the primary objective of delivering strategic solutions aligned with the goals of its business partners. We are a leading full-scale software and mobile app development company, driven by the mantra "Client's Vision is our Mission," and we stay true to it. Our aim is to be a technologically advanced and most-loved organization providing high-quality, cost-efficient services with a long-term client relationship strategy. We are operational in the USA (Chicago, Atlanta), Dubai (UAE), and India (Bangalore, Chennai, Madurai, Trichy).

Job Title: Technical GCP Data Architect/Lead
Location: Madurai
Experience: 12+ years
Notice Period: Immediate

Job Summary

We are seeking a hands-on Technical GCP Data Architect/Lead with deep expertise in real-time streaming data architectures to help design, build, and optimize data pipelines in our Google Cloud Platform (GCP) environment. The ideal candidate will have strong architectural vision and be comfortable rolling up their sleeves to build scalable, low-latency streaming data pipelines using Pub/Sub, Dataflow (Apache Beam), and BigQuery.

Key Responsibilities
- Architect and implement end-to-end streaming data solutions on GCP using Pub/Sub, Dataflow, and BigQuery.
- Design real-time ingestion, enrichment, and transformation pipelines for high-volume event data.
- Work closely with stakeholders to understand data requirements and translate them into scalable designs.
- Optimize streaming pipeline performance, latency, and throughput.
- Build and manage orchestration workflows using Cloud Composer (Airflow).
- Drive schema design, partitioning, and clustering strategies in BigQuery for both real-time and batch datasets (see the sketch below).
- Define SLAs, monitoring, logging, and alerting for streaming jobs using Cloud Monitoring, Error Reporting, and Stackdriver.
- Apply sound data modeling practices.
- Ensure robust security, encryption, and access controls across all data layers.
- Collaborate with DevOps on CI/CD automation of data workflows using Terraform, Cloud Build, and Git.
- Document streaming architecture, data lineage, and deployment runbooks.

Required Skills & Experience
- 10+ years of experience in data engineering or architecture.
- 3+ years of hands-on GCP data engineering experience.
- Strong expertise in Google Pub/Sub, Dataflow (Apache Beam), BigQuery (including streaming inserts), Cloud Composer (Airflow), and Cloud Storage (GCS).
- Solid understanding of streaming design patterns, exactly-once delivery, and event-driven architecture.
- Deep knowledge of SQL and NoSQL data modeling.
- Hands-on experience with monitoring and performance tuning of streaming jobs.
- Experience using Terraform or an equivalent infrastructure-as-code tool.
- Familiarity with CI/CD pipelines for data workflows.
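For illustration only, a minimal sketch of the partitioning and clustering work this role calls for, issued as BigQuery DDL through the Python client. The project, dataset, and column names are hypothetical placeholders. Partitioning on the event timestamp prunes scans to the days actually queried, while clustering co-locates rows for the most common filter columns.

```python
# Illustrative partitioned + clustered table DDL via the BigQuery Python client.
# Project, dataset, and column names are hypothetical placeholders.
from google.cloud import bigquery

client = bigquery.Client(project="example-project")

DDL = """
CREATE TABLE IF NOT EXISTS `example-project.streaming.events`
(
  event_id   STRING,
  user_id    STRING,
  event_type STRING,
  event_ts   TIMESTAMP
)
PARTITION BY DATE(event_ts)      -- prune scans to the days actually queried
CLUSTER BY user_id, event_type   -- co-locate rows for the most common filters
OPTIONS (partition_expiration_days = 90)
"""

client.query(DDL).result()  # blocks until the DDL job completes
```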

Posted 1 week ago

4.0 - 8.0 years

20 - 35 Lacs

Pune, Gurugram, Bengaluru

Hybrid

Salary: 20 to 35 LPA
Experience: 3 to 7 years
Location: Gurgaon/Pune/Bengaluru
Notice: Immediate to 30 days

Job Profile: Experienced Data Engineer with a strong foundation in designing, building, and maintaining scalable data pipelines and architectures. Skilled in transforming raw data into clean, structured formats for analytics and business intelligence. Proficient in modern data tools and technologies such as SQL, T-SQL, Python, Databricks, and cloud platforms (Azure). Adept at data wrangling, modeling, ETL/ELT development, and ensuring data quality, integrity, and security. A collaborative team player with a track record of enabling data-driven decision-making across business units.

As a Data Engineer, you will work on assignments for one of our Utilities clients. Collaborating with cross-functional teams and stakeholders involves gathering data requirements, aligning business goals, and translating them into scalable data solutions. The role includes working closely with data analysts, scientists, and business users to understand needs, designing robust data pipelines, and ensuring data is accessible, reliable, and well documented. Regular communication, iterative feedback, and joint problem-solving are key to delivering high-impact, data-driven outcomes that support organizational objectives. This position requires a proven track record of transforming processes and driving customer value and cost savings, with experience running end-to-end analytics for large-scale organizations.

Responsibilities
- Design, build, and maintain scalable data pipelines to support analytics, reporting, and advanced modeling needs.
- Collaborate with consultants, analysts, and clients to understand data requirements and translate them into effective data solutions.
- Ensure data accuracy, quality, and integrity through validation, cleansing, and transformation processes.
- Develop and optimize data models, ETL workflows, and database architectures across cloud and on-premises environments.
- Support data-driven decision-making by delivering reliable, well-structured datasets and enabling self-service analytics.
- Integrate seamlessly with cloud platforms (Azure), making it easy to build and deploy end-to-end data pipelines in the cloud.
- Use scalable Databricks clusters for handling large datasets and complex computations, optimizing performance and cost.

Must Have
- Client engagement experience and collaboration with cross-functional teams.
- Data engineering background in Databricks.
- Capable of working effectively as an individual contributor or in collaborative team environments.
- Effective communication and thought leadership with a proven record.

Candidate Profile
- Bachelor's/Master's degree in economics, mathematics, computer science/engineering, operations research, or related analytics areas.
- 3+ years of experience, which must be in data engineering.
- Hands-on experience with SQL, Python, Databricks, and cloud platforms such as Azure.
- Prior experience managing and delivering end-to-end projects.
- Outstanding written and verbal communication skills.
- Able to work in a fast-paced, continuously evolving environment and ready to take on uphill challenges.
- Able to understand cross-cultural differences and work with clients across the globe.

Posted 1 week ago

12.0 - 20.0 years

35 - 60 Lacs

Bengaluru

Work from Office

Who We Are

At Kyndryl, we design, build, manage and modernize the mission-critical technology systems that the world depends on every day. So why work at Kyndryl? We are always moving forward – always pushing ourselves to go further in our efforts to build a more equitable, inclusive world for our employees, our customers and our communities.

The Role

Join the innovative team at Kyndryl as a Client Technical Solutioner and unlock your potential to shape the future of technology solutions. As a key player in our organization, you will embark on an exciting journey where you get to work closely with customers, understand their unique challenges, and provide them with cutting-edge technical solutions and services. Picture yourself as a trusted advisor – collaborating directly with customers to unravel their business needs, pain points, and technical requirements. Your expertise and deep understanding of our solutions will empower you to craft tailored solutions that address their specific challenges and drive their success.

Your role as a Client Technical Solutioner is pivotal in developing domain-specific solutions for our cutting-edge services and offerings. You will be at the forefront of crafting tailored domain solutions and cost cases for both simple and complex, long-term opportunities, demonstrating we meet our customers' requirements while helping them overcome their business challenges. At Kyndryl, we believe in the power of collaboration, and your expertise will be essential in supporting our Technical Solutioning and Solutioning Managers during customer technology and business discussions, even at the highest levels of Business/IT Director/LOB. You will have the chance to demonstrate the value of our solutions and products, effectively communicating their business and technical benefits to decision makers and customers.

In this role, you will thrive as you create innovative technical solutions that align with industry trends and exceed customer expectations. Your ability to collaborate seamlessly with internal stakeholders will enable you to gather the necessary documents and technical insights to deliver compelling bid submissions. Not only will you define winning cost models for deals, but you will also lead these deals to profitability, ensuring the ultimate success of both our customers and Kyndryl. You will play an essential role in contract negotiations, up to the point of signature, and facilitate a smooth engagement hand-over process.

As the primary source of engagement management and solution design within your technical domain, you will compile, refine, and take ownership of final solution documents. Your technical expertise will shine through as you present these documents in a professional and concise manner, showcasing your mastery of the subject matter. You'll have the opportunity to contribute to the growth and success of Kyndryl by standardizing our go-to-market pitches across various industries. By creating differentiated propositions that align with market requirements, you will position Kyndryl as a leader in the industry, opening new avenues of success for our customers and our organization.

Join us as a Client Technical Solutioner at Kyndryl and unleash your potential to shape the future of technical solutions while enjoying a stimulating and rewarding career journey filled with innovation, collaboration, and growth.

Your Future at Kyndryl

Every position at Kyndryl offers a way forward to grow your career.
We have opportunities that you won't find anywhere else, including hands-on experience, learning opportunities, and the chance to certify in all four major platforms. Whether you want to broaden your knowledge base or narrow your scope and specialize in a specific sector, you can find your opportunity here.

Who You Are

You're good at what you do and possess the required experience to prove it. However, equally as important, you have a growth mindset: keen to drive your own personal and professional development. You are customer-focused, someone who prioritizes customer success in their work. And finally, you're open and borderless, naturally inclusive in how you work with others.

Required Skills and Experience
- 10-15 years of experience (Specialist Seller / Consultant) is a must, with 3-4 years of relevant experience in data.
- Hands-on experience with data platforms (DWH/data lake) such as Cloudera, Databricks, MS Data Fabric, Teradata, Apache Hadoop, BigQuery, AWS big data solutions (EMR, Redshift, Kinesis), Qlik, etc.
- Proven experience modernizing legacy data and application estates and transforming them to cloud architectures.
- Strong understanding of data modelling and database design.
- Expertise in data integration and ETL processes.
- Knowledge of data warehousing and business intelligence concepts.
- Experience with data governance and data quality management.
- Good domain experience in the BFSI or manufacturing area.
- Experience with cloud-based data platforms (e.g., AWS, Azure, GCP).
- Strong understanding of data integration techniques, including ETL (Extract, Transform, Load) processes, data pipelines, and data streaming using Python, Kafka for streams, PySpark, DBT, and ETL services.
- Understanding of and experience with data security principles: data masking, encryption, etc.
- Knowledge of data governance principles and practices, including data quality, data lineage, data privacy, and compliance.
- Knowledge of systems development, including the system development life cycle, project management approaches, and requirements, design, and testing techniques.
- Excellent communication skills to engage with clients and influence decisions.
- High level of competence in preparing architectural documentation and presentations.
- Organized, self-sufficient, and able to manage multiple initiatives simultaneously.
- Ability to coordinate with other teams and vendors independently.
- Deep knowledge of services offerings and technical solutions in a practice.
- Demonstrated experience translating distinctive technical knowledge into actionable customer insights and solutions.
- Prior consultative selling experience.
- Externally recognized as an expert in the technology and/or solutioning areas, including technical certifications supporting subdomain focus areas.

Responsibilities
- Prospect and qualify leads; conduct the relevant product/market research independently in response to customer requirements and pain points.
- Advise and shape client requirements to produce high-level designs and technical solutions in response to opportunities and requirements from customers and partners.
- Work with internal and external stakeholders to identify business requirements and develop solutions to meet those requirements / build the opportunity.
- Understand and analyze application requirements in client RFPs.
- Design software applications based on the requirements, within specified architectural guidelines and constraints.
- Lead, design, and implement proofs of concept and pilots to demonstrate the solution to clients and prospects.
Being You

Diversity is a whole lot more than what we look like or where we come from; it's how we think and who we are. We welcome people of all cultures, backgrounds, and experiences. But we're not doing it single-handedly: our Kyndryl Inclusion Networks are only one of many ways we create a workplace where all Kyndryls can find and provide support and advice. This dedication to welcoming everyone into our company means that Kyndryl gives you – and everyone next to you – the ability to bring your whole self to work, individually and collectively, and support the activation of our equitable culture. That's the Kyndryl Way.

What You Can Expect

With state-of-the-art resources and Fortune 100 clients, every day is an opportunity to innovate, build new capabilities, new relationships, new processes, and new value. Kyndryl cares about your well-being and prides itself on offering benefits that give you choice, reflect the diversity of our employees and support you and your family through the moments that matter – wherever you are in your life journey. Our employee learning programs give you access to the best learning in the industry to receive certifications, including Microsoft, Google, Amazon, Skillsoft, and many more. Through our company-wide volunteering and giving platform, you can donate, start fundraisers, volunteer, and search over 2 million non-profit organizations. At Kyndryl, we invest heavily in you; we want you to succeed so that together, we will all succeed.

Get Referred!

If you know someone that works at Kyndryl, when asked "How Did You Hear About Us" during the application process, select "Employee Referral" and enter your contact's Kyndryl email address.

Posted 1 week ago

10.0 - 12.0 years

9 - 10 Lacs

Hyderabad, Chennai, Bengaluru

Hybrid

GCP certified, with design and architecture experience. Key skills: GCP services (Compute Engine, GKE, Cloud SQL, BigQuery, etc.); containers and orchestration; Terraform, Cloud Build, Cloud Functions, or other GCP-native tools; IAM, VPCs, firewall rules, service accounts, and Cloud Identity; Grafana or any other monitoring tool.

Posted 1 week ago

3.0 - 10.0 years

0 Lacs

Pune, Maharashtra

On-site

As a Senior Google Cloud Architect in Pune (Hybrid) with over 10 years of experience, including 3+ years specifically on GCP, you will play a crucial role in leading the design and delivery of comprehensive cloud solutions on Google Cloud Platform. You will collaborate with data engineering, DevOps, and architecture teams to create scalable, secure, and cost-effective cloud platforms.

Your key responsibilities will include designing scalable data and application architectures utilizing tools such as BigQuery, Dataflow, Composer, Cloud Run, Pub/Sub, and other related GCP services. You will lead cloud migration, modernization, and CI/CD automation through technologies like Terraform, Jenkins, GitHub, and Cloud Build. Additionally, you will be responsible for implementing real-time and batch data pipelines, building chatbot applications using LLMs (Gemini, Claude), and automating reconciliation and monitoring processes. Your role will also involve collaborating closely with stakeholders to ensure technical solutions align with business objectives.

The ideal candidate for this role should have a minimum of 3 years of experience working with GCP and strong proficiency in key tools such as BigQuery, Dataflow, Cloud Run, Airflow, GKE, and Cloud Functions. Hands-on experience with Terraform, Kubernetes, Jenkins, GitHub, and cloud-native CI/CD is essential, along with a solid understanding of DevSecOps practices, networking, and data architecture concepts like Data Lake, Lakehouse, and Mesh. Proficiency in Python, SQL, and ETL frameworks such as Ab Initio is also required.

Preferred qualifications for this role include GCP certifications (Cloud Architect, DevOps, ML Engineer), experience with Azure or hybrid environments, and domain expertise in sectors like Banking, Telecom, or Retail.

Posted 1 week ago

5.0 - 9.0 years

0 Lacs

Thiruvananthapuram, Kerala

On-site

You will be part of a dynamic team at Equifax, where we are seeking creative, high-energy, and driven software engineers with hands-on development skills to contribute to various significant projects. As a software engineer at Equifax, you will have the opportunity to work with cutting-edge technology alongside a talented group of engineers. This role is perfect for you if you are a forward-thinking, committed, and enthusiastic individual who is passionate about technology.

Your responsibilities will include designing, developing, and operating high-scale applications across the entire engineering stack. You will be involved in all aspects of software development, from design and testing to deployment, maintenance, and continuous improvement. By utilizing modern software development practices such as serverless computing, microservices architecture, CI/CD, and infrastructure-as-code, you will contribute to the integration of our systems with existing internal systems and tools. Additionally, you will participate in technology roadmap discussions and architecture planning to translate business requirements and vision into actionable solutions.

Working within a closely-knit, globally distributed engineering team, you will be responsible for triaging product or system issues and resolving them efficiently to ensure the smooth operation and quality of our services. Managing project priorities, deadlines, and deliverables will be a key part of your role, along with researching, creating, and enhancing software applications to advance Equifax Solutions.

To excel in this position, you should have a Bachelor's degree or equivalent experience, along with at least 7 years of software engineering experience. Proficiency in mainstream Java, SpringBoot, TypeScript/JavaScript, as well as hands-on experience with Cloud technologies such as GCP, AWS, or Azure, is essential. You should also have a solid background in designing and developing cloud-native solutions and microservices using Java, SpringBoot, GCP SDKs, and GKE/Kubernetes.

Experience in deploying and releasing software using Jenkins CI/CD pipelines, infrastructure-as-code concepts, Helm Charts, and Terraform constructs is highly valued. Moreover, being a self-starter who can adapt to changing priorities with minimal supervision could set you apart in this role. Additional advantageous skills include designing big data processing solutions, UI development, backend technologies like JAVA/J2EE and SpringBoot, source code control management systems, build tools, working in Agile environments, relational databases, and automated testing.

If you are ready to take on this exciting opportunity and contribute to Equifax's innovative projects, apply now and be part of our team of forward-thinking software engineers.

Posted 1 week ago

5.0 - 9.0 years

0 Lacs

Noida, Uttar Pradesh

On-site

Apply Digital is a global digital transformation partner specializing in Business Transformation Strategy, Product Design & Development, Commerce, Platform Engineering, Data Intelligence, and Change Management. Our mission is to assist change agents in modernizing their organizations and delivering impactful results to their business and customers. Whether our clients are at the beginning, accelerating, or optimizing stage, we help them integrate composable technology as part of their digital transformation journey. Leveraging our extensive experience in developing intelligent products and utilizing AI tools, we drive value for our clients.

Founded in 2016 in Vancouver, Canada, Apply Digital has expanded to nine cities across North America, South America, the UK, and Europe. We are excited to announce the launch of a new office in Delhi NCR, India, as part of our ongoing expansion.

At Apply Digital, we advocate for a "One Team" approach, where we operate within a "pod" structure that combines senior leadership, subject matter experts, and cross-functional skill sets. This structure is supported by agile methodologies like scrum and sprint cadences, ensuring seamless collaboration and progress towards desired outcomes. Our team embodies our SHAPE values (smart, humble, active, positive, and excellent) to create a safe, empowered, respectful, and enjoyable work environment where everyone can connect, grow, and make a difference together.

Apply Digital is a hybrid-friendly organization with remote work options available. The preferred candidate for this position should be based in or near the Delhi/NCR region of India, with working hours overlapping with the Eastern Standard Timezone (EST).

The ideal candidate for this role will be responsible for designing, building, and maintaining scalable data pipelines and architectures to support analytical and operational workloads. Key responsibilities include optimizing ETL/ELT pipelines, integrating data pipelines into cloud-native applications, managing cloud data warehouses, ensuring data governance and security, collaborating with analytics teams, and maintaining data documentation. The candidate should have strong proficiency in English, experience in data engineering, expertise in SQL and Python, familiarity with cloud data platforms, and knowledge of ETL/ELT frameworks and workflow orchestration tools.

Apply Digital offers a comprehensive benefits package that includes private healthcare coverage, Provident Fund contributions, and a gratuity bonus after five years of service. We prioritize work-life balance with flexible personal time off policies and provide opportunities for skill development through training budgets, certifications, workshops, mentorship, and peer support. Apply Digital is committed to fostering an inclusive workplace where individual differences are celebrated, and equal opportunities are provided to all team members.

Posted 1 week ago

5.0 - 9.0 years

0 Lacs

Karnataka

On-site

As an Advanced Modeling Manager in the Sales Excellence COE at Accenture, you play a vital role in utilizing your expertise in machine learning algorithms, SQL, R or Python, Advanced Excel, and data visualization tools to generate business insights. Your primary responsibility is to build models and scorecards that aid business leaders in understanding trends and market drivers, ultimately improving processes and boosting sales.

Working within the Center of Excellence (COE) Analytics Modeling Analysis team, you collaborate with various functions like Sales, Marketing, and Finance to collect and process data. Your analytical skills are put to use in developing insights to support decision-making, which you communicate effectively to stakeholders. Moreover, you contribute to developing industrialized solutions in coordination with the COE team.

To excel in this role, you are expected to possess a Bachelor's degree or equivalent experience, along with at least five years of experience in data modeling and analysis. Expertise in machine learning algorithms, SQL, R, or Python, along with proficiency in Advanced Excel and data visualization tools like Power BI, Power Apps, Tableau, QlikView, and Google Data Studio, is crucial. Additionally, project management experience, strong business acumen, and attention to detail are valued traits.

Furthermore, a Master's degree in Analytics or a related field, an understanding of sales processes and systems, knowledge of Google Cloud Platform (GCP) and BigQuery, and experience in Sales, Marketing, Pricing, Finance, or related fields are considered advantageous. Familiarity with Salesforce Einstein Analytics, optimization techniques and packages such as Pyomo, SciPy, PuLP, Gurobi, and CPLEX, and Power Apps is beneficial for this role.

Join us at Accenture, where Sales Excellence thrives on empowering individuals to compete, win, and grow by leveraging data-driven insights and advanced modeling techniques to drive success.

Posted 1 week ago

6.0 - 10.0 years

0 Lacs

Hyderabad, Telangana

On-site

We have a new opportunity for a "CRM Support Product Owner" with our client in India on a 1-year contract. As the CRM Support Product Owner, you will be responsible for managing and supporting live digital assets, with a focus on CRM implementation, digital transformation, product management, requirement analysis, data analytics, business process redesign, and project management.

To qualify for this role, you should have a Bachelor's degree in computer science, software engineering, or a related field; a product management or ITIL certification would be a plus. You should have at least 6 years of experience in product management or support roles, specifically focusing on live digital assets.

As a successful candidate, you must have expertise in CRM implementation, digital transformation, product management, requirement analysis, data analytics, business process redesign, and project management. You should be a product owner with experience in project management and as a Salesforce Cloud Consultant, having integrated Salesforce solutions with multiple data sources using Agile methodologies.

Your responsibilities will include collaborating with business stakeholders to gather requirements, customizing Salesforce features, and leading Agile teams in the successful implementation of Salesforce CRM. You should be familiar with a range of tools and technologies, including Salesforce Sales Cloud, Salesforce Service Cloud, Salesforce Marketing Cloud, Salesforce Data Cloud, Google Cloud Platform, CRM Analytics, the MS Office Suite, data visualization tools, planning tools, testing/defect tracking tools, marketing tools, databases, and more.

The ideal candidate will have experience in product management within a large and complex organization, proficiency in agile methodologies and tools, and experience with incident management and resolution processes. If you meet these qualifications and are interested in this opportunity, please send your CV along with your expected salary, notice period, current location, nationality, and visa details to nazreen.muhamed@lancesoft.com.

Posted 1 week ago

6.0 - 10.0 years

0 Lacs

Maharashtra

On-site

As a Cloud & AI Solution Engineer at Microsoft, you will be part of a dynamic team that is at the forefront of innovation in the realm of databases and analytics. Your role will involve working on cutting-edge projects that leverage the latest technologies to drive meaningful impact for commercial customers. If you are insatiably curious and deeply passionate about tackling complex challenges in the era of AI, this is the perfect opportunity for you.

In this role, you will play a pivotal part in helping enterprises unlock the full potential of Microsoft's cloud database and analytics stack. You will collaborate closely with engineering leaders and platform teams to accelerate the Fabric Data Platform, including Azure Databases and Analytics. Your responsibilities will include hands-on engagements such as Proof of Concepts, hackathons, and architecture workshops to guide customers through secure, scalable solution design and accelerate database and analytics migration into their deployment workflows.

To excel in this position, you should have 10+ years of technical pre-sales or technical consulting experience, or a Bachelor's/Master's Degree in Computer Science or a related field with 4+ years of technical pre-sales experience. You should be an expert on Azure Databases (SQL DB, Cosmos DB, PostgreSQL) and Azure Analytics (Fabric, Azure Databricks, Purview), as well as competitors in the data warehouse, data lake, big data, and analytics space. Additionally, you should have experience with cloud and hybrid infrastructure, architecture designs, migrations, and technology management.

As a trusted technical advisor, you will guide customers through solution design, influence technical decisions, and help them modernize their data platform to realize the full value of Microsoft's platform. You will drive technical sales, lead hands-on engagements, build trusted relationships with platform leads, and maintain deep expertise in Microsoft's Analytics Portfolio and Azure Databases.

By joining our team, you will have the opportunity to accelerate your career growth, develop deep business acumen, and hone your technical skills. You will be part of a collaborative and creative team that thrives on continuous learning and flexible work opportunities. If you are ready to take on this exciting challenge and be part of a team that is shaping the future of cloud Database & Analytics, we invite you to apply and join us on this journey.

Posted 1 week ago

3.0 - 7.0 years

0 Lacs

Karnataka

On-site

As a Technical Lead with over 8 years of experience in Data Engineering, Analytics, and Python development, including at least 3 years in a Technical Lead / Project Management role, you will play a crucial role in driving data engineering and analytics projects for our clients. Your client-facing skills will be essential in ensuring successful project delivery and effective communication between technical and business stakeholders.

Your responsibilities will include designing and implementing secure, scalable data architectures on cloud platforms such as AWS, Azure, or GCP. You will lead the development of cloud-based data engineering solutions covering data ingestion, transformation, and storage, while defining best practices for integrating diverse data sources securely. Overseeing security aspects of integrations and ensuring compliance with organizational and regulatory requirements will be part of your role.

In addition, you will develop and manage robust ETL/ELT pipelines using Python, SQL, and modern orchestration tools, and integrate real-time streaming data using technologies like Apache Kafka, Spark Structured Streaming, or cloud-native services. Collaborating with data scientists to integrate AI models into production pipelines and cloud infrastructure will also be a key aspect of your responsibilities.

Furthermore, you will work on advanced data analysis to generate actionable insights for business use cases, design intuitive Tableau dashboards and data visualizations, and define data quality checks and validation frameworks to ensure high-integrity data pipelines. Your expertise in REST API development, backend services, and secure API integration will be crucial in developing and deploying data products and integrations.

To excel in this role, you must have deep hands-on experience with cloud platforms; expertise in Python, SQL, Spark, Kafka, and streaming integration; proven ability with data warehousing solutions like BigQuery, Snowflake, and Redshift; and a strong understanding of integration security principles. Proficiency in data visualization with Tableau, REST API development, and AI/ML integration will also be essential.

Preferred qualifications include prior experience managing enterprise-scale data engineering projects, familiarity with DevOps practices, and an understanding of regulatory compliance requirements for data handling. Your ability to lead technical teams, ensure project delivery, and drive innovation in data engineering and analytics will be key to your success in this role.

Posted 1 week ago

5.0 - 9.0 years

0 Lacs

Hyderabad, Telangana

On-site

The AIML Architect (Dataflow, BigQuery) position is a critical role within our organization, focusing on designing, implementing, and optimizing data architectures in Google Cloud's BigQuery environment. You will combine advanced data analytics with artificial intelligence and machine learning techniques to create efficient data models that enhance decision-making processes across various departments. Your responsibilities will include building data pipeline solutions that utilize BigQuery and Dataflow functionalities to ensure high performance, scalability, and resilience in our data workflows. Collaboration with data engineers, data scientists, and application developers is essential to align with business goals and technical vision.

You must possess a deep understanding of cloud-native architectures and be enthusiastic about leveraging cutting-edge technologies to drive innovation, efficiency, and insights from extensive datasets. You should have a robust background in data processing and AI/ML methodologies, capable of translating complex technical requirements into scalable solutions that meet the evolving needs of the organization.

Key Responsibilities
- Design and architect data processing solutions using Google Cloud BigQuery and Dataflow.
- Develop data pipeline frameworks supporting batch and real-time analytics (see the sketch below).
- Implement machine learning algorithms for extracting insights from large datasets.
- Optimize data storage and retrieval processes to improve performance.
- Collaborate with data scientists to build scalable models.
- Ensure data quality and integrity throughout the data lifecycle.
- Work closely with cross-functional teams to align data workflows with business objectives.
- Conduct technical evaluations and assessments of new tools and technologies.
- Manage large-scale data migrations to cloud environments.
- Document architecture designs and maintain technical specifications.
- Provide mentorship and guidance to junior data engineers and analysts.
- Stay updated on industry trends in cloud computing and data engineering.
- Design and implement security best practices for data access and storage.
- Monitor and troubleshoot data pipeline performance issues.
- Conduct training sessions on BigQuery best practices for team members.

Required Qualifications
- Bachelor's or Master's degree in Computer Science, Data Science, or a related field.
- 5+ years of experience in data architecture and engineering.
- Proficiency in Google Cloud Platform, especially BigQuery and Dataflow.
- Strong understanding of data modeling and ETL processes.
- Experience implementing machine learning solutions in cloud environments.
- Solid programming skills in Python, Java, or Scala.
- Expertise in SQL and other query optimization techniques.
- Experience with big data workloads and distributed computing.
- Familiarity with modern data processing frameworks and tools.
- Strong analytical and problem-solving skills.
- Excellent communication and team collaboration abilities.
- Proven track record of managing comprehensive projects from inception to completion.
- Ability to work in a fast-paced, agile environment.
- Understanding of data governance, compliance, and security.
- Experience with data visualization tools is a plus.
- Certifications in Google Cloud or relevant technologies are advantageous.

Skills: Cloud Computing, SQL Proficiency, Dataflow, AI/ML, Scala, Data Governance, ETL Processes, Python, Machine Learning, Java, Google Cloud Platform, Data Architecture, Data Modeling, BigQuery, Data Engineering, Data Visualization Tools
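For illustration only, a minimal sketch of the real-time side of such pipelines: low-latency streaming inserts into BigQuery via the Python client library's insert_rows_json call. The table and field names are hypothetical placeholders.

```python
# Illustrative low-latency streaming inserts into BigQuery.
# Table and field names are hypothetical placeholders.
from google.cloud import bigquery

client = bigquery.Client(project="example-project")
table_id = "example-project.realtime.predictions"

rows = [
    {"model": "churn_v2", "user_id": "u-123", "score": 0.87},
    {"model": "churn_v2", "user_id": "u-456", "score": 0.12},
]

errors = client.insert_rows_json(table_id, rows)  # streaming insert API
if errors:
    raise RuntimeError(f"BigQuery streaming insert failed: {errors}")
```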

Posted 1 week ago

8.0 - 12.0 years

30 - 35 Lacs

Hyderabad, Pune, Gurugram

Hybrid

Role: Lead Data Engineer
Experience: 8-12 years
Location: Pune, Hyderabad, Bengaluru, Gurugram (Hybrid)
Notice Period: Immediate joiners to 15 days
Skills: Python, Spark, SQL, ETL, Snowflake, Azure, Airflow, BigQuery, AWS (Glue)

Posted 1 week ago

4.0 - 8.0 years

0 Lacs

Pune, Maharashtra

On-site

We are looking for a Java Developer to produce scalable software solutions on distributed systems like Hadoop using the Spark framework. You will be part of a cross-functional team responsible for the full software development life cycle, from conception to deployment. As a developer, you should be comfortable with back-end coding, development frameworks, third-party libraries, and the Spark APIs required for application development on distributed platforms like Hadoop. Being a team player with a knack for visual design and utility is essential, and familiarity with Agile methodologies is an added advantage. A large part of the workloads and applications will be cloud-based, so knowledge of and experience with Google Cloud Platform (GCP) will be handy.

As part of our flexible scheme, here are some of the benefits you'll enjoy:
- Best-in-class leave policy
- Gender-neutral parental leaves
- 100% reimbursement under childcare assistance benefit (gender-neutral)
- Sponsorship for industry-relevant certifications and education
- Employee Assistance Program for you and your family members
- Comprehensive hospitalization insurance for you and your dependents
- Accident and term life insurance
- Complimentary health screening for those 35 years and above

Your key responsibilities will include working with development teams and product managers to ideate software solutions, designing client-side and server-side architecture, building features and applications capable of running on distributed platforms and/or the cloud, developing and managing well-functioning applications supporting a micro-services architecture, testing software for responsiveness and efficiency, troubleshooting, debugging, and upgrading software, creating security and data protection settings, and writing technical and design documentation. Additionally, you will be responsible for writing effective APIs (REST & SOAP).

To be successful in this role, you should have proven experience as a Java Developer or in a similar role as an individual contributor or development lead; familiarity with common stacks; strong knowledge and working experience of Core Java, Spring Boot, REST APIs, and the Spark API; knowledge of the React framework and UI experience; knowledge of JUnit, Mockito, or other test frameworks; familiarity with GCP services, design/architecture, and security frameworks; experience with databases (e.g., Oracle, PostgreSQL, BigQuery); familiarity with developing on distributed application platforms like Hadoop with Spark; excellent communication and teamwork skills; organizational skills; an analytical mind; a degree in Computer Science, Statistics, or a relevant field; and experience working in Agile environments. Good-to-have skills include knowledge of JavaScript frameworks (e.g., Angular, React, Node.js) and UI/UX design, knowledge of Python, and knowledge of NoSQL databases like HBase and MongoDB. You should have 4-7 years of prior working experience in a global banking/insurance/financial organization.

You will receive training and development to help you excel in your career, coaching and support from experts in your team, and a culture of continuous learning to aid progression. We strive for a culture in which we are empowered to excel together every day, acting responsibly, thinking commercially, taking initiative, and working collaboratively. Together we share and celebrate the successes of our people. We welcome applications from all people and promote a positive, fair, and inclusive work environment.

Posted 1 week ago

0.0 - 1.0 years

0 Lacs

Noida

Work from Office

We are excited to invite fresh BTech graduates to our Walk-In Drive for trainee roles at our Noida office. This is a great opportunity for recent graduates to kickstart their careers in one of the following domains: Python, Java, Frontend Development, DevOps, Software Testing, or Data Warehouse.

Walk-In Dates: Wednesday, July 23, 2025 and Thursday, July 24, 2025. Important: only 20 walk-in candidates will be shortlisted.

Eligibility Criteria:
- BTech degree completed (2022-2025 pass-outs)
- Basic knowledge in at least one of the mentioned domains
- Good communication skills
- Eagerness to learn and grow in the tech field

How to Apply: Interested candidates must register using the form below. Only shortlisted candidates will be contacted with interview location details. Apply here: https://forms.gle/a9LesdmF7g1MM2PW7

Stipend/CTC: As per industry standards (to be discussed during the interview).

Posted 1 week ago

8.0 - 13.0 years

15 - 27 Lacs

Bengaluru

Hybrid

Job Description: We are seeking an experienced and visionary Senior Data Architect to lead the design and implementation of scalable enterprise data solutions. This is a strategic leadership role for someone who thrives in cloud-first, data-driven environments and is passionate about building future-ready data architectures.

Key Responsibilities:
- Define and implement an enterprise-wide data architecture strategy aligned with business goals.
- Design and lead scalable, secure, and resilient data platforms for both structured and unstructured data.
- Architect data lake/warehouse ecosystems and cloud-native solutions (Snowflake, Databricks, Redshift, BigQuery).
- Collaborate with business and tech stakeholders to capture data requirements and translate them into scalable designs.
- Mentor data engineers, analysts, and other architects in data best practices.
- Establish standards for data modeling, integration, and management.
- Drive governance across data quality, security, metadata, and compliance.
- Lead modernization and cloud migration efforts.
- Evaluate new technologies and recommend adoption strategies.
- Support data cataloging, lineage, and MDM initiatives.
- Ensure compliance with privacy standards (e.g., GDPR, HIPAA, CCPA).

Required Qualifications:
- Bachelor's/Master's degree in Computer Science, Data Science, or a related field.
- 10+ years of experience in data architecture; 3+ years in a senior/lead capacity.
- Hands-on experience with modern cloud data platforms: Snowflake, Azure Synapse, AWS Redshift, BigQuery, etc.
- Strong skills in data modeling tools (e.g., Erwin, ER/Studio).
- Deep understanding of ETL/ELT, APIs, and data integration.
- Expertise in SQL, Python, and data-centric languages.
- Experience with data governance, RBAC, encryption, and compliance frameworks.
- DevOps/CI-CD experience in data pipelines is a plus.
- Excellent communication and leadership skills.

Posted 1 week ago

5.0 - 9.0 years

7 - 17 Lacs

Pune

Work from Office

Job Title: Data Analyst
Client: Amazon
Employment Type: Full-time (On-site)
Payroll: BCT Consulting Pvt Ltd
Work Location: Pune (Work from Office, Monday to Friday, General Shift)
Experience Required: 5+ years
Joining Mode: Permanent with BCT Consulting Pvt Ltd, deployed at Amazon

Role Summary
We are seeking a results-driven Data Analyst with a minimum of 5 years of experience in data visualization, reporting, and business intelligence to support strategic decisions at Amazon. The ideal candidate will have hands-on expertise in tools like Amazon QuickSight with SQL, Power BI with SQL, or Tableau with SQL, along with experience in cloud-based solutions using AWS.

Key Skills & Technologies
- Amazon QuickSight / Tableau / Power BI
- Strong SQL skills
- AWS (S3, Redshift, Athena, etc.); a query sketch follows below
- Data visualization and storytelling
- Data modeling and ETL processes
- Analytical thinking and problem-solving ability

Required Qualifications
- Bachelor's degree in Computer Science, Information Technology, Statistics, or a related field.
- Minimum 5 years of hands-on experience in data analysis and visualization.
- Excellent communication and stakeholder management skills.
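For illustration only, a minimal sketch of the kind of ad-hoc AWS analytics this role lists: running a SQL query on Athena with boto3 and polling for completion. The region, database, table, and bucket names are hypothetical placeholders.

```python
# Illustrative Athena query via boto3, polling until the query finishes.
# Database, table, and bucket names are hypothetical placeholders.
import time

import boto3

athena = boto3.client("athena", region_name="ap-south-1")

qid = athena.start_query_execution(
    QueryString="SELECT region, COUNT(*) AS orders FROM sales GROUP BY region",
    QueryExecutionContext={"Database": "analytics"},
    ResultConfiguration={"OutputLocation": "s3://example-athena-results/"},
)["QueryExecutionId"]

# Poll until the query reaches a terminal state, then print the result rows.
while True:
    state = athena.get_query_execution(QueryExecutionId=qid)["QueryExecution"]["Status"]["State"]
    if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
        break
    time.sleep(1)

if state == "SUCCEEDED":
    for row in athena.get_query_results(QueryExecutionId=qid)["ResultSet"]["Rows"]:
        print([col.get("VarCharValue") for col in row["Data"]])
```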

Posted 1 week ago

3.0 - 8.0 years

2 - 5 Lacs

Bharuch

Hybrid

Required Skills:
- Working knowledge of big data / cloud-based / columnar databases (such as Snowflake, AWS Redshift, BigQuery, etc.)
- Experience in scripting languages like Python, R, Spark, Perl, etc.
- Very good knowledge of ETL concepts
- Experience in ETL tools like SSIS, Talend, Pentaho, etc., is a plus
- Very good knowledge of SQL querying
- Rich experience in relational data modelling
- Experience in developing logical, physical, and conceptual data models
- Experience with AWS and Google Cloud services is a plus
- Experience with BI dashboards is a plus
- Implementing automated testing platforms and unit tests (see the sketch below)
- Proficient understanding of code versioning tools, such as Git
- Strong analytical skills and problem-solving aptitude
- Ability to learn new technologies quickly
- Works with other team members in a collaborative manner
- Passionate about learning and working on versatile technologies

Notice Period: Immediate to 15 days
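For illustration only, a minimal sketch of the unit-testing requirement above: a small ETL transform checked with pytest. The transform and its fields are hypothetical placeholders; run it with `pytest`.

```python
# Illustrative unit test for a small ETL transform; discoverable by pytest.
# The transform and its fields are hypothetical placeholders.

def normalize_order(raw: dict) -> dict:
    # Trim identifiers, cast the amount, and derive a refund flag.
    amount = round(float(raw["amount"]), 2)
    return {
        "order_id": raw["order_id"].strip(),
        "amount": amount,
        "is_refund": amount < 0,
    }


def test_normalize_order_casts_and_flags():
    raw = {"order_id": "  A-100 ", "amount": "-19.999"}
    assert normalize_order(raw) == {
        "order_id": "A-100",
        "amount": -20.0,
        "is_refund": True,
    }
```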

Posted 1 week ago