5.0 - 9.0 years
0 Lacs
karnataka
On-site
You will be responsible for leading the delivery of complex solutions by coding larger features from start to finish. Actively participating in planning and performing code and architecture reviews of your team's product will be a crucial aspect of your role. You will help ensure the quality and integrity of the Software Development Life Cycle (SDLC) for your team by identifying opportunities for improvement in how the team works, through the usage of recommended tools and practices. Additionally, you will lead the triage of complex production issues across systems and demonstrate creativity and initiative in solving complex problems. As a high performer, you will consistently deliver a high volume of story points relative to your team.

Being aware of the technology landscape, you will plan the delivery of coarse-grained business needs spanning multiple applications. You will also influence technical peers outside your team and set a consistent example of agile development practices. Coaching other engineers to work as a team with Product and UX will be part of your responsibilities. Furthermore, you will create and enhance internal libraries and tools, provide technical leadership on the product, and determine the technical approach. Proactively communicating status and issues to your manager, collaborating with other teams to find creative solutions to customer issues, and showing a commitment to delivery deadlines, especially seasonal and vendor partner deadlines that are critical to Best Buy's continued success, will be essential.

Basic Qualifications:
- 5+ years of relevant technical professional experience with a bachelor's degree OR equivalent professional experience.
- 2+ years of experience with Google Cloud services including Dataflow, BigQuery, and Looker.
- 1+ years of experience with Adobe Analytics, Content Square, or similar technologies.
- Hands-on experience with data engineering and visualization tools like SQL, Airflow, DBT, PowerBI, Tableau, and Looker.
- Strong understanding of real-time data processing and issue detection.
- Expertise in data architecture, database design, data quality standards/implementation, and data modeling.

Preferred Qualifications:
- Experience working in an omni-channel retail environment.
- Experience connecting technical issues with business performance metrics.
- Experience with Forsta or similar customer feedback systems.
- Certification in Google Cloud Platform services.
- Good understanding of data governance, data privacy laws & regulations, and best practices.

About Best Buy: BBY India is a service provider to Best Buy, and as part of the team working on Best Buy projects and initiatives, you will help fulfill Best Buy's purpose to enrich lives through technology. Every day, you will humanize and personalize tech solutions for every stage of life in Best Buy stores, online, and in Best Buy customers' homes. Best Buy is a place where techies can make technology more meaningful in the lives of millions of people, enabling the purpose of enriching lives through technology. The unique culture at Best Buy unleashes the power of its people and provides fast-moving, collaborative, and inclusive experiences that empower employees of all backgrounds to make a difference, learn, and grow every day. Best Buy's culture is built on deeply supporting and valuing its amazing employees and other team members. Best Buy is committed to being a great place to work, where you can unlock unique career possibilities. Above all, Best Buy aims to provide a place where people can bring their full, authentic selves to work now and into the future. Tomorrow works here.
Posted 22 hours ago
5.0 - 9.0 years
0 Lacs
maharashtra
On-site
As a Level 9 Industry and Functional AI Decision Science Consultant at Accenture Strategy & Consulting within the Global Network - Data & AI team, your primary responsibility will be to assist clients in the Comms & Media - Telecom practice by designing and implementing AI solutions for their business needs. You will leverage your expertise in the Telco domain, AI fundamentals, and hands-on experience with large datasets to deliver valuable insights and recommendations to key stakeholders. Your role will involve proposing solutions based on comprehensive gap analysis of existing Telco platforms, identifying long-term value propositions, and translating business requirements into functional specifications. By collaborating closely with client stakeholders through interviews and workshops, you will gather essential insights to address their unique challenges and opportunities. In addition to understanding the current processes and potential issues within the Telco environment, you will be responsible for designing future state solutions that leverage Data & AI capabilities effectively. Your ability to analyze complex problems systematically, anticipate obstacles, and establish a clear project roadmap will be crucial to driving successful outcomes for clients. Furthermore, you will act as a strategic partner to clients, aligning their business goals with innovative AI-driven strategies to enhance revenue growth and operational efficiency. Your expertise in storytelling through data analysis will enable you to craft compelling narratives that resonate with senior stakeholders and drive informed decision-making. To excel in this role, you should possess a minimum of 5 years of experience in Data Science, with at least 3 years dedicated to Telecom Analytics. A postgraduate degree from a reputable institution and proficiency in data mining, statistical analysis, and advanced predictive modeling techniques are essential qualifications for this position. 
Your hands-on experience with various analytical tools and programming languages, such as Python, R, and SQL, will be instrumental in delivering impactful solutions to clients. As a proactive and collaborative team player, you will actively engage with cross-functional teams, mentor junior members, and uphold a high standard of excellence in client interactions. Your strong analytical skills, problem-solving capabilities, and ability to work independently across multiple projects will be key to your success in this dynamic and fast-paced environment. While cloud platform certifications and experience in Computer Vision are considered advantageous, your commitment to continuous learning and staying abreast of industry trends will be highly valued in this role. Overall, your dedication to delivering value-driven AI solutions and fostering long-lasting client relationships will be instrumental in driving success for both Accenture and its clients.
Posted 1 day ago
5.0 - 9.0 years
0 Lacs
chennai, tamil nadu
On-site
As a Consultant - Data Engineer at AstraZeneca, you will have the opportunity to contribute to the discovery, development, and commercialization of life-changing medicines by enhancing data platforms built on AWS services. Located at Chennai GITC, you will collaborate with experienced engineers to design and implement efficient data products, supporting data platform initiatives with a focus on impacting patients and saving lives.

Your key accountabilities as a Data Engineer will include:

Technical Expertise:
- Designing, developing, and implementing scalable processes to extract, transform, and load data from various sources into data warehouses.
- Demonstrating expert understanding of AstraZeneca's implementation of data products, managing SQL queries and procedures for optimal performance.
- Providing support on production issues and enhancements through JIRA.

Quality Engineering Standards:
- Monitoring and optimizing data pipelines, troubleshooting issues, and maintaining quality standards in design, code, and data models.
- Offering detailed analysis and documentation of processes and flows as needed.

Collaboration:
- Working closely with data engineers to understand data sources, transformations, and dependencies thoroughly.
- Collaborating with cross-functional teams to ensure seamless data integration and reliability.

Innovation and Process Improvement:
- Driving the adoption of new technologies and tools to enhance data engineering processes and efficiency.
- Recommending and implementing enhancements to improve reliability, efficiency, and quality of data processing pipelines.

To be successful in this role, you should have:
- A Bachelor's degree in Computer Science, Information Technology, or a related field.
- Strong experience with SQL, warehousing, and building ETL pipelines.
- Proficiency in working with columnar databases like Redshift, Cassandra, and BigQuery.
- Deep SQL knowledge for data extraction, transformation, and reporting.
- Excellent communication skills for effective collaboration with technical and non-technical stakeholders.
- Strong analytical skills to troubleshoot and deliver solutions in complex data environments.
- Experience with Agile development techniques and methodologies.

Desirable skills and experience include knowledge of Databricks/Snowflake, proficiency in scripting and programming languages like Python, experience with reporting tools such as PowerBI, and prior experience in Pharmaceutical or Healthcare industry IT environments. Join AstraZeneca's dynamic team to drive cross-company change and disrupt the industry while making a direct impact on patients through innovative data solutions and technologies. Apply now to be part of our ambitious journey towards becoming a digital and data-led enterprise.
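The extract-transform-load work this posting centers on can be sketched minimally in plain Python, with SQLite standing in for a data warehouse; the table name, columns, and figures below are illustrative only and not taken from the posting:

```python
import sqlite3

def run_etl(rows):
    """Load raw source rows into a small warehouse table after cleaning."""
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
    # Transform: drop malformed rows and normalise region names
    cleaned = [(r["region"].strip().lower(), float(r["amount"]))
               for r in rows if r.get("amount") is not None]
    conn.executemany("INSERT INTO sales VALUES (?, ?)", cleaned)
    # A reporting query of the kind a downstream dashboard might run
    return dict(conn.execute(
        "SELECT region, SUM(amount) FROM sales GROUP BY region ORDER BY region"))

totals = run_etl([
    {"region": " EMEA", "amount": "120.5"},
    {"region": "emea", "amount": 30},
    {"region": "APAC", "amount": None},   # malformed: skipped in transform step
    {"region": "apac", "amount": "99"},
])
print(totals)  # {'apac': 99.0, 'emea': 150.5}
```

In a production pipeline the same three stages would typically be separate, monitored tasks (for example Airflow operators writing to Redshift or BigQuery) rather than one in-memory function.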
Posted 1 day ago
6.0 - 10.0 years
0 Lacs
maharashtra
On-site
As a Senior Product Manager, FinTech at Priceline, you will play a crucial role in contributing to the product strategy, development, and execution of Financial Technology products across product lines such as flights, hotels, rental cars, and packages. Your primary focus will be on collaborating with stakeholders from different departments to understand requirements, create detailed product plans, and ensure the successful delivery and launch of FinTech solutions that bring value to both customers and internal teams.

Your responsibilities will include collaborating with teams such as Commercial Teams, Finance, Technology, Accounting, and Financial Planning & Analysis to bring new products to the market. You will define product requirements, create comprehensive product plans, and work closely with engineering teams to develop, test, and launch new solutions. Additionally, you will be expected to stay updated on product trends, emerging technologies, and competitor offerings in the FinTech space to provide valuable insights for product strategy and innovation.

This role requires a Bachelor's degree (an MBA is desirable) along with 6-8 years of consumer-facing internet product management experience. Strong analytical and quantitative skills are essential, as well as familiarity with tools like SQL, BigQuery, Tableau, and ERP systems. An understanding of the travel landscape and financial services industry is preferred, along with experience in reconciliation, accounting, and financial systems implementation. You should be a self-starter with exceptional collaboration and communication skills, capable of engaging and influencing stakeholders at all levels of the organization. Your enthusiasm for strategic planning and daily execution, as well as your ability to work in a fast-paced environment, will be key to succeeding in this role.
Additionally, you should align with Priceline's core values of Customer, Innovation, Team, Accountability, and Trust, and uphold unquestionable integrity and ethics in your work. Join Priceline, a dynamic and innovative company that values diversity and inclusion. Be part of a team that is dedicated to making travel affordable and accessible to customers worldwide. If you are ready to contribute to a unique and inspiring culture while working with cutting-edge technologies, Priceline welcomes you to explore this exciting opportunity.
Posted 1 day ago
7.0 - 11.0 years
0 Lacs
karnataka
On-site
You will be responsible for designing, developing, and optimizing interactive dashboards using Looker and LookML. This includes building LookML models, explores, and derived tables to meet business intelligence needs. You will create efficient data models and queries using BigQuery and collaborate with data engineers, analysts, and business teams to translate requirements into actionable insights. Implementing security and governance policies within Looker to ensure data integrity and controlled access will also be part of your role.

Additionally, you will leverage GCP services to build scalable and reliable data solutions and optimize dashboard performance using best practices in aggregation and visualization. Maintaining, auditing, and enhancing existing Looker dashboards, reports, and LookML assets, as well as documenting dashboards, data sources, and processes for scalability and ease of maintenance, are critical tasks. You will also support legacy implementations and facilitate smooth transitions, build new dashboards and visualizations based on evolving business requirements, and work closely with data engineering teams to define and validate data pipelines for timely and accurate data delivery.

To qualify for this role, you should have at least 6 years of experience in data visualization and BI, particularly using Looker and LookML. Strong SQL skills with experience optimizing queries for BigQuery are required, along with proficiency in Google Cloud Platform (GCP) and related data services. An in-depth understanding of data modeling, ETL processes, and database structures is essential, as is familiarity with data governance, security, and role-based access in Looker. Experience with BI lifecycle management, strong communication and collaboration skills, good storytelling and user-centric design abilities, and exposure to the media industry (OTT, DTH, Web) handling large datasets are also necessary.
Knowledge of other BI tools like Tableau, Power BI, or Data Studio is a plus, and experience with Python or other scripting languages for automation and data transformation is desirable. Exposure to machine learning or predictive analytics is considered an advantage.
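As a rough illustration of the aggregation best practice this posting mentions, the snippet below pre-computes a rollup table once so that dashboards query a small summary rather than scanning raw rows (the role a LookML derived table plays); SQLite stands in for BigQuery, and all table names and numbers are invented for the example:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (day TEXT, channel TEXT, views INTEGER)")
conn.executemany("INSERT INTO events VALUES (?, ?, ?)", [
    ("2024-01-01", "web", 100), ("2024-01-01", "app", 40),
    ("2024-01-02", "web", 80),  ("2024-01-02", "app", 50),
])

# Pre-aggregate once, so each dashboard refresh reads the small
# summary table instead of re-scanning the raw event log.
conn.execute("""CREATE TABLE daily_views AS
                SELECT day, SUM(views) AS views FROM events GROUP BY day""")

summary = dict(conn.execute("SELECT day, views FROM daily_views ORDER BY day"))
print(summary)  # {'2024-01-01': 140, '2024-01-02': 130}
```

On BigQuery the equivalent pattern is usually a scheduled or materialized rollup, with the dashboard's explore pointed at the rollup instead of the raw table.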
Posted 1 day ago
5.0 - 9.0 years
0 Lacs
hyderabad, telangana
On-site
As a Senior Backend Engineer with 5 - 7 years of experience, you will play a vital role in designing, developing, and optimizing backend services and microservices. Your expertise in Java, Spring Boot, SQL databases, and cloud platforms will be crucial in creating scalable and reliable solutions that drive our applications.

Your main responsibilities will include designing and developing Java-based backend services and microservices using Spring Boot. You will collaborate with cross-functional teams to comprehend business requirements and translate them into technical solutions. Writing efficient and maintainable code that adheres to high-quality standards will be essential, along with optimizing existing code for performance enhancement. Managing SQL queries and database schema designs, implementing CI/CD pipelines using Jenkins and Bitbucket, and testing and debugging applications using tools like Postman and your preferred IDE will also be part of your daily tasks.

To deploy and manage services, you will utilize cloud platforms and services such as Google Kubernetes Engine (GKE), Spanner, BigQuery, Redis, and MongoDB. Working closely with front-end developers and architects to ensure smooth integration of services and mentoring junior developers on best practices and coding standards will also be part of your role. Collaborating with DevOps teams to guarantee the reliability and scalability of backend services will further enhance the efficiency of our operations.

To be successful in this role, you should hold a Bachelor's degree in computer science, engineering, or a related field (Master's degree preferred) and possess a minimum of 5 years of hands-on experience in backend development using Java. Strong expertise in Java, Spring Boot, and microservices architecture is crucial, along with experience in any cloud platform (AWS/GCP/Azure). Proficiency in SQL database design, optimization, and querying is necessary, as well as experience with CI/CD using Jenkins and Bitbucket.
Familiarity with API testing and debugging tools like Postman, proficiency in using your preferred IDE, and knowledge of cloud services such as GKE, Spanner, BigQuery, Redis, and MongoDB will be advantageous. Strong problem-solving skills, attention to detail, and excellent communication and collaboration abilities are essential attributes for this role. The capacity to thrive in a fast-paced and dynamic environment will further contribute to your success in this position.
Posted 1 day ago
5.0 - 9.0 years
0 Lacs
pune, maharashtra
On-site
As a Technology Service Specialist, AVP at our Pune location, you will be an integral part of the Technology, Data, and Innovation (TDI) Private Bank team. In this role, you will be responsible for providing 2nd Level Application Support for business applications used in branches, by mobile sales, or via the internet. Your expertise in Incident Management and Problem Management will be crucial in ensuring the stability of these applications.

Partnerdata, the central client reference data system in Germany, is a core banking system that integrates many banking processes and applications through numerous interfaces. With the recent migration to Google Cloud (GCP), you will be involved in operating and further developing applications and functionalities on the cloud platform. Your focus will also extend to regulatory topics surrounding partner/client relationships. We are seeking individuals who can contribute to this contemporary and emerging Cloud application area.

Key Responsibilities:
- Ensure optimum service level to supported business lines
- Oversee resolution of incidents and problems within the team
- Assist in managing business stakeholder relationships
- Define and manage OLAs with relevant stakeholders
- Monitor team performance, adherence to processes, and alignment with business SLAs
- Manage escalations and work with relevant functions to resolve issues quickly
- Identify areas for improvement and implement best practices in your area of expertise
- Mentor and coach Production Management Analysts within the team
- Fulfill Service Requests, communicate with the Service Desk function, and participate in major incident calls
- Document tasks, incidents, problems, changes, and knowledge bases
- Improve monitoring of applications and implement automation of tasks

Skills and Experience:
- Service Operations Specialist experience in a global operations context
- Extensive experience supporting complex application and infrastructure domains
- Ability to manage and mentor Service Operations teams
- Strong ITIL/best practice service context knowledge
- Proficiency in interface technologies, communication protocols, and ITSM tools
- Bachelor's Degree in IT or Computer Science related discipline
- ITIL certification and experience with the ITSM tool ServiceNow preferred
- Knowledge of the Banking domain and regulatory topics
- Experience with databases like BigQuery and understanding of Big Data and GCP technologies
- Proficiency in tools like GitHub, Terraform, Cloud SQL, Cloud Storage, Dataproc, and Dataflow
- Architectural skills for big data solutions and interface architecture

Area-Specific Tasks/Responsibilities:
- Handle Incident/Problem Management and Service Request Fulfilment
- Analyze and resolve incidents escalated from 1st Level Support
- Support the resolution of high-impact incidents and escalate when necessary
- Provide solutions for open problems and support service transition for new projects/applications

Joining our team, you will receive training, development opportunities, coaching from experts, and a culture of continuous learning to support your career progression. We value diversity and promote a positive, fair, and inclusive work environment at Deutsche Bank Group. Visit our company website for more information.
Posted 1 day ago
15.0 - 22.0 years
0 Lacs
karnataka
On-site
You will be responsible for solution design, client engagement, and delivery oversight, along with senior stakeholder management. Your role will involve leading and driving Google Cloud solutioning for customer requirements, RFPs, proposals, and delivery. You will establish governance frameworks, delivery methodologies, and reusable assets to scale the practice. It is essential to have the ability to take initiative and deliver in challenging engagements spread across multiple geographies. Additionally, you will lead the development of differentiated capabilities and offerings in areas such as application modernization & migration, cloud-native development, and AI agents. Collaboration with sales and pre-sales teams to shape solutions and win strategic deals, including large-scale application modernization and migrations will be a key aspect of your role. You will spearhead Google Cloud's latest products & services like AgentSpace, AI Agent development using GCP-native tools such as ADK, A2A Protocol, and Model Context Protocol. Building and mentoring a high-performing team of cloud architects, engineers, and consultants will also be part of your responsibilities. Driving internal certifications, specialization audits, and partner assessments to maintain Google Cloud Partner status is crucial. Representing the organization in partner forums, webinars, and industry/customer events is also expected from you. To qualify for this role, you should have 15+ years of experience in IT, with at least 3 years in Google Cloud applications architecture, design, and solutioning. Additionally, a minimum of 5 years of experience in designing and developing Java applications/platforms is required. Deep expertise in GCP services including Compute, Storage, BigQuery, Cloud Functions, Anthos, and Vertex AI is essential. 
Proven experience in leading Google Cloud transformation programs, strong solution architecture, and implementation experience for Google Cloud modernization & migration programs are important qualifications. Strong experience in stakeholder and client management is also necessary. Being Google Cloud certified with the Google Cloud Professional Architect certification, self-motivated to quickly learn new technologies and platforms, and possessing excellent presentation and communication skills are crucial for this role. Preferred qualifications include Google Cloud Professional certifications (Cloud Architect), experience with partner ecosystems, co-selling with Google, and managing joint GTM motions, as well as exposure to regulated industries (e.g., BFSI, Healthcare) and global delivery models.
Posted 1 day ago
5.0 - 9.0 years
0 Lacs
hyderabad, telangana
On-site
As a Data Engineer with 5+ years of experience, you will be responsible for designing, developing, and maintaining scalable data pipelines using Google Cloud Dataproc and Dataflow. Your primary focus will be on processing and analyzing large datasets while ensuring data integrity and accessibility.

Your role will require a Bachelor's degree in Computer Science, Information Technology, or a related field. Along with your academic background, you should have a strong technical skill set, including proficiency in Google Cloud Dataflow and Dataproc, along with a solid understanding of SQL and data modeling concepts. Experience with tools like BigQuery, Cloud Storage, and other GCP services will be essential for this position. Additionally, familiarity with programming languages like Python or Java will be advantageous.

In addition to your technical expertise, soft skills are equally important for success in this role. You should possess excellent problem-solving abilities, strong communication skills, and a collaborative mindset to work effectively within a team environment.

If you are passionate about leveraging GCP tools to process and analyze data, and if you meet the mandatory skills criteria of GCP Dataproc and Dataflow, we encourage you to share your resume with us at gkarthik@softpathtech.com/careers@softpathtech.com. Join our team and contribute to building efficient and reliable data solutions with cutting-edge technologies.
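The pipeline work this posting describes follows the parse/filter/group pattern that Dataflow (via Apache Beam) formalizes as PTransforms. The sketch below imitates that chain in plain Python so it runs without a Beam installation; the sample records and aggregation are invented for illustration:

```python
from collections import defaultdict

def run_pipeline(records):
    """Mimic a Dataflow-style pipeline: parse -> filter -> group -> aggregate."""
    parsed = (line.split(",") for line in records)                 # parse CSV lines
    valid = (r for r in parsed if len(r) == 2 and r[1].isdigit())  # drop bad rows
    grouped = defaultdict(int)
    for key, value in valid:                                       # group by key, sum
        grouped[key] += int(value)
    return dict(grouped)

result = run_pipeline(["a,1", "b,2", "a,3", "broken", "c,x"])
print(result)  # {'a': 4, 'b': 2}
```

In Beam the same stages would be `beam.Map` for the parse, a filter transform for validation, and a `CombinePerKey(sum)` for the aggregation, with the runner handling distribution and scaling.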
Posted 2 days ago
5.0 - 9.0 years
0 Lacs
hyderabad, telangana
On-site
As a Senior Data Engineer in Hyderabad, you will be responsible for designing and optimizing scalable data pipeline architectures and supporting analytics needs across cross-functional teams. Your key responsibilities will include designing, building, and maintaining data pipelines using BigQuery, Python, and SQL; optimizing data flow, automating processes, and scaling infrastructure; developing and managing workflows in Airflow/Cloud Composer and Ascend (or similar ETL tools); implementing data quality checks and testing strategies; supporting CI/CD processes, conducting code reviews, and mentoring junior engineers; and collaborating with QA/business teams to troubleshoot issues across environments.

Your core skills should include proficiency in BigQuery, Python, SQL, Airflow/Cloud Composer, and Ascend or similar ETL tools; data integration, warehousing, and pipeline orchestration; data quality frameworks; and incremental load strategies. Additionally, you should have strong experience with GCP or AWS serverless data warehouse environments.

Preferred skills for this role include experience with DBT for transformation, Collibra for data governance, and working with unstructured datasets. The qualifications required for this position include a minimum of 5 years in data engineering, a graduate degree in CS, Statistics, or a related field, and strong analytical and SQL expertise.
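One common form of the incremental load strategies this role calls for is a watermark-based load: only rows newer than the last successfully loaded timestamp are pulled on each run. The sketch below is a minimal, hypothetical version, with in-memory lists standing in for the source and target tables:

```python
def incremental_load(source_rows, target, watermark):
    """Watermark-based incremental load: append only rows newer than the
    last successfully loaded timestamp, then advance the watermark."""
    new_rows = [r for r in source_rows if r["updated_at"] > watermark]
    target.extend(new_rows)
    # Advance the watermark only after the load succeeds, so a failed
    # run is simply retried from the old watermark.
    return max((r["updated_at"] for r in new_rows), default=watermark)

target = []
wm = "2024-01-01T00:00:00"
source = [
    {"id": 1, "updated_at": "2023-12-31T09:00:00"},  # already loaded last run
    {"id": 2, "updated_at": "2024-01-02T10:00:00"},
    {"id": 3, "updated_at": "2024-01-03T11:00:00"},
]
wm = incremental_load(source, target, wm)
print(len(target), wm)  # 2 2024-01-03T11:00:00
```

In an Airflow/BigQuery setup the watermark would typically live in a metadata table and the filter would be a `WHERE updated_at > @watermark` predicate in the extraction query.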
Posted 2 days ago
6.0 - 10.0 years
0 Lacs
karnataka
On-site
We are seeking an experienced software engineer to join our team at Grid Dynamics. The ideal candidate should be proficient in Java, the Spring Framework, and either Google Cloud Platform (GCP) or Azure. Hands-on experience with BigQuery, Apache Kafka, and GitHub/GitHub Actions is preferred, along with a strong background in developing RESTful APIs. If you are passionate about working with cutting-edge cloud technologies and building scalable solutions, we would love to connect with you!

The ideal candidate should have at least 6-9 years of experience in Java and extensive expertise in the Spring Boot framework. A strong background working with MS Azure, Google Cloud Platform (GCP), or AWS is required, along with a solid understanding of data integration patterns and ETL processes. Experience with unit and integration testing (e.g., JUnit, Mockito) is essential for this role, along with knowledge of distributed systems architecture. Strong analytical and problem-solving skills are necessary to tackle complex challenges effectively. Immediate joiners are preferred, and the position can be based in Bangalore, Hyderabad, or Chennai.
Posted 2 days ago
3.0 - 7.0 years
0 Lacs
karnataka
On-site
As a candidate for this role at Wind Pioneers, your main responsibility will be to ensure that our flagship software products are driven by cutting-edge science and methodologies. You will play a crucial role in implementing and refining analyses related to wind data analysis, wind resource assessments, site design, wake modeling, and other intermediary analyses. The enhancements you make to these analyses will need to be seamlessly integrated into our codebase and thoroughly validated. This position is ideal for a detail-oriented scientist or engineer who is passionate about advancing the state of the art in wind farm design and development.

At Wind Pioneers, our vision is to lead the world in designing and evaluating new wind farm sites. We are dedicated to creating a set of tools, approaches, and processes that elevate the technical management of wind farm development to a significantly higher level of sophistication than standard industry practices. The company relies on its own software as a testing ground for innovative and advanced methodologies, providing you with a unique opportunity to be at the forefront of wind farm design and development.

Your role will involve driving improvements to our software from conceptualization to commercial deployment. This requires a deep understanding of scientific and engineering principles to implement new analytical approaches within our software stack and conduct comprehensive validation studies.

Key Responsibilities include two main areas:

A. Creating Scientific Services:
- Utilizing research findings to enhance the accuracy and efficiency of wind resource assessment processes by incorporating new technologies, methodologies, and data sources.
- Generating detailed technical reports, documentation, and presentations to effectively communicate research findings, tool developments, and project outcomes.
- Conducting research and development tasks, including validation studies.

B. Software Engineering:
- Assisting the development team in creating high-quality web applications for wind farm design.
- Engaging in data engineering using technologies like Postgres, BigQuery, Pub/Sub, and Terraform to build event-driven systems and data lakes, particularly for geospatial data.
- Leveraging Python and optionally Rust to develop and maintain performance analysis tools for designing and optimizing multi-GW scale wind farms.

Candidate Requirements:
- Enthusiasm for wind resources and the role of renewable energy in addressing climate change.
- Bachelor's or master's degree in a scientific or engineering discipline from a reputable institution. PhD holders are also encouraged to apply.
- 3-5 years of relevant experience, demonstrating independent work and initiative.
- Wind industry experience is preferred but not mandatory.
- Proficiency with Git and Git Flow is beneficial.
- Basic knowledge of software development and Python is advantageous.
- Excellent written English skills.
- International experience is desirable.
- Self-directed and proactive work approach.
- Excitement for working in a dynamic, high-growth startup environment.
- Positive attitude and passion for wind energy.

Wind Pioneers Offering:
- Join a focused team with a clear vision dedicated to revolutionizing wind farm project discovery and evaluation.
- Utilize Wind Pioneers' advanced in-house tools to design top-tier wind farms.
- Contribute to the development of Wind Pioneers' flagship tool while benefiting from using it as an end user.
- Learn and collaborate closely with our Product Architect and Senior Engineer.
- Enjoy a friendly and relaxed office atmosphere and team culture.
- Flexible working conditions.
- Competitive salary with the opportunity for a six-monthly bonus through Wind Pioneers' revenue share scheme.
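For readers unfamiliar with the wake modeling this posting mentions, the Jensen (Park) model is one of the simplest formulations of the velocity deficit downstream of a turbine. The sketch below is purely illustrative and is not necessarily the model Wind Pioneers uses; the coefficient values are typical textbook choices, not project data:

```python
from math import sqrt

def jensen_wake_deficit(ct, k, x, rotor_d):
    """Fractional velocity deficit a distance x downstream of a turbine
    under the Jensen (Park) wake model, given thrust coefficient ct,
    wake decay constant k, and rotor diameter rotor_d (same units as x)."""
    return (1 - sqrt(1 - ct)) / (1 + 2 * k * x / rotor_d) ** 2

# Deficit 500 m behind a 100 m rotor (ct = 0.8, onshore decay k = 0.075)
deficit = jensen_wake_deficit(ct=0.8, k=0.075, x=500, rotor_d=100)
print(round(deficit, 3))
```

With these inputs the deficit comes out to roughly 18%, i.e. a downstream turbine would see about 82% of the free-stream wind speed under this simple model; production wake models add superposition of multiple wakes, partial overlap, and turbulence effects.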
Posted 2 days ago
3.0 - 7.0 years
0 Lacs
pune, maharashtra
On-site
Join Zendesk as a Data Engineering Manager and lead a team of data engineers who deliver meticulously curated data assets to fuel business insights. Collaborate with Product Managers, Data Scientists, and Data Analysts to drive successful implementation of data products. We are seeking a leader with advanced skills in data infrastructure, data warehousing, and data architecture, as well as a proven track record of scaling BI teams. Be a part of our mission to embrace data and analytics and create a meaningful impact within our organization.

You will foster the growth and development of a team of data engineers; design, build, and launch new data models and pipelines in production; and act as a player-coach to amplify the effects of your team's work. You will build connections with diverse teams to understand data requirements, and develop and support your team in technical architecture, project management, and product knowledge. You will define processes for operational excellence in project management and system reliability, set direction for the team to anticipate strategic and scaling-related challenges, and foster a healthy and collaborative culture that embodies our values.

What You Bring to the Role:
- Bachelor's degree in Computer Science/Engineering or related field.
- 7+ years of proven experience in Data Engineering and Data Warehousing.
- 3+ years as a manager of data engineering teams.
- Proficiency with SQL and any programming language (Python/Ruby).
- Experience with Snowflake, BigQuery, Airflow, and dbt.
- Familiarity with BI tools (Looker, Tableau) is desirable.
- Proficiency in the modern data stack and architectural strategies.
- Excellent written and oral communication skills.
- Proven track record of coaching/mentoring individual contributors and fostering a culture valuing diversity.
- Experience leading SDLC and SCRUM/Agile delivery teams.
- Experience working with globally distributed teams preferred.
Tech Stack:
- SQL
- Python/Ruby
- Snowflake
- BigQuery
- Airflow
- dbt

Please note that this position requires being physically located in and working from Pune, Maharashtra, India.

Zendesk software was built to bring calm to the chaotic world of customer service. We advocate for digital-first customer experiences and strive to create a fulfilling and inclusive workplace experience. Our hybrid working model allows for connection, collaboration, and learning in person at our offices globally, as well as remote work flexibility. Zendesk endeavors to make reasonable accommodations for applicants with disabilities and disabled veterans. If you require an accommodation to participate in the hiring process, please email peopleandplaces@zendesk.com with your specific request.
Posted 2 days ago
10.0 - 15.0 years
15 - 25 Lacs
Chennai, Bengaluru
Hybrid
Role Overview
We're seeking a highly seasoned Solution Architect with deep expertise in Google Cloud Platform (GCP) and a proven track record in designing data and AI infrastructure tailored to AdTech use cases. You'll be pivotal in building scalable, performant, and privacy-compliant systems to support real-time bidding, campaign analytics, customer segmentation, and AI-driven personalization.

Key Responsibilities
- Architect and lead GCP-native solutions for AdTech: real-time bidding (RTB/OpenRTB), campaign analytics, lookalike modeling, and audience segmentation.
- Design high-throughput data pipelines, event-driven architectures, and unified audience data lakes leveraging GCP services: BigQuery, Dataflow, Pub/Sub, Cloud Storage, Dataproc, Cloud Composer (Airflow), Dataplex, Vertex AI/AutoML, Cloud Functions, Cloud Run, GKE, Looker, and Apigee.
- Collaborate with ad ops, marketing, and product stakeholders to translate business goals into architecture roadmaps; lead discovery workshops, solution assessments, and architecture reviews in pre-sales and delivery cycles.
- Integrate with third-party AdTech/MarTech platforms including DSPs, SSPs, CDPs, DMPs, ad exchanges, identity graphs, and consent/identity resolution systems (e.g., LiveRamp, The Trade Desk, Google Ads Data Hub).
- Ensure architecture aligns with GDPR, CCPA, IAB TCF, and data privacy regulations; support consent management, anonymization, encryption, and access controls.
- Lead multidisciplinary technical teams (Data Engineering, MLOps, Analytics); enforce best practices in data governance, CI/CD, and MLOps (via Cloud Build, Terraform, Kubeflow/Vertex AI pipelines).
- Mentor engineers, run architecture reviews, and define governance, cost optimization, security strategy, and system observability.
- Conduct hands-on prototyping and PoCs to validate AI/ML capabilities and enable rapid experimentation before full-scale implementation.
Tech Stack Expertise & Qualifications
- 15+ years in technical architecture, consulting, or senior engineering roles (preferably with the majority in data & analytics); at least 5+ years hands-on with GCP architectures.
- In-depth knowledge and hands-on experience of:
  - GCP data and analytics stack: BigQuery, Dataflow, Dataproc, Pub/Sub, Cloud Composer, Dataplex, Cloud Storage
  - AI/ML on GCP: Vertex AI, AI Platform, AutoML, model deployment, inference pipelines
  - Compute frameworks: Cloud Functions, Cloud Run, GKE, Apigee
  - Business intelligence and visualization: Looker
  - Infrastructure as code: Terraform; CI/CD pipelines: Cloud Build, Git-based workflows
- Skilled in Python and SQL; familiarity with Java or Scala is a plus.
- Experience designing event-driven architectures, streaming data pipelines, microservices, and API-based integrations.
- Proven AdTech domain expertise: programmatic advertising, RTB/OpenRTB, identity resolution, cookieless frameworks, DMP/CDP data flows.
- Proven experience with data governance, encryption, IAM, PII anonymization, and privacy-enhancing tech.
- Strong ability to code prototypes or PoCs to solve client challenges quickly, with high-quality architectural foundations.
- Excellent communication skills, able to clearly present complex designs to both technical and non-technical audiences.
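To make the RTB ingestion work above concrete, here is a minimal, stdlib-only sketch of flattening an OpenRTB-style bid request into the row shape an analytics table in BigQuery might hold. The field names (`id`, `imp`, `bidfloor`, `device.geo`) follow the OpenRTB object model; the sample payload and the output schema are hypothetical, not from any specific pipeline.

```python
import json

# Hypothetical OpenRTB-style bid request, as an RTB exchange might
# publish it to Pub/Sub before aggregation in BigQuery.
SAMPLE_BID_REQUEST = json.dumps({
    "id": "req-123",
    "imp": [{"id": "1", "bidfloor": 0.45, "bidfloorcur": "USD"}],
    "user": {"id": "u-789"},
    "device": {"geo": {"country": "IND"}},
})

def summarize_bid_request(raw: str) -> dict:
    """Flatten the fields a downstream analytics table typically needs."""
    req = json.loads(raw)
    imp = req.get("imp", [{}])[0]  # first impression object
    return {
        "request_id": req.get("id"),
        "bid_floor": imp.get("bidfloor", 0.0),
        "currency": imp.get("bidfloorcur", "USD"),
        "country": req.get("device", {}).get("geo", {}).get("country"),
    }

summary = summarize_bid_request(SAMPLE_BID_REQUEST)
```

A real pipeline would do this transform inside Dataflow/Beam at scale; the flattening logic itself is the same.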
Posted 2 days ago
2.0 - 6.0 years
8 - 12 Lacs
Gurugram
Work from Office
At American Express, our culture is built on a 175-year history of innovation, shared values and Leadership Behaviors, and an unwavering commitment to back our customers, communities, and colleagues. As part of Team Amex, you'll experience this powerful backing with comprehensive support for your holistic well-being and many opportunities to learn new skills, develop as a leader, and grow your career. Here, your voice and ideas matter, your work makes an impact, and together, you will help us define the future of American Express. Join Team Amex and let's lead the way together.

From building next-generation apps and microservices in Kotlin to using AI to help protect our franchise and customers from fraud, you could be doing entrepreneurial work that brings our iconic, global brand into the future. As a part of our tech team, we could work together to bring ground-breaking and diverse ideas to life that power our digital systems, services, products and platforms. If you love to work with APIs, contribute to open source, or use the latest technologies, we'll support you with an open environment and learning culture.

Function Description:
American Express is looking for energetic, successful and highly skilled Engineers to help shape our technology and product roadmap. Our Software Engineers not only understand how technology works, but how that technology intersects with the people who count on it every day. Today, innovative ideas, insight and new points of view are at the core of how we create a more powerful, personal and fulfilling experience for our customers and colleagues, with batch/real-time analytical solutions using ground-breaking technologies to deliver innovative solutions across multiple business units. This Engineering role is based in our Global Risk and Compliance Technology organization and will have a keen focus on platform modernization, bringing to life the latest technology stacks to support the ongoing needs of the business as well as compliance against global regulatory requirements.

Qualifications:
- Support the Compliance and Operations Risk data delivery team in India to lead and assist in the design and actual development of applications.
- Responsible for specific functional areas within the team; this involves project management and taking business specifications.
- The individual should be able to independently run projects/tasks delegated to them.

Technology Skills:
- Bachelor's degree in Engineering or Computer Science or equivalent; 2 to 5 years of experience is required.
- GCP Professional Data Engineer certification.
- Expert in the Google BigQuery tool for data warehousing needs.
- Experience on Big Data (Spark Core and Hive) preferred.
- Familiar with GCP offerings; experience building data pipelines on GCP a plus.
- Knowledge of Hadoop architecture (Hadoop, MapReduce, HBase) and UNIX shell scripting experience are good to have.
- Creative problem solving (innovative).

We back you with benefits that support your holistic well-being so you can be and deliver your best. This means caring for you and your loved ones' physical, financial, and mental health, as well as providing the flexibility you need to thrive personally and professionally:
- Competitive base salaries
- Bonus incentives
- Support for financial well-being and retirement
- Comprehensive medical, dental, vision, life insurance, and disability benefits (depending on location)
- Flexible working model with hybrid, onsite or virtual arrangements depending on role and business need
- Generous paid parental leave policies (depending on your location)
- Free access to global on-site wellness centers staffed with nurses and doctors (depending on location)
- Free and confidential counseling support through our Healthy Minds program
- Career development and training opportunities

American Express is an equal opportunity employer and makes employment decisions without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, veteran status, disability status, age, or any other status protected by law. Offer of employment with American Express is conditioned upon the successful completion of a background verification check, subject to applicable laws and regulations.
Posted 2 days ago
4.0 - 9.0 years
9 - 14 Lacs
Bengaluru
Work from Office
Job Posting Title: Sr. Data Scientist. Band/Level: 5-2-C. Education Experience: Bachelor's Degree (High School + 4 years). Employment Experience: 5-7 years.

At TE, you will unleash your potential working with people from diverse backgrounds and industries to create a safer, sustainable and more connected world.

Job Overview
Solves complex problems and helps stakeholders make data-driven decisions by leveraging quantitative methods, such as machine learning. It often involves synthesizing large volumes of information and extracting signals from data in a programmatic way.

Roles & Responsibilities
- Design, train, and evaluate supervised & unsupervised models (regression, classification, clustering, uplift).
- Apply automated hyperparameter optimization (Optuna, Hyperopt) and interpretability techniques (SHAP, LIME).
- Perform deep exploratory data analysis (EDA) to uncover patterns & anomalies.
- Engineer predictive features from structured, semi-structured, and unstructured data; manage feature stores (Feast).
- Ensure data quality through rigorous validation and automated checks.
- Build hierarchical, intermittent, and multi-seasonal forecasts for thousands of SKUs.
- Implement traditional (ARIMA, ETS, Prophet) and deep-learning (RNN/LSTM, Temporal Fusion Transformer) approaches.
- Reconcile forecasts across product/category hierarchies; quantify accuracy (MAPE, WAPE) and bias.
- Establish model tracking & registry (MLflow, SageMaker Model Registry).
- Develop CI/CD pipelines for automated retraining, validation, and deployment (Airflow, Kubeflow, GitHub Actions).
- Monitor data & concept drift; trigger retuning or rollback as needed.
- Design and analyze A/B tests, causal inference studies, and Bayesian experiments.
- Provide statistically grounded insights and recommendations to stakeholders.
- Translate business objectives into data-driven solutions; present findings to executive & non-technical audiences.
- Mentor junior data scientists, review code/notebooks, and champion best practices.
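The forecast-accuracy metrics named in the responsibilities (MAPE, WAPE) are worth pinning down, since they behave differently on intermittent SKU demand. A minimal stdlib-only Python sketch, with hypothetical demand numbers:

```python
# MAPE divides each error by its actual, so near-zero actuals blow it
# up; WAPE weights errors by total volume and is better behaved on
# intermittent demand. Values below are illustrative.
def mape(actuals, forecasts):
    """Mean absolute percentage error over non-zero actuals."""
    pairs = [(a, f) for a, f in zip(actuals, forecasts) if a != 0]
    return sum(abs(a - f) / abs(a) for a, f in pairs) / len(pairs)

def wape(actuals, forecasts):
    """Weighted absolute percentage error: total error over total volume."""
    total_error = sum(abs(a - f) for a, f in zip(actuals, forecasts))
    return total_error / sum(abs(a) for a in actuals)

actual = [100, 120, 80, 90]      # hypothetical weekly SKU demand
forecast = [110, 115, 85, 95]
mape_value = mape(actual, forecast)
wape_value = wape(actual, forecast)
```

On this toy series both land near 6%, but they diverge sharply once some actuals approach zero, which is why intermittent-demand work usually reports WAPE.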
Desired Candidate

Minimum Qualifications
- M.S. in Statistics (preferred) or a related field such as Applied Mathematics, Computer Science, or Data Science.
- 5+ years building and deploying ML models in production.
- Expert-level proficiency in Python (pandas, NumPy, SciPy, scikit-learn), SQL, and Git.
- Demonstrated success delivering large-scale demand-forecasting or time-series solutions.
- Hands-on experience with MLOps tools (MLflow, Kubeflow, SageMaker, Airflow) for model tracking and automated retraining.
- Solid grounding in statistical inference, hypothesis testing, and experimental design.

Preferred / Nice-to-Have
- Experience in supply-chain, retail, or manufacturing domains with high-granularity SKU data.
- Familiarity with distributed data frameworks (Spark, Dask) and cloud data warehouses (BigQuery, Snowflake).
- Knowledge of deep-learning libraries (PyTorch, TensorFlow) and probabilistic programming (PyMC, Stan).
- Strong data-visualization skills (Plotly, Dash, Tableau) for storytelling and insight communication.

ABOUT TE CONNECTIVITY
TE Connectivity plc (NYSE: TEL) is a global industrial technology leader creating a safer, sustainable, productive, and connected future. Our broad range of connectivity and sensor solutions enable the distribution of power, signal and data to advance next-generation transportation, energy networks, automated factories, data centers, medical technology and more. With more than 85,000 employees, including 9,000 engineers, working alongside customers in approximately 130 countries, TE ensures that EVERY CONNECTION COUNTS. Learn more at www.te.com and on LinkedIn, Facebook, WeChat, Instagram and X (formerly Twitter).

WHAT TE CONNECTIVITY OFFERS:
We are pleased to offer you an exciting total package that can also be flexibly adapted to changing life situations - the well-being of our employees is our top priority!
- Competitive Salary Package
- Performance-Based Bonus Plans
- Health and Wellness Incentives
- Employee Stock Purchase Program
- Community Outreach Programs / Charity Events

IMPORTANT NOTICE REGARDING RECRUITMENT FRAUD
TE Connectivity has become aware of fraudulent recruitment activities being conducted by individuals or organizations falsely claiming to represent TE Connectivity. Please be advised that TE Connectivity never requests payment or fees from job applicants at any stage of the recruitment process. All legitimate job openings are posted exclusively on our official careers website at te.com/careers, and all email communications from our recruitment team will come only from actual email addresses ending in @te.com. If you receive any suspicious communications, we strongly advise you not to engage or provide any personal information, and to report the incident to your local authorities.

Across our global sites and business units, we put together packages of benefits that are either supported by TE itself or provided by external service providers. In principle, the benefits offered can vary from site to site.
Posted 2 days ago
2.0 - 6.0 years
12 - 16 Lacs
Bengaluru
Work from Office
Job Description
Some careers have more impact than others. If you're looking for a career where you can make a real impression, join HSBC and discover how valued you'll be.

HSBC is one of the largest banking and financial services organizations in the world, with operations in 62 countries and territories. We aim to be where the growth is, enabling businesses to thrive and economies to prosper, and, ultimately, helping people to fulfil their hopes and realize their ambitions.

We are currently seeking an experienced professional to join our team in the role of Senior Analyst, Business Consulting.

Principal Responsibilities
- The role is specific to the Risk Trigger team, which works with the Banking & Financial Crime Risk (BFCR) business to monitor and flag risk associated in the area.
- Maintain and deliver existing triggers, with a focus on identifying and working on new risk areas.
- Adoption of the latest CIB D&A platform and tooling (SQL/Python/PySpark/DSW etc.).
- Automation of the existing book of work.
- Provide support for analytical/strategic projects.
- Work with the wider Wholesale Business Risk (WBR) team to support ongoing initiatives and POCs.

The jobholder will:
- Provide analytical support and timely delivery of insights to the Banking & Financial Crime Risk (BFCR) business to monitor and flag risk associated in the area.
- Perform exploratory data analysis and data quality checks, and apply basic statistics to help in data-driven decision making.
- Conduct post-implementation reviews of ongoing projects/triggers.
- Perform trend analysis and dashboard creation based on visualization techniques.
- Participate in projects leading to solutions using various analytic techniques.
- Execute the assigned projects/analysis as per the agreed timelines and with accuracy and quality.
- Produce high-quality data and reports which support process improvements, decision-making and achievement of performance targets across the respective business areas.
- Complete analysis as required, document results, and formally present findings to management.
- Develop and execute Business Intelligence/analytical initiatives in line with the objectives laid down by the business, using different structured and unstructured data sources.

Requirements
- Basic data & analytics experience or equivalent; knowledge and understanding of financial services preferred.
- Bachelor's or Master's degree from a reputed university in Maths/Stats or another numerical discipline, with a concentration in Computer Science or other fields such as engineering.
- Familiarity with analytic systems, with SQL and Python skills.
- Strong Microsoft suite skills; Qlik Sense, BigQuery and GCP knowledge.
- Strong analytical skills and detail oriented; understands basic data quality management principles.
- Good communication skills, both written and spoken; ability to develop and effectively communicate complex concepts and ideas.
- Ability to work in cross-functional teams; strong interpersonal skills and drive for success.
- Independent worker with high drive who can support global teams with demanding work hours.
- Analytical thought process and aptitude for problem solving.
- Understands business requirements well enough to produce, automate, analyze and interpret analysis reports and support compliance and regulatory requirements.
- Responsible for effectively delivering projects within timelines and at the desired quality level.

You'll achieve more at HSBC. HSBC is an equal opportunity employer committed to building a culture where all employees are valued, respected and opinions count. We take pride in providing a workplace that fosters continuous professional development, flexible working and opportunities to grow within an inclusive and diverse environment. We encourage applications from all suitably qualified persons irrespective of, but not limited to, their gender or genetic information, sexual orientation, ethnicity, religion, social status, medical care leave requirements, political affiliation, people with disabilities, color, national origin, veteran status, etc. We consider all applications based on merit and suitability to the role.

Personal data held by the Bank relating to employment applications will be used in accordance with our Privacy Statement, which is available on our website.

Issued By HSBC Electronic Data Processing (India) Private LTD
Posted 2 days ago
5.0 - 10.0 years
8 - 12 Lacs
Hyderabad
Work from Office
Senior Site Reliability Engineer - JD
As a Senior Site Reliability Engineer (SRE), you will collaborate closely with our Development and IT teams to ensure the reliability, scalability, and performance of our applications. You will take ownership of setting and maintaining service-level objectives (SLOs), building robust monitoring and alerting, and continually improving our infrastructure and processes to maximize uptime and deliver exceptional customer experience. This role operates at the intersection of development and operations, reinforcing best practices, automating solutions, and reducing toil across systems and platforms.

About QualMinds:
QualMinds is a global technology company dedicated to empowering clients on their digital transformation journey. We help our clients design & develop world-class digital products, custom software and platforms. Our primary focus is delivering enterprise-grade interactive software applications across web, desktop, mobile, and embedded platforms.

Responsibilities:
1. Ensure Reliability & Performance: Own the observability of our systems, ensuring they meet established service-level objectives (SLOs) and maintain high availability.
2. Cloud & Container Orchestration: Deploy, configure, and manage resources on Google Cloud Platform (GCP) and Google Kubernetes Engine (GKE), focusing on secure and scalable infrastructures.
3. Infrastructure Automation & Tooling: Set up and maintain automated build and deployment pipelines; drive continuous improvements to reduce manual work and risks.
4. Monitoring & Alerting: Develop and refine comprehensive monitoring solutions (performance, uptime, error rates, etc.) to detect issues early and minimize downtime.
5. Incident Management & Troubleshooting: Participate in on-call rotations; manage incidents through resolution, investigate root causes, and create blameless postmortems to prevent recurrences.
6. Collaboration with Development: Partner with development teams to design and release services that are production-ready from day one, emphasizing reliability, scalability, and performance.
7. Security & Compliance: Integrate security best practices into system design and operations; maintain compliance with SOC 2 and other relevant standards.
8. Performance & Capacity Planning: Continuously assess system performance and capacity; propose and implement improvements to meet current and future demands.
9. Technical Evangelism: Contribute to cultivating a culture of reliability through training, documentation, and mentorship across the organization.

Requirements:
- Bachelor's degree in Computer Science, Business Administration, or relevant work experience.
- A minimum of 5+ years in an SRE, DevOps, or similar role in an IT environment, required.
- Hands-on experience with Microsoft SQL clusters, Elasticsearch, and Kubernetes, required.
- Deep familiarity with Windows or Linux environments and .NET or PHP stack applications, including IIS/Apache, SQL Server/MySQL, etc.
- Strong understanding of networking, firewalls, intrusion detection, and security best practices.
- Proven administrative experience with tools like Git, TFS, Bitbucket, and Bamboo for Continuous Integration, Delivery, and Deployment.
- Knowledge of automation testing tools such as SonarQube, Selenium, or comparable technologies.
- Experience with performance profiling, logging, metrics collection, and alerting tools.
- Competence in debugging solutions across diverse environments.
- Hands-on experience with GCP, AWS, or Azure, container orchestration (Kubernetes), and microservices-based architectures.
- Understanding of authentication, authorization, OAuth, SAML, encryption (public/private key, symmetric, asymmetric), token validation, and SSO.
- Familiarity with security strategies to optimize performance while maintaining compliance (e.g., SOC 2).
- Willingness to participate in an on-call rotation and respond to system emergencies 24/7 when necessary; monthly weekend rotation for production patching.
- A+, MCP, Dell certifications and Microsoft Office expertise are a plus!
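The SLO ownership described above usually reduces to simple error-budget arithmetic: a 99.9% availability target over a 30-day window allows roughly 43 minutes of downtime. A minimal sketch, with illustrative numbers:

```python
# Error-budget arithmetic behind an availability SLO: the budget is the
# fraction of the window the service is allowed to be down. Inputs here
# (99.9% target, 30-day window, 12 minutes of observed downtime) are
# illustrative, not from any real service.
def remaining_error_budget_minutes(slo: float, window_days: int,
                                   downtime_minutes: float) -> float:
    """Minutes of downtime still allowed in the window."""
    total_minutes = window_days * 24 * 60
    budget = total_minutes * (1 - slo)
    return budget - downtime_minutes

budget_left = remaining_error_budget_minutes(
    slo=0.999, window_days=30, downtime_minutes=12
)
```

A 99.9% SLO over 30 days gives a 43.2-minute budget, so 12 minutes of downtime leaves about 31 minutes; burning the budget faster than the window elapses is the usual trigger for freezing risky releases.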
Posted 2 days ago
5.0 - 10.0 years
0 - 0 Lacs
Hyderabad, Pune, Bengaluru
Hybrid
Mandatory Skills: GCP Storage, GCP BigQuery, GCP DataProc, GCP Cloud Composer, GCP DMS, Apache Airflow, Java, Python, Scala, GCP Datastream, Google Analytics Hub, GCP Workflows, GCP Dataform, GCP Datafusion, GCP Pub/Sub, ANSI SQL, GCP Dataflow, Big Data Hadoop Ecosystem

Good to Have Skills: Apache Airflow, Python, Scala
Posted 2 days ago
2.0 - 5.0 years
11 - 14 Lacs
Bengaluru
Work from Office
Note: By applying to this position you will have an opportunity to share your preferred working location from the following: Bengaluru, Karnataka, India; Gurugram, Haryana, India.

Minimum qualifications:
- Bachelor's degree or equivalent practical experience.
- 4 years of experience in working with C-level executives and cross-functionally across all levels of management.

Preferred qualifications:
- Master's degree in Business Administration or a related field.
- Experience in brand/retail category management, retail business, the gift card segment, payments or digital content.
- Experience in managing data sets and working with Salesforce, SQL queries, Visual Basic, Google Apps Script, BigQuery.
- Knowledge of the payment landscape in India with digital content consumption and the mobile gaming industry.
- Ability to engage with cross-functional leadership and communicate across a changing team.
- Ability to analyze and synthesize performance data and drive towards insights.

About the job
In this role, you will work with retailers, distribution and payment partners to continue to build Google Play's gift card/recharge code and emerging payments businesses. You will work not just with external partners, but also across the Play cross-functional teams in-country and within the region. You will balance multiple priorities, develop and execute marketing plans, work with internal and external partners, and analyze data to inform decisions. You will engage with stakeholders, providing essential analysis for planning, decision-making, and performance management. You will require investigative skills, meticulous attention to detail, and the ability to communicate across all levels of the business.

Google Play offers music, movies, books, apps and games for devices, powered by the cloud. It syncs across devices and on the web. As part of the Android and Mobile team, Googlers working on Google Play do everything from engineering our backend systems, to shaping product strategy, to forming great content partnerships. They make it possible for people to do things like buy an ebook or song on their Android phone, then have it instantly available on their laptop. The Google Play team enhances the Android ecosystem by giving developers and partners a premium store where they can reach millions of users.

Responsibilities
- Assess and evaluate the payment landscape in India and propose/identify opportunities to increase payment adoption, drive consumer spend and new paying users.
- Manage partnerships to drive business growth (e.g., gift cards/recharge codes) in payment partner channels, including physical and digital retailers.
- Build and execute against go-to-market strategies for new product launches, including implementing promotional strategies, analyzing data across merchants, product, promotion type and other variables to optimize performance, and identifying and presenting back on new payment trends.
- Identify and implement solutions to streamline reporting processes, prepare and stage data for planning, and perform analysis to deliver data-motivated recommendations.
- Partner cross-functionally across marketing and business operations to execute against cross-functional plans, including amplifying major developer moments and major Play moments and launching new Play products.

Google is proud to be an equal opportunity workplace and is an affirmative action employer. We are committed to equal employment opportunity regardless of race, color, ancestry, religion, sex, national origin, sexual orientation, age, citizenship, marital status, disability, gender identity or Veteran status. We also consider qualified applicants regardless of criminal histories, consistent with legal requirements. See also Google's EEO Policy and EEO is the Law. If you have a disability or special need that requires accommodation, please let us know by completing our Accommodations for Applicants form.
Posted 2 days ago
7.0 - 12.0 years
15 - 30 Lacs
Hyderabad, Pune, Gurugram
Work from Office
We are Hiring: Senior GCP Data Engineer
Location: Hyderabad or Gurugram, India
Experience Required: 8+ Years
Company: GSPANN Technologies
Apply: Send your resume to heena.ruchwani@gspann.com

Job Description & Responsibilities:
GSPANN Technologies is seeking a highly skilled Senior GCP Data Engineer with strong expertise in database development and cloud-native data engineering. The ideal candidate will have hands-on experience in GCP environments and a passion for building scalable data solutions.

Required Skills:
- 8+ years of experience in database development and support in GCP environments
- Proficiency in BigQuery SQL, stored procedures, and debugging
- Strong understanding of indexing, integrity checks, configuration, patching, and statistics
- Advanced SQL development skills (stored procedures, functions, tables, views, triggers, indexes, constraints)
- Experience building cloud-native data pipelines using tools like Dagster
- Solid documentation skills
- Experience in client-facing projects
- Python knowledge is a plus

Responsibilities:
- Perform technical analysis and implement customized applications
- Participate in Agile ceremonies: scrums, sprint planning, reviews, demos, retrospectives
- Provide accurate effort estimates for solutions and fixes
- Communicate proactively with stakeholders and support production activities
- Suggest improvements for system productivity, scaling, and monitoring
- Manage deployments using recommended tools and methodologies
- Follow coding best practices and conduct code reviews
- Share regular updates and status reports with management
- Collaborate effectively with team members and cross-functional teams
Posted 2 days ago
6.0 - 11.0 years
30 - 40 Lacs
Hyderabad, Bengaluru
Hybrid
Job Summary:
We are looking for a skilled Full Stack Developer with strong expertise in Java, React, Progressive Web Applications (PWA), and SQL/BigQuery. The ideal candidate should have solid experience in Test-Driven Development (TDD) and hands-on skills in JUnit, writing unit and integration test cases. Exposure to cloud-based analytics, AI/ML tools, and low-code platforms will be an added advantage.

Key Responsibilities:
- Develop robust backend services using Java, integrating with relational and cloud-based databases.
- Build modern and responsive frontend applications using React and Progressive Web App (PWA) technologies.
- Write clean, maintainable, and well-tested code using TDD practices.
- Design and implement unit and integration test cases using JUnit.
- Collaborate with data teams to query and analyze large datasets using SQL and BigQuery.
- Integrate and visualize data with tools like Looker.
- Ensure performance, scalability, and reliability of the application.
- Collaborate in Agile/Scrum development environments and participate in code reviews and design discussions.

Must-Have Skills:
- Strong hands-on experience with Java (Spring Boot preferred).
- Expertise in React.js and component-based frontend development.
- Experience building Progressive Web Applications (PWA).
- Proficiency in SQL and working knowledge of Google BigQuery.
- Solid understanding and experience in Test-Driven Development (TDD).
- Strong hands-on knowledge of JUnit.
- Experience writing unit tests and integration tests, and understanding of test coverage best practices.
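The TDD practice this role centers on is language-agnostic: write the failing test first, then the minimal code that passes it. The posting's stack is Java/JUnit; for brevity the same red-green rhythm is sketched here with Python's built-in unittest, and `parse_price` is a hypothetical helper invented for the example.

```python
import unittest

# Hypothetical unit under test: in TDD, the two test cases below would
# be written first, fail, and then drive this implementation.
def parse_price(raw: str) -> float:
    """Parse a price string like '1,299.50' into a float."""
    return float(raw.replace(",", ""))

class ParsePriceTest(unittest.TestCase):
    def test_plain_number(self):
        self.assertEqual(parse_price("42"), 42.0)

    def test_thousands_separator(self):
        self.assertEqual(parse_price("1,299.50"), 1299.5)

# Run the suite programmatically rather than via unittest.main(),
# so the result object can be inspected.
suite = unittest.TestLoader().loadTestsFromTestCase(ParsePriceTest)
result = unittest.TextTestRunner(verbosity=0).run(suite)
```

In JUnit the shape is the same: `@Test` methods with `assertEquals`, run by the test runner on every build, which is what makes refactoring safe.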
Posted 2 days ago
2.0 - 5.0 years
6 - 10 Lacs
Pune
Work from Office
Introducing Thinkproject Platform
Pioneering a new era and offering a cohesive alternative to the fragmented landscape of construction software, Thinkproject seamlessly integrates the most extensive portfolio of mature solutions with an innovative platform, providing unparalleled features, integrations, user experiences, and synergies. By combining information management expertise and in-depth knowledge of the building, infrastructure, and energy industries, Thinkproject empowers customers to efficiently deliver, operate, regenerate, and dispose of their built assets across their entire lifecycle through a Connected Data Ecosystem.

We are seeking a hands-on Applied Machine Learning Engineer to join our team and lead the development of ML-driven insights from historical data in our contracts management, assets management and common data platform. This individual will work closely with our data engineering and product teams to design, develop, and deploy scalable machine learning models that can parse, learn from, and generate value from both structured and unstructured contract data. You will use BigQuery and its ML capabilities (including SQL and Python integrations) to prototype and productionize models across a variety of NLP and predictive analytics use cases. Your work will be critical in enhancing our platform's intelligence layer, including search, classification, recommendations, and risk detection.

What your day will look like

Key Responsibilities
- Model Development: Design and implement machine learning models using structured and unstructured historical contract data to support intelligent document search, clause classification, metadata extraction, and contract risk scoring.
- BigQuery ML Integration: Build, train, and deploy ML models directly within BigQuery using SQL and/or Python, leveraging native GCP tools (e.g., Vertex AI, Dataflow, Pub/Sub).
- Data Preprocessing & Feature Engineering: Clean, enrich, and transform raw data (e.g., legal clauses, metadata, audit trails) into model-ready features using scalable and efficient pipelines.
- Model Evaluation & Experimentation: Conduct experiments, model validation, and A/B testing, and iterate based on precision, recall, F1-score, RMSE, etc.
- Deployment & Monitoring: Operationalize models in production environments with monitoring, retraining pipelines, and CI/CD best practices for ML (MLOps).
- Collaboration: Work cross-functionally with data engineers, product managers, legal domain experts, and frontend teams to align ML solutions with product needs.

What you need to fulfill the role

Skills And Experience
- Education: Bachelor's or Master's degree in Computer Science, Machine Learning, Data Science, or a related field.
- ML Expertise: Strong applied knowledge of supervised and unsupervised learning, classification, regression, clustering, feature engineering, and model evaluation.
- NLP Experience: Hands-on experience working with textual data, especially in NLP use cases like entity extraction, classification, and summarization.
- GCP & BigQuery: Proficiency with Google Cloud Platform, especially BigQuery and BigQuery ML; comfort querying large-scale datasets and integrating with external ML tooling.
- Programming: Proficient in Python and SQL; familiarity with libraries such as scikit-learn, TensorFlow, PyTorch, and Keras.
- MLOps Knowledge: Experience with model deployment, monitoring, versioning, and ML CI/CD best practices.
- Data Engineering Alignment: Comfortable working with data pipelines and tools like Apache Beam, Dataflow, Cloud Composer, and pub/sub systems.
- Version Control: Strong Git skills and experience collaborating in Agile teams.

Preferred Qualifications
- Experience working with contractual or legal text datasets.
- Familiarity with document management systems, annotation tools, or enterprise collaboration platforms.
- Exposure to Vertex AI, LangChain, RAG-based retrieval, or embedding models for Gen AI use cases.
- Comfortable working in a fast-paced, iterative environment
with changing priorities, What we offer Lunch 'n' Learn Sessions I Women's Network I LGBTQIA+ Network I Coffee Chat Roulette I Free English Lessons I Thinkproject Academy I Social Events I Volunteering Activities I Open Forum with Leadership Team (Tp Caf) I Hybrid working I Unlimited learning We are a passionate bunch here To join Thinkproject is to shape what our company becomes We take feedback from our staff very seriously and give them the tools they need to help us create our fantastic culture of mutual respect We believe that investing in our staff is crucial to the success of our business, Your contact: Mehal Mehta Please submit your application, including salary expectations and potential date of entry, by submitting the form on the next page, Working at thinkproject think career think ahead, Show
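The clause-classification work described in this posting would normally be built with BigQuery ML or Scikit-learn; purely as an illustration of the underlying idea, here is a minimal bag-of-words Naive Bayes classifier in plain Python. The tiny clause dataset and the label names are invented for the example, not taken from any real contract corpus.

```python
import math
from collections import Counter, defaultdict

def tokenize(text):
    # Lowercase and keep alphabetic tokens only.
    return [t for t in text.lower().split() if t.isalpha()]

def train(samples):
    """samples: list of (clause_text, label) pairs. Returns a model dict."""
    word_counts = defaultdict(Counter)   # label -> word frequencies
    label_counts = Counter()
    for text, label in samples:
        label_counts[label] += 1
        word_counts[label].update(tokenize(text))
    vocab = {w for counts in word_counts.values() for w in counts}
    return {"words": word_counts, "labels": label_counts, "vocab": vocab}

def classify(model, text):
    """Multinomial Naive Bayes with Laplace smoothing; returns the best label."""
    total = sum(model["labels"].values())
    best_label, best_score = None, float("-inf")
    for label, n in model["labels"].items():
        score = math.log(n / total)
        denom = sum(model["words"][label].values()) + len(model["vocab"])
        for w in tokenize(text):
            score += math.log((model["words"][label][w] + 1) / denom)
        if score > best_score:
            best_label, best_score = label, score
    return best_label

# Invented miniature training set of contract clauses.
samples = [
    ("payment shall be due within thirty days of invoice", "payment"),
    ("the contractor shall invoice monthly for work performed", "payment"),
    ("either party may terminate this agreement with notice", "termination"),
    ("termination for convenience requires ninety days notice", "termination"),
]
model = train(samples)
print(classify(model, "invoice payment due in sixty days"))       # payment
print(classify(model, "notice of termination by either party"))   # termination
```

In production, the same per-class word statistics would be learned at scale by a model such as BigQuery ML's logistic regression over TF-IDF-style features rather than computed by hand.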
Posted 2 days ago
9.0 - 14.0 years
7 - 14 Lacs
Hyderabad, Pune
Hybrid
Role & responsibilities

Key skills required:
- 8+ years of hands-on experience in cloud application architecture with a focus on creating scalable and reliable software systems.
- 8+ years of experience using Google Cloud Platform (GCP), including but not restricted to services like BigQuery, Cloud SQL, Firestore, and Cloud Composer.
- Experience with security, identity and access management; networking protocols such as TCP/IP and HTTPS; network security design including segmentation, encryption, logging, and monitoring; network topologies, load balancing, and segmentation.
- Python for REST APIs and microservices; design and development guidance for Python with GCP, Cloud SQL/PostgreSQL, and BigQuery.
- Integration of Python APIs with front-end applications built on React JS.
- Unit testing frameworks: Python (unittest, pytest); Java (JUnit, Spock, Groovy).
- DevOps automation processes such as Jenkins and Docker deployments; code deployments on VMs.
- Validating an overall solution from the perspective of infrastructure performance, scalability, security, and capacity, and creating effective mitigation plans.
- Automation technologies: Terraform or Google Cloud Deployment Manager, Ansible.
- Implementing solutions and processes to manage cloud costs.
- Experience providing solutions for web applications; requirements and design knowledge: React JS, Elastic Cache, GCP IAM, Managed Instance Groups, VMs, and GKE.
- Owning the end-to-end delivery of solutions, including developing, testing, and releasing Infrastructure as Code.
- Translating business requirements/user stories into practical, scalable solutions that leverage the functionality and best practices of HSBC.
- Executing technical feasibility assessments, solution estimations, and proposal development for moving identified workloads to GCP.
- Designing and implementing secure, scalable, and innovative solutions to meet the bank's requirements.
- Ability to interact and influence across all organizational levels on technical or business solutions.
- Certified Google Cloud Architect would be an add-on.
- Creating and owning scaling, capacity planning, configuration management, and monitoring of processes and procedures.
- Creating, putting into practice, and using cloud-native solutions.
- Leading the adoption of new cloud technologies and establishing best practices for them.
- Experience establishing technical strategy and architecture at the enterprise level.
- Experience leading GCP cloud project delivery.
- Collaborating with IT security to monitor cloud privacy, and with architecture, DevOps, data, and integration teams to ensure best practices are followed throughout cloud adoption.
- Responding to technical issues and providing guidance to the technical team.

Mandatory Skills: GCP Storage, GCP BigQuery, GCP DataProc, GCP Vertex AI, GCP Spanner, GCP Dataprep, GCP Datastream, Google Analytics Hub, GCP Dataform, GCP Dataplex/Catalog, GCP Cloud Datastore/Firestore, GCP Datafusion, GCP Pub/Sub, GCP Cloud SQL, GCP Cloud Composer, Google Looker, GCP Data Architecture, Google Cloud IAM, GCP Bigtable, GCP Dataflow
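The posting above asks for pytest-style unit testing and for processes to manage cloud costs. As a small illustration of how those two concerns meet in practice, here is a sketch of a cost-attribution label checker with pytest-style tests (plain functions using bare asserts). The label policy, the function names, and the required keys are all hypothetical examples, not a real HSBC or GCP policy.

```python
# Hypothetical label policy for cost attribution: every resource must carry
# "team" and "cost-center" labels, and GCP label conventions favor lowercase.
REQUIRED_LABELS = {"team", "cost-center"}

def missing_labels(labels):
    """Return the required label keys absent from a resource's labels dict."""
    return sorted(REQUIRED_LABELS - set(labels))

def normalize_labels(labels):
    """Lowercase keys and values before applying them to a resource."""
    return {k.lower(): v.lower() for k, v in labels.items()}

# pytest-style tests: plain test_* functions with bare asserts, which pytest
# discovers automatically; no framework classes required.
def test_missing_labels_reports_gaps():
    assert missing_labels({"team": "data-eng"}) == ["cost-center"]

def test_missing_labels_passes_complete_resources():
    assert missing_labels({"team": "x", "cost-center": "y"}) == []

def test_normalize_labels_lowercases():
    assert normalize_labels({"Team": "Data-Eng"}) == {"team": "data-eng"}

if __name__ == "__main__":
    test_missing_labels_reports_gaps()
    test_missing_labels_passes_complete_resources()
    test_normalize_labels_lowercases()
    print("all tests passed")
```

With pytest installed, the same file runs via `pytest thisfile.py` with no changes; the `__main__` block is only a convenience for running it directly.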
Posted 2 days ago
6.0 - 8.0 years
15 - 20 Lacs
Pune, Chennai, Bengaluru
Work from Office
Position Description

At CGI, we're a team of builders. We call our employees members because all who join CGI are building their own company - one that has grown to 72,000 professionals located in 40 countries. Founded in 1976, CGI is a leading IT and business process services firm committed to helping clients succeed. We have the global resources, expertise, stability and dedicated professionals needed to achieve results for our clients - and for our members. Come grow with us. Learn more at www.cgi.com.

This is a great opportunity to join a winning team. CGI offers a competitive compensation package with opportunities for growth and professional development. Benefits for full-time, permanent members start on the first day of employment and include a paid time-off program and profit participation and stock purchase plans.

We wish to thank all applicants for their interest and effort in applying for this position; however, only candidates selected for interviews will be contacted. No unsolicited agency referrals please.

Job Title: Big Data Developer
Location: Bangalore / Hyderabad / Pune / Chennai
Experience: 6-8 Years
Category: Software Development / Engineering
Employment Type: Full Time

Your future duties and responsibilities:
• Design and develop scalable data engineering solutions using Google Cloud Platform (GCP) and PySpark.
• Optimize Spark jobs for performance, scalability, and efficient resource utilization.
• Develop, maintain, and enhance ETL pipelines using BigQuery, Apache Airflow, and Cloud Composer.
• Collaborate with data scientists, analysts, and DevOps teams to translate business requirements into technical solutions.
• Ensure data integrity and security by implementing data governance, compliance, and security best practices.
• Monitor production workloads, troubleshoot performance issues, and implement enhancements.
• Implement and enforce coding standards, best practices, and performance tuning strategies.
• Support migration activities from on-premises data warehouses to GCP-based solutions.
• Mentor junior developers and contribute to knowledge-sharing within the team.
• Stay up to date with emerging cloud technologies, tools, and best practices in the data engineering ecosystem.

Required qualifications to be successful in this role:
Skills: 5 years of experience in Big Data, Hadoop, Spark, Python, PySpark, Hive, SQL
Location: Pune / Bangalore / Chennai / Hyderabad
Education: BE / B.Tech / MCA / BCA
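The ETL pipeline work described above boils down to filter-then-aggregate transforms. As a sketch of that shape, here is the logic in plain stdlib Python; in PySpark the same step would read roughly as `df.filter(col("amount") >= 0).groupBy("region").agg(sum("amount"))`. The sample rows and the negative-amount quality rule are invented for the illustration.

```python
from collections import defaultdict

# Small in-memory stand-in for a table of sales records.
rows = [
    {"region": "south", "amount": 120.0},
    {"region": "north", "amount": 75.5},
    {"region": "south", "amount": -10.0},   # bad record: negative amount
    {"region": "north", "amount": 24.5},
]

def clean(rows):
    # Drop records that fail a basic data-quality check.
    return [r for r in rows if r["amount"] >= 0]

def revenue_by_region(rows):
    # Equivalent of groupBy("region").agg(sum("amount")).
    totals = defaultdict(float)
    for r in rows:
        totals[r["region"]] += r["amount"]
    return dict(totals)

print(revenue_by_region(clean(rows)))  # {'south': 120.0, 'north': 100.0}
```

The point of Spark (or BigQuery) is that the same declarative filter/group/aggregate plan runs distributed over data far too large for one machine, while the logic stays this simple.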
Posted 2 days ago
BigQuery, a powerful cloud-based data warehouse provided by Google Cloud, is in high demand in the job market in India. Companies are increasingly relying on BigQuery to analyze and manage large datasets, driving the need for skilled professionals in this area.
The average salary range for BigQuery professionals in India varies based on experience level. Entry-level positions may start at around INR 4-6 lakhs per annum, while experienced professionals can earn upwards of INR 15-20 lakhs per annum.
In the field of BigQuery, a typical career progression may include roles such as Junior Developer, Developer, Senior Developer, Tech Lead, and eventually moving into managerial positions such as Data Architect or Data Engineering Manager.
Alongside BigQuery, professionals in this field often benefit from having skills in SQL, data modeling, data visualization tools like Tableau or Power BI, and cloud platforms like Google Cloud Platform or AWS.
As you explore opportunities in the BigQuery job market in India, remember to continuously upskill and stay updated with the latest trends in data analytics and cloud computing. Prepare thoroughly for interviews by practicing common BigQuery concepts and showcase your hands-on experience with the platform. With dedication and perseverance, you can excel in this dynamic field and secure rewarding career opportunities. Good luck!
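One BigQuery concept worth practicing for interviews is window functions, e.g. `SUM(amount) OVER (PARTITION BY customer ORDER BY day)`. A quick way to internalize the semantics is to reproduce a per-partition running total in plain Python; the sample data here is invented.

```python
from itertools import groupby

# Rows as (customer, day, amount). A running total per customer mirrors
# SUM(amount) OVER (PARTITION BY customer ORDER BY day) in BigQuery SQL.
rows = [
    ("alice", 1, 10), ("alice", 2, 5), ("bob", 1, 7), ("bob", 3, 3),
]

def running_totals(rows):
    out = []
    ordered = sorted(rows)  # sort by (customer, day): partition, then order
    for _, group in groupby(ordered, key=lambda r: r[0]):
        total = 0  # the window "resets" at each partition boundary
        for customer, day, amount in group:
            total += amount
            out.append((customer, day, total))
    return out

print(running_totals(rows))
# [('alice', 1, 10), ('alice', 2, 15), ('bob', 1, 7), ('bob', 3, 10)]
```

The key intuition this captures: the PARTITION BY clause scopes the accumulator, and the ORDER BY clause fixes the order in which rows enter it.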