
3228 Looker Jobs - Page 19

Set up a Job Alert
JobPe aggregates listings for easy access, but you apply directly on the original job portal.

5.0 years

0 Lacs

Mumbai, Maharashtra, India

On-site

Business/Data Analyst

Role Overview:
As a Business/Data Analyst, you will be the analytical backbone of our Academics team. Your role goes beyond numbers: you will uncover insights, build intelligent dashboards, improve academic processes, and enable data-driven decision-making across functions. With a deep understanding of data management and visualization, you will drive performance, identify problems, and shape the strategic direction of academic operations at scale.

What You'll Do:
• Build Dashboards & Reporting Tools: Develop real-time dashboards and automated reports to track key academic metrics, performance indicators, and team-level productivity.
• Uncover Insights: Analyze large volumes of academic data to identify trends, problem areas, and opportunities for improvement in student engagement, outcomes, and curriculum effectiveness.
• Data Management: Set up and maintain clean, organized databases, ensuring data accuracy, completeness, and consistency across platforms.
• Process Optimization: Deep-dive into existing academic workflows to identify inefficiencies, design better processes, and monitor improvements through measurable KPIs.
• Visualization & Storytelling: Transform complex data into intuitive visualizations and presentations that help teams understand performance and take timely action.
• Cross-functional Collaboration: Work closely with the Product, Tech, and Academic Operations teams to ensure data alignment and to integrate insights into everyday decision-making.

What Success Looks Like:
• Clear, accessible dashboards in use across academic teams for decision-making.
• Actionable insights delivered regularly, leading to measurable improvements in academic outcomes.
• Improved academic processes, measured by reduced inefficiencies and improved student performance metrics.
• High stakeholder satisfaction with reporting accuracy, accessibility, and utility.
• Increased data maturity within the Academics team, moving from intuition-led to insight-led decision-making.

What We're Looking For:
• 2–5 years of experience as a business analyst, data analyst, or in a similar role involving data-driven problem-solving.
• Strong proficiency in Google Sheets is a must, including advanced formulas, pivot tables, data cleaning, automation, and dashboarding.
• Experience with SQL and at least one data visualization platform (e.g., Power BI, Tableau, Looker, or Google Data Studio) is a plus.
• Deep analytical thinking with the ability to translate complex data into clear insights.
• Experience working cross-functionally with non-technical stakeholders and product/tech teams.
• Strong attention to detail, data hygiene, and process discipline.

Posted 1 week ago

Apply

3.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Job Description

Required Skills:
• 3+ years of hands-on experience in data modeling, ETL processes, developing reporting systems, and data engineering (e.g., ETL, BigQuery, SQL, Python, or Alteryx)
• Advanced knowledge of SQL programming and database management
• 3+ years of solid experience with one or more Business Intelligence reporting tools such as Power BI, Qlik Sense, Looker, or Tableau
• Knowledge of data warehousing concepts and best practices
• Excellent problem-solving and analytical skills
• Detail-oriented, with strong communication and collaboration skills
• Ability to work independently and as part of a team

Preferred Skills:
• Experience with GCP cloud services including BigQuery, Cloud Composer, Dataflow, Cloud SQL, Looker and LookML, Data Studio, and Qlik Sense on GCP
• Strong SQL skills and experience with various BI/reporting tools to build self-serve reports, analytic dashboards, and ad-hoc packages leveraging our enterprise data warehouse
• 1+ year of experience with Python
• 1+ year of experience with Hive/Spark/Scala/JavaScript
• Strong experience consuming data models, developing SQL, addressing data quality issues, proposing BI solution architecture, and articulating best practices in end-user visualizations
• Development and delivery experience
• Solid understanding of BI tools, architectures, and visualization solutions
• Inquisitive, proactive, and interested in learning new tools and techniques
• Strong oral, written, and interpersonal communication skills
• Comfortable working in a dynamic environment where problems are not always well-defined

Responsibilities
• Develop and maintain data pipelines, reporting, and dashboards using SQL and Business Intelligence reporting tools such as Power BI, Qlik Sense, and Looker
• Develop and execute database queries by applying advanced knowledge of SQL and experience working with relational databases and Google BigQuery
• Collaborate with stakeholders to define requirements from problem statements and develop data-driven insights
• Perform data validation and code review to assure data accuracy and data quality/integrity across all systems
• Perform root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement

Qualifications
• Bachelor's degree in Computer Science, Computer Information Systems, or a related field
• 3+ years of hands-on experience in data modeling, ETL processes, developing reporting systems, and data engineering (e.g., ETL, BigQuery, SQL, Python, or Alteryx)
• Advanced knowledge of SQL programming and database management
• 3+ years of solid experience with one or more Business Intelligence reporting tools such as Power BI, Qlik Sense, Looker, or Tableau
• Knowledge of data warehousing concepts and best practices
• Excellent problem-solving and analytical skills
• Detail-oriented, with strong communication and collaboration skills
• Ability to work independently and as part of a team

Posted 1 week ago

Apply

3.0 years

0 Lacs

India

Remote

A digital marketing agency in India seeks a PPC specialist with 3–5 years of experience. You will manage paid ad campaigns and related activities, such as setup, keyword strategies, ad copy, social ads, and other digital marketing activities. The ideal candidate is flexible, a self-starter, efficient, creative, and able to consider small details while keeping the big picture in view.

Responsibilities
• Manage 35+ PPC accounts and create strategy for them.
• Strategically manage and optimize paid search campaigns for customers, create data-driven reports, and present insights to the customer success team.
• Handle daily management and implementation of paid search campaigns across major search engines and social networks.
• Communicate strategy and results to clients in reports and during meetings.
• Suggest opportunities to grow accounts to the account management team.
• Provide strategic communication and timely, detailed account audits and reviews on a daily, weekly, and bi-weekly basis as scoped.

Key Skills And Experience
• 3+ years in a paid advertising implementation role, working with clients.
• Previous success in digital campaign management, working with clients, client success managers, and project managers to understand customers' business goals and create effective paid ads strategies.
• Proficient at keyword research, writing compelling and conversion-friendly ad copy, and building strategic campaigns.
• Excellent communication and problem-solving skills.
• Able to stay on top of industry trends and advancements and understand how these can be applied to paid campaigns.
• Strong ability to analyze data and high attention to detail.
• A team player who enjoys working in a fast-paced environment.

Qualifications And Skills
• 3+ years of hands-on PPC experience.
• In-depth knowledge of analyzing key metrics in digital marketing, conversion, and online customer acquisition across multiple industries and platforms.
• Google Ads (AdWords) certified.
• Advanced understanding of and experience with website analytics tools (e.g., Google Analytics/GA4, Tag Manager, Search Console, and Looker Studio).
• Firm understanding of and interest in all aspects of online campaigns from start to finish.
• Working knowledge of popular content management systems (Joomla, WordPress).
• Knowledge of HTML, CSS, and JavaScript is a bonus.
• Up to date with the latest trends and best practices in SEM and AI-driven tools.
• Agency experience and Google Analytics and Google Ads certifications are required.
• Previous experience with Moz, Ahrefs, SpyFu, or SEMrush is an asset.

Application Information
We offer a friendly work environment and a competitive salary. Please include a resume and cover letter and state your expected salary. We appreciate all applicants for their interest; only those selected for an interview will be contacted.

Job Type: Full-time
Location: This is a work-from-home position
Minimum Required Experience: Paid ads management in an agency environment - 3+ years
Minimum Education: Preferably a Bachelor's in Business, BBA, Economics, and/or Computer Science

Posted 1 week ago

Apply

12.0 years

27 - 35 Lacs

Madurai, Tamil Nadu, India

On-site

Job Title: GCP Data Architect
Location: Madurai
Experience: 12+ Years
Notice Period: Immediate

About TechMango
TechMango is a rapidly growing IT Services and SaaS Product company that helps global businesses with digital transformation, modern data platforms, product engineering, and cloud-first initiatives. We are seeking a GCP Data Architect to lead data modernization efforts for our prestigious client, Livingston, in a highly strategic project.

Role Summary
As a GCP Data Architect, you will be responsible for designing and implementing scalable, high-performance data solutions on Google Cloud Platform. You will work closely with stakeholders to define data architecture, implement data pipelines, modernize legacy data systems, and guide data strategy aligned with enterprise goals.

Key Responsibilities
• Lead end-to-end design and implementation of scalable data architecture on Google Cloud Platform (GCP)
• Define data strategy, standards, and best practices for cloud data engineering and analytics
• Develop data ingestion pipelines using Dataflow, Pub/Sub, Apache Beam, Cloud Composer (Airflow), and BigQuery
• Migrate on-prem or legacy systems to GCP (e.g., from Hadoop, Teradata, or Oracle to BigQuery)
• Architect data lakes, warehouses, and real-time data platforms
• Ensure data governance, security, lineage, and compliance (using tools like Data Catalog, IAM, DLP)
• Guide a team of data engineers and collaborate with business stakeholders, data scientists, and product managers
• Create documentation, high-level design (HLD) and low-level design (LLD), and oversee development standards
• Provide technical leadership in architectural decisions and future-proofing the data ecosystem

Required Skills & Qualifications
• 10+ years of experience in data architecture, data engineering, or enterprise data platforms
• Minimum 3–5 years of hands-on experience with GCP data services
• Proficient in: BigQuery, Cloud Storage, Dataflow, Pub/Sub, Composer, Cloud SQL/Spanner; Python/Java/SQL; data modeling (OLTP, OLAP, star/snowflake schema)
• Experience with real-time data processing, streaming architectures, and batch ETL pipelines
• Good understanding of IAM, networking, security models, and cost optimization on GCP
• Prior experience leading cloud data transformation projects
• Excellent communication and stakeholder management skills

Preferred Qualifications
• GCP Professional Data Engineer / Architect certification
• Experience with Terraform, CI/CD, GitOps, and Looker / Data Studio / Tableau for analytics
• Exposure to AI/ML use cases and MLOps on GCP
• Experience working in agile environments and client-facing roles

What We Offer
• Opportunity to work on large-scale data modernization projects with global clients
• A fast-growing company with a strong tech and people culture
• Competitive salary, benefits, and flexibility
• Collaborative environment that values innovation and leadership

Skills: Google Cloud Platform (GCP), GCP Data, Data Architecture

Posted 1 week ago

Apply

3.0 - 8.0 years

10 - 20 Lacs

Hyderabad, Ahmedabad, Bengaluru

Work from Office

SUMMARY
Sr. Data Analytics Engineer (Databricks): Power mission-critical decisions with governed insights

Company: Ajmera Infotech Private Limited (AIPL)
Location: Ahmedabad, Bangalore/Bengaluru, Hyderabad (On-site)
Experience: 5–9 years
Position Type: Full-time, Permanent

Ajmera Infotech builds planet-scale software for NYSE-listed clients, driving decisions that can't afford to fail. Our 120-engineer team specializes in highly regulated domains (HIPAA, FDA, SOC 2) and delivers production-grade systems that turn data into strategic advantage.

Why You'll Love It
• End-to-end impact: Build full-stack analytics from lakehouse pipelines to real-time dashboards.
• Fail-safe engineering: TDD, CI/CD, DAX optimization, Unity Catalog, cluster tuning.
• Modern stack: Databricks, PySpark, Delta Lake, Power BI, Airflow.
• Mentorship culture: Lead code reviews, share best practices, grow as a domain expert.
• Mission-critical context: Help enterprises migrate legacy analytics into cloud-native, governed platforms.
• Compliance-first mindset: Work in HIPAA-aligned environments where precision matters.

Key Responsibilities
• Build scalable pipelines using SQL, PySpark, and Delta Live Tables on Databricks.
• Orchestrate workflows with Databricks Workflows or Airflow; implement SLA-backed retries and alerting.
• Design dimensional models (star/snowflake) with Unity Catalog and Great Expectations validation.
• Deliver robust Power BI solutions: dashboards, semantic layers, paginated reports, DAX.
• Migrate legacy SSRS reports to Power BI with zero loss of logic or governance.
• Optimize compute and cost through cache tuning, partitioning, and capacity monitoring.
• Document everything from pipeline logic to RLS rules in Git-controlled formats.
• Collaborate cross-functionally to convert product analytics needs into resilient BI assets.
• Champion mentorship by reviewing notebooks and dashboards and sharing platform standards.

Must-Have Skills
• 5+ years in analytics engineering, with 3+ in production Databricks/Spark contexts.
• Advanced SQL (incl. windowing); expert PySpark, Delta Lake, Unity Catalog.
• Power BI mastery: DAX optimization, security rules, paginated reports.
• SSRS-to-Power BI migration experience (RDL logic replication).
• Strong Git and CI/CD familiarity, and cloud platform know-how (Azure/AWS).
• Communication skills to bridge technical and business audiences.

Nice-to-Have Skills
• Databricks Data Engineer Associate certification.
• Streaming pipeline experience (Kafka, Structured Streaming).
• dbt, Great Expectations, or similar data quality frameworks.
• BI diversity: experience with Tableau, Looker, or similar platforms.
• Cost governance familiarity (Power BI Premium capacity, Databricks chargeback).

What We Offer
• Competitive salary package with performance-based bonuses.
• Comprehensive health insurance for you and your family.

Posted 1 week ago

Apply

5.0 - 10.0 years

5 - 14 Lacs

Chennai

Hybrid

Position Description: The Analytics Service department provides system planning, engineering and operations support for enterprise Descriptive and Predictive Analytics products, as well as Big Data solutions and Analytics Data Management products. These tools are used by the Global Data Insights and Analytics (GDIA) team, data scientists, and IT service delivery partners globally to build line-of-business applications which are directly used by the end-user community. Products and platforms include Power BI, Alteryx, Informatica, Google Big Query, and more - all of which are critical to Ford's rapidly evolving needs in the area of Analytics and Big Data. In addition, business intelligence reporting products such as Business Objects, Qlik Sense and WebFOCUS are used by our core line of businesses for both employees and dealers. This position is part of the Descriptive Analytics team. It is a Full Stack Engineering and Operations position, engineering and operating our strategic Power BI dashboarding and visualization platform and other products as required, such as Qlik Sense, Alteryx, Business Objects, WebFOCUS, Looker, and other new platforms as they are introduced. The person in this role will collaborate with team members to produce well-tested and documented run books, test cases, and change requests, and handle change implementations as needed. The candidate will start with primarily Operational tasks until the products are well understood and will then progress to assisting with Engineering tasks. 
Skills Required: Power BI

Position Qualifications:
• Bachelor's Degree in a relevant field
• At least 5 years of experience with Descriptive Analytics technologies such as Power BI, Qlik Sense, Looker, Looker Studio, and WebFOCUS, or similar platforms
• System Administrator experience managing large multi-tenant Windows Server environments based on GCP Compute Engine or OpenShift Virtualization VMs
• Some DevOps experience with GitHub, Tekton pipelines, Terraform code, Google Cloud services, and PowerShell, and managing large GCP installations
• Strong troubleshooting and problem-solving skills
• Understanding of the product life cycle
• Ability to coordinate issue resolution with vendors on behalf of Ford
• Strong written and verbal communication skills
• Understanding of technologies like GCP, Azure, BigQuery, Teradata, SQL Server, Oracle, DB2, etc.
• Basic understanding of database connectivity and authentication methods (ODBC, JDBC, drivers, REST, WIF, Cloud SA and vault keys, etc.)

Experience Preferred:
• Experience with PowerApps and Power Automate
• Familiarity with Jira
• Familiarity with Ford's EAA, RTP, and EAMS processes and Ford security policies (GRC)

Education Required: Bachelor's Degree
Education Preferred: Bachelor's Degree

Additional Information - Position Duties:
• Evaluate and engineer existing and upcoming analytics technologies for enterprise consumption in both Google Cloud and on-prem environments
• Develop onboarding, operational, and disaster recovery procedures
• Develop new tools and processes to ensure effective implementation and use of the technologies
• Maintain custom installation guides that are consistent with Ford IT security policy
• Document day-to-day processes, installation, and desk procedures (run books, operational manuals, SharePoint, knowledge base, etc.)
• Engage with customers and power users globally using MS Teams and Viva Engage to assist with (non-software-development) infrastructure and connectivity issues
• Monitor and analyze usage data to ensure optimal performance of the infrastructure; implement permanent corrective actions as needed
• Provide training, consultation, and L2/L3 operational support
• Perform regular dashboard and visualization platform administration tasks
• Provide SME support to users for issue resolution and follow ITIL processes for request, incident, change, event, and problem management
• Provide product consultation for dedicated deployments

Posted 1 week ago

Apply

5.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Overview:
TekWissen is a global workforce management provider throughout India and many other countries in the world. The below client is a global company with shared ideals and a deep sense of family. From our earliest days as a pioneer of modern transportation, we have sought to make the world a better place, one that benefits lives, communities, and the planet.

Job Title: Systems Engineering Practitioner
Location: Chennai
Work Type: Hybrid

Position Description:
The Analytics Service department provides system planning, engineering, and operations support for enterprise Descriptive and Predictive Analytics products, as well as Big Data solutions and Analytics Data Management products. These tools are used by the Global Data Insights and Analytics (GDIA) team, data scientists, and IT service delivery partners globally to build line-of-business applications which are directly used by the end-user community. Products and platforms include Power BI, Alteryx, Informatica, Google BigQuery, and more, all of which are critical to the client's rapidly evolving needs in the area of Analytics and Big Data. In addition, business intelligence reporting products such as Business Objects, Qlik Sense, and WebFOCUS are used by our core lines of business for both employees and dealers.

This position is part of the Descriptive Analytics team. It is a Full Stack Engineering and Operations position, engineering and operating our strategic Power BI dashboarding and visualization platform and other products as required, such as Qlik Sense, Alteryx, Business Objects, WebFOCUS, Looker, and other new platforms as they are introduced. The person in this role will collaborate with team members to produce well-tested and documented run books, test cases, and change requests, and handle change implementations as needed. The candidate will start with primarily operational tasks until the products are well understood and will then progress to assisting with engineering tasks.

Skills Required: Power BI

Position Qualifications:
• Bachelor's Degree in a relevant field
• At least 5 years of experience with Descriptive Analytics technologies such as Power BI, Qlik Sense, Looker, Looker Studio, and WebFOCUS, or similar platforms
• System Administrator experience managing large multi-tenant Windows Server environments based on GCP Compute Engine or OpenShift Virtualization VMs
• Some DevOps experience with GitHub, Tekton pipelines, Terraform code, Google Cloud services, and PowerShell, and managing large GCP installations
• Strong troubleshooting and problem-solving skills
• Understanding of the product life cycle
• Ability to coordinate issue resolution with vendors on behalf of the client
• Strong written and verbal communication skills
• Understanding of technologies like GCP, Azure, BigQuery, Teradata, SQL Server, Oracle, DB2, etc.
• Basic understanding of database connectivity and authentication methods (ODBC, JDBC, drivers, REST, WIF, Cloud SA and vault keys, etc.)

Experience Preferred:
• Experience with PowerApps and Power Automate
• Familiarity with Jira
• Familiarity with the client's EAA, RTP, and EAMS processes and the client's security policies (GRC)

Education Required: Bachelor's Degree
Education Preferred: Bachelor's Degree

TekWissen® Group is an equal opportunity employer supporting workforce diversity.

Posted 1 week ago

Apply

1.0 years

0 Lacs

Vadodara, Gujarat, India

On-site

Location: Vadodara
Type: Full-time / Internship
Duration (for interns): Minimum 3 months
Stipend/CTC: Based on experience and role

About Gururo
Gururo is a global leader in practical, career-transforming education. With a mission to equip professionals and students with real-world skills, we specialize in project management, leadership, business analytics, and emerging technologies. Join our dynamic, impact-driven team to work on data-centric products that empower decision-making and transform education at scale.

Who Can Apply?
• Interns: Final-year students or recent graduates in Computer Science, Statistics, Mathematics, Data Science, or related fields with a passion for data storytelling.
• Freshers: 0–1 years of experience with academic projects or internship exposure to data analysis.
• Experienced Professionals: 1+ years of hands-on experience in analytics or data roles, with a portfolio or GitHub/Power BI/Tableau profile showcasing analytical thinking.

Key Responsibilities
• Collect, clean, and analyze data from multiple sources to uncover trends and actionable insights
• Build dashboards and reports using tools like Excel, Power BI, or Tableau
• Translate business problems into analytical solutions through exploratory data analysis (EDA)
• Support A/B testing, cohort analysis, and customer segmentation for business decision-making
• Use SQL to query and manipulate structured data from relational databases
• Assist in building data pipelines and automation for recurring analytics tasks
• Communicate findings effectively through visualizations and presentations
• Collaborate closely with product, marketing, and engineering teams to support data-driven strategies

Must-Have Skills
• Strong proficiency in Excel and SQL
• Basic to intermediate knowledge of Python for data analysis (Pandas, NumPy)
• Familiarity with data visualization tools like Power BI, Tableau, or Matplotlib/Seaborn
• Good understanding of descriptive statistics, data cleaning, and EDA
• Ability to communicate insights clearly to technical and non-technical stakeholders
• Experience with Google Analytics, TruConversion, or similar analytics platforms (optional for interns)

Good to Have (Optional)
• Experience with data storytelling, dashboarding, and automation scripts
• Exposure to R, Looker, or Metabase
• Familiarity with web/app analytics, conversion funnels, and retention metrics
• Understanding of data warehousing concepts and basic ETL pipelines
• GitHub portfolio, published reports, or participation in data hackathons

What You'll Gain
• Real-world experience in data analytics and business intelligence
• Mentorship from senior analysts and cross-functional collaboration exposure
• Certificate of Internship/Experience & Letter of Recommendation (for interns)
• Opportunity to work on data for educational product growth and user behavior insights
• Flexible working hours and performance-linked growth
• Hands-on end-to-end data projects, from raw data to executive insights

Posted 1 week ago

Apply

3.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Overview:
TekWissen is a global workforce management provider throughout India and many other countries in the world. The below client is a global company with shared ideals and a deep sense of family. From our earliest days as a pioneer of modern transportation, we have sought to make the world a better place, one that benefits lives, communities, and the planet.

Job Title: Specialty Development Consultant
Location: Chennai
Work Type: Hybrid

Position Description:
• Train, build, and deploy ML and DL models
• Software development using Python; work with Tech Anchors, Product Managers, and the team internally and across other teams
• Ability to understand the technical, functional, non-functional, and security aspects of business requirements and deliver them end-to-end
• Software development using a TDD approach
• Experience using GCP products & services
• Ability to adapt quickly to open-source products & tools to integrate with ML platforms

Skills Required: Python, Cloud, GCP

Experience Required:
• 3+ years of experience in Python software development
• 3+ years of experience in cloud technologies & services, preferably GCP
• 3+ years of experience practicing statistical methods and their accurate application, e.g., ANOVA, principal component analysis, correspondence analysis, k-means clustering, factor analysis, multivariate analysis, neural networks, causal inference, Gaussian regression, etc.
• 3+ years of experience with Python, SQL, and BigQuery
• Experience with SonarQube, CI/CD, Tekton, Terraform, GCS, GCP Looker, Vertex AI, Airflow, TensorFlow, etc.
• Experience training, building, and deploying ML and DL models
• Building and deploying models (scikit-learn, DataRobot, TensorFlow, PyTorch, etc.)
• Developing and deploying in on-prem & cloud environments: Kubernetes, Tekton, OpenShift, Terraform, Vertex AI

Experience Preferred: 2 to 5 years
Education Required: Bachelor's Degree

TekWissen® Group is an equal opportunity employer supporting workforce diversity.

Posted 1 week ago

Apply

2.0 years

0 Lacs

Mumbai Metropolitan Region

On-site

About Fynd:
Fynd is India’s largest omnichannel platform and a multi-platform tech company specializing in retail technology and products in AI, ML, big data, image editing, and the learning space. It provides a unified platform for businesses to seamlessly manage online and offline sales, store operations, inventory, and customer engagement. Serving over 2,300 brands, Fynd is at the forefront of retail technology, transforming customer experiences and business processes across various industries.

About The Role:
We are hiring a Product Growth Manager to lead initiatives at the intersection of retail, AI, and product execution. This role requires someone who can conceptualize and build AI-native features, scale them effectively, and drive adoption through experimentation, data insights, and structured execution. You will work closely with engineering, machine learning, product, and business teams to deliver AI-powered capabilities that solve real-world retail challenges. The ideal candidate has strong technical foundations, sharp product instincts, and a proven ability to operate with speed and ownership in high-growth environments.

What will you do at Fynd?
• Build and launch AI-native product features in collaboration with ML and engineering teams.
• Drive product-led growth initiatives focused on activation, retention, and adoption.
• Translate AI/ML capabilities into scalable and intuitive product experiences.
• Work hands-on with model deployments, inference systems, and API integrations.
• Own end-to-end product execution, from problem discovery to roadmap delivery and post-launch iteration.
• Contribute to platform strategies that scale AI features from MVPs to production-grade adoption.
• Understand and influence product flows specific to retail, catalog systems, and commerce automation.

Some Specific Requirements

AI and Technical Foundations:
• Strong grasp of LLMs, embeddings, vector databases, RAG, and fine-tuning methods like LoRA and QLoRA.
• Hands-on experience with OpenAI, Hugging Face, LangChain, LlamaIndex, and production-grade AI tooling.
• Familiarity with AI workflows using FastAPI, Docker, MLflow, and model serving via Ray or TorchServe.
• Comfortable working with GitHub, Git, and tools for managing models, code, and experiments.
• Good understanding of microservices, API design (REST/GraphQL), and scalable backend systems.
• Experienced in CI/CD setup for training and deploying ML models to production.

Data and Analytics:
• Proficient in SQL, BigQuery, and Looker Studio for data exploration and dashboards.
• Able to design KPIs, success metrics, and user-journey insights for product analytics.
• Knowledge of tools like dbt and Airflow, and of event-based platforms like PostHog or Mixpanel.
• Experience with A/B testing, funnel analysis, and behavioral cohort tracking.

Retail and Product Execution:
• Solid understanding of retail workflows, product catalogs, and commerce ecosystems.
• Experience building and scaling digital products with real-world user adoption.
• Strong product judgment and the ability to balance business, user, and technical priorities.

Stakeholder Management and Execution Leadership:
• Ability to lead and influence cross-functional teams to drive high-quality execution.
• Strong communication skills to present AI-driven concepts to both technical and non-technical audiences.
• Experience with Agile methodologies and tools such as Jira, Confluence, and Asana for planning and tracking.

Preferred Experience:
• 2 to 5 years in product, growth, or AI-focused roles.
• Experience in building and scaling AI-powered or technology-driven platforms.
• Exposure to retail, e-commerce, or SaaS environments.
• Track record of delivering outcomes in fast-paced, cross-functional teams.

Why Join Us:
• Be part of a team shaping the future of AI-native platforms in digital commerce.
• Work closely with leading AI engineers, product teams, and business stakeholders.
• Own and execute high-impact initiatives with autonomy and accountability.
• Operate in a culture that values speed, clarity, and innovation.

If you're someone who thrives on execution, loves solving complex problems, and wants to build the future of AI-native platforms, we'd love to have you on board.

Growth:
Growth knows no bounds, as we foster an environment that encourages creativity, embraces challenges, and cultivates a culture of continuous expansion. We are looking at new product lines, international markets, and brilliant people to grow even further. We teach, groom, and nurture our people to become leaders. You get to grow with a company that is growing exponentially.

Flex University:
• We help you upskill by organising in-house courses on important subjects.
• Learning Wallet: you can also take an external course to upskill and grow; we reimburse it for you.

Culture:
• Community and team-building activities.
• We host weekly, quarterly, and annual events and parties.

Wellness:
• Mediclaim policy for you + parents + spouse + kids.
• Experienced therapists for better mental health, improved productivity, and work-life balance.

We work from the office 5 days a week to promote collaboration and teamwork. Join us to make an impact in an engaging, in-person environment.
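The requirements above lean heavily on embeddings and RAG. As a rough, illustrative sketch of the retrieval step at the heart of that pattern (toy vectors and hypothetical document names, not this employer's actual stack): rank stored documents by cosine similarity to a query embedding and return the closest match.

```python
import math

def cosine(a, b):
    # cosine similarity between two equal-length vectors
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

def retrieve(query_vec, docs, k=1):
    # docs: list of (doc_id, embedding) pairs; return top-k ids by similarity
    ranked = sorted(docs, key=lambda d: cosine(query_vec, d[1]), reverse=True)
    return [doc_id for doc_id, _ in ranked[:k]]

# Toy 3-dimensional "embeddings"; real systems use hundreds of dimensions
# produced by an embedding model and stored in a vector database.
docs = [
    ("return-policy", [0.9, 0.1, 0.0]),
    ("size-guide",    [0.1, 0.8, 0.1]),
    ("shipping-faq",  [0.2, 0.1, 0.9]),
]
print(retrieve([0.85, 0.15, 0.05], docs))  # → ['return-policy']
```

In a full RAG pipeline the retrieved documents would then be injected into the LLM prompt as grounding context; this sketch covers only the ranking step.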

Posted 1 week ago

Apply

5.0 - 8.0 years

15 - 25 Lacs

Pune, Bengaluru, Mumbai (All Areas)

Hybrid

Overview of 66degrees:
66degrees is a leading consulting and professional services company specializing in developing AI-focused, data-led solutions that leverage the latest advancements in cloud technology. With our unmatched engineering capabilities and vast industry experience, we help the world's leading brands transform their business challenges into opportunities and shape the future of work.

At 66degrees, we believe in embracing the challenge and winning together. These values guide us not only in achieving our goals as a company but also for our people. We are dedicated to creating a significant impact for our employees by fostering a culture that sparks innovation and supports professional and personal growth along the way.

Overview of Role:
We are looking for a highly motivated and experienced Data Analytics Engineer to join our team. As a Data Analytics Engineer, you will be responsible for building and implementing robust data analytics solutions using Microsoft Power BI. You will collaborate with cross-functional teams to gather requirements, design scalable architectures, and deliver high-quality solutions that meet business needs. A key aspect of this role involves ensuring data literacy within the organization, empowering users to understand and validate data effectively.

Responsibilities:
• Work with clients to enable them on the Power BI platform, teaching them how to construct an analytics ecosystem from the ground up.
• Advise clients on developing their analytics centers of excellence, defining and designing processes to promote a scalable, governed analytics ecosystem.
• Use Microsoft Power BI to design and develop interactive and visually appealing dashboards and reports for end users.
• Write clean, efficient, and scalable DAX and M (Power Query) code.
• Conduct performance tuning and optimization of data analytics solutions to ensure efficient processing and query performance.
• Stay up to date with the latest trends and best practices in cloud data analytics, big data technologies, and data visualization tools.
• Collaborate with other teams to ensure seamless integration of data analytics solutions with existing systems and processes.
• Provide technical guidance and mentorship to junior team members, sharing knowledge and promoting best practices.
• Promote data literacy and data validation best practices across teams.

Qualifications:
• 5+ years of experience designing interactive dashboards and reports in Power BI.
• Previous experience with the Looker platform is a plus.
• Experience working with Google BigQuery and the Google Cloud Platform is a plus.
• Comprehension of Power BI's security capabilities and experience supporting multiple personas on the platform.
• Strong problem-solving skills and the ability to translate business requirements into technical solutions.
• Excellent communication and collaboration skills, with the ability to work effectively in a cross-functional team environment.
• Demonstrated experience in promoting data literacy and enabling users to understand and validate data.
• An understanding of conversational analytics and experience working with Power BI Copilot is a plus.
• Microsoft Power BI certifications are a plus.
• Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.

66degrees is an Equal Opportunity employer. All qualified applicants will receive consideration for employment without regard to actual or perceived race, color, religion, sex, gender, gender identity, national origin, age, weight, height, marital status, sexual orientation, veteran status, disability status or other legally protected class.

Posted 2 weeks ago

Apply

10.0 years

1 - 5 Lacs

Hyderābād

On-site

Category: Application Development and Support
Location: Hyderabad, Telangana
Job family: Architecture
Shift: Evening
Employee type: Regular Full-Time

Requirements:
• Bachelor's degree in Computer Science, Information Systems, or a related field.
• 10+ years of experience in data architecture, with a minimum of 1-3 years of experience in the healthcare domain.
• Strong hands-on experience with cloud databases such as Snowflake, Aurora, and Google BigQuery.
• Experience in designing OLAP and OLTP systems for efficient data analysis and processing.
• Strong hands-on experience with enterprise BI/reporting tools (Looker, AWS QuickSight, Power BI, Tableau, and Cognos).
• A strong understanding of HIPAA regulations and healthcare data privacy laws is a must-have for this role, as the healthcare domain requires strict adherence to data privacy and security regulations.
• Experience with data privacy and tokenization tools such as Immuta, Privacera, Privitar, OpenText, and Protegrity.
• Experience with multiple full life-cycle data warehouse/transformation implementations in the public cloud (AWS, Azure, and GCP), with deep technical knowledge in one.
• Proven experience working as an Enterprise Data Architect or in a similar role, preferably in large-scale organizations.
• Proficient in data modelling, covering both the Star Schema (de-normalized data model) and the Transactional Model (normalized data model), using tools like Erwin.
• Experience with ETL/ELT architecture and integration (Matillion, AWS Glue, Google PLEX, Azure Data Factory, etc.).
• Deep understanding of data architectures that utilize Data Fabric, Data Mesh, and Data Products.
• Business and financial acumen to advise on product planning, conduct research and analysis, and identify the business value of new and emerging technologies.
• Strong SQL and database skills working with large structured and unstructured data.
• Experienced in the implementation of data virtualization and semantic-model-driven architecture.
• System development lifecycle (SDLC), Agile development, DevSecOps, and standard software development tools such as Git and Jira.
• Excellent written and oral communication skills to convey key choices, recommendations, and technology concepts to technical and non-technical audiences.
• Familiarity with AI/MLOps concepts and generative AI technology.

Posted 2 weeks ago

Apply

2.0 years

5 - 18 Lacs

India

On-site

Job Title: Data Engineer with Strong Communication Skills

We are looking for a Data Engineer who not only excels at building scalable and efficient data pipelines but also possesses exceptional communication and presentation skills. This role bridges technical execution with cross-functional collaboration, requiring the ability to explain complex data infrastructure in simple terms to technical and non-technical stakeholders alike.

Key Responsibilities:
• Data Pipeline Development: design, build, and maintain robust ETL/ELT pipelines using modern data engineering tools (e.g., Spark, Airflow, dbt).
• Data Modeling: develop and optimize data models (star/snowflake schemas) that support analytics and reporting use cases.
• Collaboration & Communication: work closely with data analysts, data scientists, and business teams to understand data needs; deliver clear, engaging presentations of data architecture, pipeline performance, and data quality issues; translate business requirements into scalable engineering solutions and explain trade-offs in plain language.
• Data Quality & Governance: implement and monitor data quality checks and ensure compliance with data governance policies.
• Documentation: produce clear technical documentation and present data lineage and infrastructure to non-technical teams.

Required Qualifications:
• Bachelor's or Master's degree in Computer Science, Data Engineering, Information Systems, or a related field.
• 2+ years of experience in data engineering or a related role.
• Proficiency in SQL, Python, and at least one big data platform (e.g., Hadoop, Spark, Snowflake, Redshift).
• Experience with orchestration tools (e.g., Apache Airflow) and cloud data services (AWS, GCP, or Azure).
• Exceptional verbal and written communication skills; comfortable speaking in team meetings, executive briefings, and client-facing settings.

Preferred Qualifications:
• Experience working in a cross-functional, agile team environment.
• Familiarity with data privacy regulations (GDPR, CCPA).
• Previous experience presenting at internal town halls, data literacy sessions, or industry meetups.
• Experience with visualization tools (e.g., Looker, Tableau) is a plus.

What Sets You Apart:
• You're as confident writing efficient SQL as you are explaining pipeline latency to a product manager.
• You thrive at the intersection of data infrastructure and human communication.
• You believe that data engineering should be as accessible and transparent as it is powerful.

Benefits:
• Competitive salary and stock options
• Flexible work environment
• Health, dental, and vision insurance
• Learning & development budget
• Opportunities to present at internal or external conferences

Job Type: Full-time
Pay: ₹505,184.34 - ₹1,843,823.89 per year
Work Location: In person
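The Data Quality & Governance responsibility above often starts with simple programmatic checks before data reaches downstream reports. A minimal, hypothetical sketch (illustrative column names and thresholds, not this employer's schema): flag rows with missing required fields or out-of-range values.

```python
def run_quality_checks(rows, required, ranges):
    """Return a list of (row_index, issue) for rows failing basic checks.

    rows:     list of dicts (one per record)
    required: column names that must be present and non-empty
    ranges:   {column: (lo, hi)} inclusive bounds for numeric columns
    """
    issues = []
    for i, row in enumerate(rows):
        for col in required:
            if row.get(col) in (None, ""):
                issues.append((i, f"missing {col}"))
        for col, (lo, hi) in ranges.items():
            val = row.get(col)
            if val is not None and not (lo <= val <= hi):
                issues.append((i, f"{col} out of range"))
    return issues

rows = [
    {"order_id": "A1", "amount": 120.0},
    {"order_id": "",   "amount": -5.0},   # two problems on purpose
]
print(run_quality_checks(rows, required=["order_id"], ranges={"amount": (0, 10_000)}))
# → [(1, 'missing order_id'), (1, 'amount out of range')]
```

In practice checks like these would run as a pipeline step (e.g., an Airflow task) and fail the run or quarantine bad rows rather than just print them.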

Posted 2 weeks ago

Apply

7.0 years

4 - 7 Lacs

Hyderābād

On-site

Job Description: Senior Data Analyst
Location: Hyderabad, IN (work from office)
Experience: 7+ years

Role Summary:
We are seeking an experienced and highly skilled Senior Data Analyst to join our team. The ideal candidate will possess deep proficiency in SQL, a strong understanding of data architecture, and working knowledge of the Google Cloud Platform (GCP)-based ecosystem. They will be responsible for turning complex business questions into actionable insights, driving strategic decisions, and helping shape the future of our Product/Operations team. This role requires a blend of technical expertise, analytical rigor, and excellent communication skills to partner effectively with engineering, product, and business leaders.

Key Responsibilities:
• Advanced Data Analysis: utilize advanced SQL skills to query, analyze, and manipulate large, complex datasets; develop and maintain robust, scalable dashboards and reports to monitor key performance indicators (KPIs).
• Source Code Management: effectively manage, version, and collaborate on code using codebase management systems like GitHub; uphold data integrity, produce reproducible analyses, and foster a collaborative database management environment through best practices in version control and code documentation.
• Strategic Insights: partner with product managers and business stakeholders to define and answer critical business questions; conduct deep-dive analyses to identify trends, opportunities, and root causes of performance changes.
• Data Architecture & Management: work closely with data engineers to design, maintain, and optimize data schemas and pipelines; provide guidance on data modeling best practices and ensure data integrity and quality.
• Reporting & Communication: translate complex data findings into clear, concise, and compelling narratives for both technical and non-technical audiences; present insights and recommendations to senior leadership to influence strategic decision-making.
• Project Leadership: lead analytical projects end to end, including defining project scope, methodology, and deliverables; mentor junior analysts, fostering a culture of curiosity and data-driven problem-solving.

Required Skills & Experience:
• Bachelor's degree in a quantitative field such as Computer Science, Statistics, Mathematics, Economics, or a related discipline.
• 5+ years of professional experience in a data analysis or business intelligence role.
• Expert-level proficiency in SQL, with a proven ability to write complex queries, use window functions, and optimize queries for performance on massive datasets.
• Strong understanding of data architecture, including data warehousing, data modeling (e.g., star/snowflake schemas), and ETL/ELT principles.
• Excellent communication and interpersonal skills, with a track record of successfully influencing stakeholders.
• Experience with a business intelligence tool such as Tableau, Looker, or Power BI to create dashboards and visualizations.
• Experience with internal Google/Alphabet data tools and infrastructure, such as BigQuery, Dremel, or Google-internal data portals.
• Experience with statistical analysis, A/B testing, and experimental design.
• Familiarity with machine learning concepts and their application in a business context.
• A strong sense of curiosity and a passion for finding and communicating insights from data.
• Proficiency with scripting languages for data analysis (e.g., Apps Script, Python, or R) is an added advantage.

Responsibilities:
• Lead a team of data scientists and analysts to deliver data-driven insights and solutions.
• Oversee the development and implementation of data models and algorithms to support new product development.
• Provide strategic direction for data science projects, ensuring alignment with business goals.
• Collaborate with cross-functional teams to integrate data science solutions into business processes.
• Analyze complex datasets to identify trends and patterns that inform business decisions.
• Utilize generative AI techniques to develop innovative solutions for product development.
• Ensure adherence to ITIL V4 practices in all data science projects.
• Develop and maintain documentation for data science processes and methodologies.
• Mentor and guide team members to enhance their technical and analytical skills.
• Monitor project progress and adjust strategies to meet deadlines and objectives.
• Communicate findings and recommendations to stakeholders in a clear and concise manner.
• Drive continuous improvement in data science practices and methodologies.
• Foster a culture of innovation and collaboration within the data science team.

Qualifications:
• Strong experience in business analysis and data analysis.
• Expertise in generative AI and its applications in product development.
• A solid understanding of ITIL V4 practices and their implementation.
• Excellent communication and collaboration skills.
• Proficiency in managing and leading a team of data professionals.
• A commitment to working from the office during day shifts.
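The expert-level SQL bar above calls out window functions specifically. A small self-contained illustration of the idea (toy data, run against Python's bundled sqlite3 rather than BigQuery): rank products by revenue within each region using RANK() over a partition.

```python
import sqlite3

# Window functions require SQLite >= 3.25, which ships with modern Python builds.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE sales (region TEXT, product TEXT, revenue REAL);
    INSERT INTO sales VALUES
        ('south', 'widget', 900), ('south', 'gadget', 400),
        ('north', 'widget', 300), ('north', 'gadget', 700);
""")

# RANK() restarts for each region because of PARTITION BY.
rows = conn.execute("""
    SELECT region, product,
           RANK() OVER (PARTITION BY region ORDER BY revenue DESC) AS rnk
    FROM sales
    ORDER BY region, rnk
""").fetchall()

for r in rows:
    print(r)
```

The same query shape works in BigQuery; only the connection boilerplate differs. Swapping RANK() for SUM(revenue) OVER (...) would give running totals instead of ranks.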

Posted 2 weeks ago

Apply

3.0 years

5 - 7 Lacs

Hyderābād

On-site

Minimum qualifications:
• Bachelor's degree in Engineering, Computer Science, a related field, or equivalent practical experience.
• Experience in system design or reading code (e.g., Java, C++, Python).
• Experience in technical project management, stakeholder management, professional services, solution engineering, or technical consulting.

Preferred qualifications:
• Master's degree in Engineering, Computer Science, Business, or a related field.
• 3 years of experience within the security space, including security engineering, security analytics, risk quantification/measurement, or technical risk management.
• Experience with data visualization solutions like Looker Studio, Tableau, and Power BI.
• Experience with AI/ML and data analysis software such as SQL, R, Python, and Go.
• Knowledge of transforming ideation and manual processes into technical solutions.

About the job:
As a Technical Solutions Consultant, you will be responsible for the technical relationship with our largest advertising clients and/or product partners. You will lead cross-functional teams in Engineering, Sales, and Product Management to leverage emerging technologies for our external clients/partners. From concept design and testing to data analysis and support, you will oversee the technical execution and business operations of Google's online advertising platforms and/or product partnerships. You will balance business and partner needs with technical constraints, develop innovative, cutting-edge solutions, and act as a partner and consultant to those you are working with. You will also build tools and automate products, oversee the technical execution and business operations of Google's partnerships, and develop product strategy while prioritizing projects and resources.

Google creates products and services that make the world a better place, and gTech's role is to help bring them to life. Our teams of trusted advisors support customers globally. Our solutions are rooted in our technical skill, product expertise, and a thorough understanding of our customers' complex needs. Whether the answer is a bespoke solution to solve a unique problem or a new tool that can scale across Google, everything we do aims to ensure our customers benefit from the full potential of Google products. To learn more about gTech, check out our video.

Responsibilities:
• Help build and maintain relationships with stakeholders in customer or partner organizations to deliver or manage quality technical solutions and services.
• Contribute to Product Requirement Documents (PRDs) to record product specifications, and validate PRDs to ensure customer/partner and internal needs are met, with some guidance.
• Help scale existing solutions or create repeatable ones (e.g., best-practice recommendations, tutorials, blog articles, sample code) and ensure documentation of solutions, with some guidance.
• Help write solution code in collaboration with internal or external developers, users, partners, clients, or stakeholders.
• Collaborate with internal and external stakeholders across their respective process lifecycles to provide technical guidance or identify possible existing or new technical solution offerings, with some guidance.

Google is proud to be an equal opportunity workplace and is an affirmative action employer. We are committed to equal employment opportunity regardless of race, color, ancestry, religion, sex, national origin, sexual orientation, age, citizenship, marital status, disability, gender identity or Veteran status. We also consider qualified applicants regardless of criminal histories, consistent with legal requirements. See also Google's EEO Policy and EEO is the Law. If you have a disability or special need that requires accommodation, please let us know by completing our Accommodations for Applicants form.

Posted 2 weeks ago

Apply

12.0 - 15.0 years

55 - 60 Lacs

Ahmedabad, Chennai, Bengaluru

Work from Office

Dear Candidate,

We are hiring a BI Developer to transform raw data into meaningful insights through dashboards and reports that guide business decisions.

Key Responsibilities:
• Develop and maintain BI dashboards and visualizations.
• Build data models and define KPIs for reporting.
• Extract, clean, and transform data from multiple sources.
• Optimize data queries for speed and accuracy.
• Work with stakeholders to define business metrics and reporting needs.

Required Skills & Qualifications:
• Expertise in BI tools (Power BI, Tableau, Qlik).
• Proficiency in SQL and data modeling techniques.
• Experience with ETL development and data warehousing.
• Understanding of business processes (finance, sales, operations).
• Strong analytical and communication skills.

Soft Skills:
• Strong troubleshooting and problem-solving skills.
• Ability to work independently and in a team.
• Excellent communication and documentation skills.

Note: If interested, please share your updated resume and a preferred time for a discussion. If shortlisted, our HR team will contact you.

Srinivasa Reddy Kandi
Delivery Manager, Integra Technologies

Posted 2 weeks ago

Apply

2.0 years

6 - 10 Lacs

Gurgaon

On-site

Gurugram | Full Time

About Klook:
We are Asia's leading platform for experiences and travel services, and we believe that we can help bring the world closer together through experiences. Founded in 2014 by three avid travelers, Ethan Lin, Eric Gnock Fah and Bernie Xiong, Klook inspires and enables more moments of joy for travelers with over half a million curated quality experiences ranging from the biggest attractions to paragliding adventures, iconic museums to rich cultural tours, and other convenient local travel services across 2,700 destinations around the world.

Do you share our belief in the wonders of travel? Our international community of over 1,800 employees, based in 30+ locations, certainly does! Global citizens ourselves, Klookers are not only curating memorable experiences for others but also co-creating our world of joy within Klook. We work hard and play hard, upkeeping our high-performing culture as we are guided daily by our 6 core values:
• Customer First
• Push Boundaries
• Critical Thinking
• Build for Scale
• Less is More
• Win as One

We never settle, and together, we believe in achieving greater heights and realizing endless possibilities ahead of us in the dynamic new era of travel. Care to be a part of this revolution? Join us!

What you'll do:
• Collaborate with internal stakeholders (pricing, product, sales, marketing) and global teams to identify key business questions and translate them into actionable analytics projects.
• Perform data-driven analysis, leveraging insights from large datasets and improving revenue coverage at the vertical and regional level, to support both local and global decision-making.
• Automate manual operational processes and report back on the time savings gained through modernization of business operations.
• Design and maintain dashboards, reports, and performance metrics to track key performance indicators (KPIs) and provide data-driven insights to leadership.
• Align closely with global teams to ensure consistency in reporting and analytics practices across regions.
• Conduct deep-dive analyses to explore new opportunities for business growth, while supporting regional teams with data that addresses local market needs.
• Ensure data accuracy, integrity, and security by adhering to industry-standard protocols and compliance regulations.
• Communicate technical and complex information clearly and engage stakeholders at all levels.
• Mentor and develop junior team members in data analysis and SQL best practices, fostering a culture of continuous improvement.

What you'll need:
• Bachelor's or Master's degree in Data Science, Statistics, Computer Science, or a related field.
• A minimum of 2-3 years' experience as a data analyst; a pricing background is a bonus, though not essential.
• Advanced proficiency in SQL for data extraction and analysis.
• Experience with BI tools such as Looker or Tableau, with the ability to design compelling dashboards.
• Familiarity with programming languages like Python.
• Comfortable with stakeholder management, preferably in an international environment.
• Proven ability to work independently and lead initiatives in a fast-paced environment.
• A proactive and open mindset to identify and develop reporting needs that are currently unfulfilled within the business.
Klook is proud to be an equal opportunity employer. We hire talented and passionate people of all backgrounds. We believe that a joyful workplace is an inclusive workplace, one where employees from all walks of life have an equal opportunity to thrive. We’re dedicated to creating a welcoming and supportive culture where everyone belongs.

Posted 2 weeks ago

Apply

0 years

2 - 3 Lacs

Mohali

On-site

Job Title: Digital Marketing Coordinator
Location: Mohali, India
Department: Analytics & Insights
Job Type: Full-Time

About Us:
TRU is a leading global organisation dedicated to leveraging cutting-edge technology to drive business innovation and growth. We're architects of online experiences, innovators in the digital landscape, and partners in our clients' success stories. Our journey began with a simple yet powerful vision: to transform businesses through strategic and creative digital solutions.

At TRU, we pride ourselves on a holistic approach to digital excellence. We don't just create websites or run marketing campaigns; we craft immersive digital journeys that resonate with audiences. From the inception of an idea to its execution, we bring together a team of passionate professionals who thrive on pushing boundaries and challenging the status quo. Our global team comprises creative and innovative industry experts from Canada and the APAC region, including India and Indonesia. We are tech-savvy enthusiasts who bring a wealth of intelligence and expertise to the table. Whether it's web development, design, digital marketing, or emerging technologies, we're here to navigate the complexities and deliver solutions that make a lasting impact.

Position Overview:
We are seeking a self-driven, ambitious, and tech-savvy Digital Marketing Coordinator to join our growing team. In this role, you will play a pivotal part in driving revenue growth, customer acquisition, and brand awareness through data-backed digital strategies. You will be responsible for campaign execution, performance optimization, and reporting across multiple digital platforms.

Key Responsibilities:
• Own the end-to-end process of digital campaign execution, from planning and setup to performance tracking and reporting.
• Proactively troubleshoot tracking and reporting issues using tools like Google Tag Manager and Google Analytics.
• Develop, refine, and maintain visual performance dashboards in Looker Studio for internal stakeholders.
• Implement streamlined processes to ensure efficient, accurate, and scalable campaign measurement.
• Monitor campaign performance across Google Ads, Meta (Facebook/Instagram), and other digital platforms to optimize ROI and achieve KPIs.
• Identify emerging trends and customer insights, and translate them into actionable recommendations.
• Analyze sales funnels, web analytics, and behaviour trends to inform growth strategies.
• Research and implement new tools and technologies to support digital campaign execution and reporting.
• Report regularly on core marketing KPIs, including ROAS, CPL, conversion rates, website traffic, and engagement metrics.

Requirements:
• Strong hands-on experience with Google Analytics, Google Tag Manager, Looker Studio, BigQuery, and Google Ads.
• Proficiency in managing and optimizing Facebook Ads and Instagram campaigns.
• Solid understanding of digital marketing best practices, including SEO, SEM, Social Media Marketing, Email Marketing, and Social Media Optimization.
• A digital-first mindset with the ability to turn data into strategic insights.
• Exceptional analytical and critical thinking skills with a results-driven attitude.
• Ability to manage multiple campaigns and deadlines with minimal supervision.
• Strong verbal and written communication skills in English.
• A strong portfolio and/or track record of managing high-performance digital campaigns.

What We Offer:
• Competitive salary and benefits package.
• Opportunities for professional growth and development.
• A collaborative and innovative work environment.

Posted 2 weeks ago

Apply

0 years

1 - 2 Lacs

Cuttack

On-site

We are seeking a talented and detail-oriented Data Analyst to join our team. The ideal candidate will have a strong analytical mindset, excellent problem-solving skills, and proficiency in data visualization tools such as Looker Studio. The candidate should also have experience with Google Apps Script to automate data processes and enhance data analysis capabilities. The Data Analyst will play a crucial role in interpreting data, generating insights, and providing actionable recommendations to drive business decisions.

Responsibilities:
1. Extract, clean, and analyze data from various sources to identify trends, patterns, and insights.
2. Develop and maintain reports, dashboards, and visualizations using Looker Studio to present data in a clear and concise manner.
3. Collaborate with cross-functional teams to understand business requirements and translate them into data analysis and reporting solutions.
4. Utilize Google Apps Script to automate repetitive tasks, streamline data processes, and enhance data analysis capabilities.
5. Perform ad-hoc analysis and data mining to support business initiatives and strategic decision-making.
6. Identify areas for process improvement and optimization based on data-driven insights.
7. Ensure data accuracy, consistency, and integrity by implementing data quality checks and validation processes.
8. Stay up-to-date with industry trends, best practices, and emerging technologies in data analytics and visualization.
9. Communicate findings and recommendations to stakeholders in a clear and concise manner through written reports and presentations.
10. Collaborate with IT and engineering teams to optimize data infrastructure and ensure data accessibility and reliability.

Qualifications:
• Bachelor's degree in Computer Science, Statistics, Mathematics, Economics, or a related field; Master's degree preferred.
• Proven experience as a Data Analyst or in a similar role, with expertise in data analysis, visualization, and interpretation.
• Proficiency in Looker Studio or similar data visualization tools.
• Strong programming skills, particularly in Google Apps Script, Python, or R.
• Experience working with SQL for data extraction and manipulation.
• Familiarity with statistical analysis techniques and machine learning concepts is a plus.
• Excellent analytical and problem-solving skills, with the ability to translate complex data into actionable insights.
• Strong attention to detail and accuracy in all work tasks.
• Excellent communication and interpersonal skills, with the ability to collaborate effectively with cross-functional teams.
• Ability to work independently and manage multiple projects simultaneously in a fast-paced environment.

Job Type: Full-time
Pay: ₹15,000.00 - ₹18,000.00 per month
Benefits: Provident Fund

Application Questions:
• What is your current monthly in-hand salary?
• What is your expected monthly in-hand salary?
• Are you comfortable with a ₹15k-18k monthly in-hand salary?
• Do you know advanced Excel (VLOOKUP, pivot tables, formulas)?
• Do you have experience with Google Apps Script? If yes, how many years?
• How many years of experience do you have with Data Studio?
• What is your age?
• Are you okay with the Cantonment Road, Cuttack, Odisha location?

Education: Secondary (10th Pass) (Preferred)
Work Location: In person

Posted 2 weeks ago

Apply

0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Job Description Experience in SonarQube, CICD, Tekton, terraform, GCS, GCP Looker, Google cloud build, cloud run, Vertex AI, Airflow, TensorFlow, etc., Experience in Train, Build and Deploy ML, DL Models Experience in HuggingFace, Chainlit, React Ability to understand technical, functional, non-functional, security aspects of business requirements and delivering them end-to-end. Ability to adapt quickly with opensource products & tools to integrate with ML Platforms Building and deploying Models (Scikit learn, DataRobots, TensorFlow PyTorch, etc.) Developing and deploying On-Prem & Cloud environments Kubernetes, Tekton, OpenShift, Terraform, Vertex AI Experience in LLM models like PaLM, GPT4, Mistral (open-source models), Work through the complete lifecycle of Gen AI model development, from training and testing to deployment and performance monitoring. Developing and maintaining AI pipelines with multimodalities like text, image, audio etc. Have implemented in real-world Chat bots or conversational agents at scale handling different data sources. Experience in developing Image generation/translation tools using any of the latent diffusion models like stable diffusion, Instruct pix2pix. Expertise in handling large scale structured and unstructured data. Efficiently handled large-scale generative AI datasets and outputs. 
• Familiarity with Docker tooling and Python environments (pipenv/conda/poetry)
• Comfort following Python project management best practices (use of setup.py, logging, pytest, relative module imports, Sphinx docs, etc.)
• Familiarity with GitHub (clone, fetch, pull/push, raising issues and PRs, etc.)
• High familiarity with DL theory and practice in NLP applications
• Comfort coding in Hugging Face, LangChain, Chainlit, TensorFlow and/or PyTorch, scikit-learn, NumPy, and Pandas
• Comfort using two or more open-source NLP modules such as spaCy, TorchText, fastai.text, farm-haystack, and others
• Knowledge of fundamental text data processing (use of regex, token/word analysis, spelling correction/noise reduction in text, segmenting noisy unfamiliar sentences/phrases at the right places, deriving insights from clustering, etc.)
• Real-world experience fine-tuning BERT or other transformer models (sequence classification, NER, or QA), from data preparation and model creation through inference and deployment
• Use of GCP services such as BigQuery, Cloud Functions, Cloud Run, Cloud Build, and Vertex AI
• Good working knowledge of other open-source packages for benchmarking and deriving summaries
• Experience using GPUs/CPUs on cloud and on-prem infrastructure
• Skill set to leverage cloud platforms for Data Engineering, Big Data, and ML needs
• Use of Docker (experience with experimental Docker features, docker-compose, etc.)
• Familiarity with orchestration tools such as Airflow and Kubeflow
• Experience with CI/CD and infrastructure-as-code tools such as Terraform
• Kubernetes or other containerization tooling, with experience in Helm, Argo Workflows, etc.
• Ability to develop APIs with compliant, ethical, secure, and safe AI tooling
• Good UI skills for visualizing and building applications using Gradio, Dash, Streamlit, React, Django, etc.
• A deeper understanding of JavaScript, CSS, Angular, HTML, etc. is a plus
Responsibilities
• Design NLP/LLM/GenAI applications and products following robust coding practices
• Explore SoTA models and techniques so that they can be applied to automotive-industry use cases
• Conduct ML experiments to train/infer models; if need be, build models that abide by memory and latency restrictions
• Deploy REST APIs or a minimalistic UI for NLP applications using Docker and Kubernetes tooling
• Showcase NLP/LLM/GenAI applications to users in the best way possible through web frameworks (Dash, Plotly, Streamlit, etc.)
• Converge multiple bots into super apps using LLMs with multiple modalities
• Develop agentic workflows using AutoGen, Agent Builder, LangGraph
• Build modular AI/ML products that can be consumed at scale
Qualifications
Education: Bachelor's or Master's Degree in Computer Science, Engineering, Maths, or Science. Completion of modern NLP/LLM courses or participation in open competitions is also welcomed.
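The text-processing fundamentals the posting lists (regex cleanup, token/word analysis, noise reduction) can be sketched in a few lines of Python. This is an illustrative sketch only; the sample document and cleanup rules are made up, not part of the posting.

```python
import re
from collections import Counter

def clean_and_tokenize(text: str) -> list[str]:
    """Basic noise reduction and tokenization using regex."""
    text = text.lower()
    text = re.sub(r"https?://\S+", " ", text)  # strip URLs
    return re.findall(r"[a-z]+", text)         # keep alphabetic tokens only

doc = "Deploy the model!! See https://example.com ... deploy, test, deploy."
tokens = clean_and_tokenize(doc)
freq = Counter(tokens)
print(freq.most_common(1))  # [('deploy', 3)]
```

In practice this kind of normalization is the first step before the BERT-style fine-tuning or clustering work the posting describes.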

Posted 2 weeks ago

Apply

3.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

About the Company
JMAN Group is a growing technology-enabled management consultancy that empowers organizations to create value through data. Founded in 2010, we are a team of 450+ consultants based in London, UK, and a team of 300+ engineers in Chennai, India. Having delivered multiple projects in the US, we are now opening a new office in New York to help us support and grow our US client base. We approach business problems with the mindset of a management consultancy and the capabilities of a tech company. We work across all sectors, and have in-depth experience in private equity, pharmaceuticals, government departments, and high-street chains. Our team is as cutting edge as our work. We pride ourselves on being great to work with – no jargon or corporate-speak, flexible to change, and receptive to feedback. We have a huge focus on investing in the training and professional development of our team, to ensure they can deliver high-quality work and shape our journey to becoming a globally recognised brand. The business has grown quickly in the last 3 years with no signs of slowing down.
About the Role
7+ years of experience in managing Data & Analytics service delivery, preferably within a Managed Services or consulting environment.
Responsibilities
• Serve as the primary owner for all managed service engagements across all clients, ensuring SLAs and KPIs are met consistently.
• Continuously improve the operating model, including ticket workflows, escalation paths, and monitoring practices.
• Coordinate triaging and resolution of incidents and service requests raised by client stakeholders.
• Collaborate with client and internal cluster teams to manage operational roadmaps, recurring issues, and enhancement backlogs.
• Lead a 40+ member team of Data Engineers and Consultants across offices, ensuring high-quality delivery and adherence to standards.
• Support the transition from project mode to Managed Services, including knowledge transfer, documentation, and platform walkthroughs.
• Ensure documentation is up to date for architecture, SOPs, and common issues.
• Contribute to service reviews, retrospectives, and continuous improvement planning.
• Report on service metrics, root cause analyses, and team utilization to internal and client stakeholders.
• Participate in resourcing and onboarding planning in collaboration with engagement managers, resourcing managers, and internal cluster leads.
• Act as a coach and mentor to junior team members, promoting skill development and a strong delivery culture.
Qualifications
• ETL or ELT: Azure Data Factory, Databricks, Synapse, dbt (any two – Mandatory).
• Data Warehousing: Azure SQL Server/Redshift/BigQuery/Databricks/Snowflake (any one – Mandatory).
• Data Visualization: Looker, Power BI, Tableau (basic understanding to support stakeholder queries).
• Cloud: Azure (Mandatory); AWS or GCP (Good to have).
• SQL and Scripting: Ability to read/debug SQL and Python scripts.
• Monitoring: Azure Monitor, Log Analytics, Datadog, or equivalent tools.
• Ticketing & Workflow Tools: Freshdesk, Jira, ServiceNow, or similar.
• DevOps: Containerization technologies (e.g., Docker, Kubernetes), Git, CI/CD pipelines (exposure preferred).
Required Skills
• Strong understanding of data engineering and analytics concepts, including ELT/ETL pipelines, data warehousing, and reporting layers.
• Experience with ticketing, issue triaging, SLAs, and capacity planning for BAU operations.
• Hands-on understanding of SQL and scripting languages (Python preferred) for debugging/troubleshooting.
• Proficiency with cloud platforms like Azure and AWS; familiarity with DevOps practices is a plus.
• Familiarity with orchestration and data pipeline tools such as ADF, Synapse, dbt, Matillion, or Fabric.
• Understanding of monitoring tools, incident management practices, and alerting systems (e.g., Datadog, Azure Monitor, PagerDuty).
• Strong stakeholder communication, documentation, and presentation skills.
• Experience working with global teams and collaborating across time zones.
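The role asks for the ability to read and debug Python scripts around SLAs and ticket workflows. A minimal sketch of the kind of script involved might look like the following; the ticket fields and the 4-hour SLA threshold are hypothetical, not drawn from the posting.

```python
from datetime import datetime, timedelta

SLA = timedelta(hours=4)  # hypothetical first-response SLA window

# Hypothetical ticket records with open and first-response timestamps.
tickets = [
    {"id": "T-101", "opened": datetime(2024, 5, 1, 9, 0),
     "responded": datetime(2024, 5, 1, 11, 30)},
    {"id": "T-102", "opened": datetime(2024, 5, 1, 10, 0),
     "responded": datetime(2024, 5, 1, 16, 0)},
]

# Flag tickets whose first response exceeded the SLA window.
breaches = [t["id"] for t in tickets if t["responded"] - t["opened"] > SLA]
print(breaches)  # ['T-102']
```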

Posted 2 weeks ago

Apply

8.0 years

4 - 8 Lacs

Noida

On-site

Job Description
Job ID: MGRSO013625
Employment Type: Regular
Work Style: On-site
Location: Noida, UP, India
Role: Mgr. Software Engineering – Java AI
Company Overview
With 80,000 customers across 150 countries, UKG is the largest U.S.-based private software company in the world. And we’re only getting started. Ready to bring your bold ideas and collaborative mindset to an organization that still has so much more to build and achieve? Read on. At UKG, you get more than just a job. You get to work with purpose. Our team of U Krewers are on a mission to inspire every organization to become a great place to work through our award-winning HR technology built for all. Here, we know that you’re more than your work. That’s why our benefits help you thrive personally and professionally, from wellness programs and tuition reimbursement to U Choose, a customizable expense reimbursement program that can be used for 200+ needs that best suit you and your family, from student loan repayment to childcare to pet insurance. Our inclusive culture, active and engaged employee resource groups, and caring leaders value every voice and support you in doing the best work of your career. If you’re passionate about our purpose – people – then we can’t wait to support whatever gives you purpose. We’re united by purpose, inspired by you.
We are seeking a seasoned Engineering Manager (M3 level) to join our dynamic AI team. As a manager, you will lead a team of talented engineers, driving technical excellence, fostering a culture of ownership, and ensuring the successful delivery of high-impact, complex projects. You will be responsible for guiding technical decisions, managing team performance, and aligning engineering efforts with business goals.
Responsibilities:
Technical Leadership:
• Experience in the AI, modeling, and LLM domain, with hands-on experience with tools like Looker, Cognos, and other analytics tools.
• Provide technical leadership and direction for major projects, ensuring alignment with business goals and industry best practices.
• You are hands-on with code and can jump in to help the team. You maintain a high technical bar, staying closely involved in design and architecture decisions and code reviews, helping your engineers optimize their code, and maintaining high standards of code quality.
• Ensure that high standards of performance, scalability, and reliability are maintained when architecting, designing, and developing complex software systems and applications.
• Ensure accountability for the team’s technical decisions and enforce engineering best practices (e.g., documentation, automation, code management, security principles, leveraging Copilot).
• Play a pivotal role in the R.I.D.E. (Review, Inspect, Decide, Execute) framework.
• Understand CI/CD pipelines from the build and test phases through to deploy.
Team Management:
• Foster a culture of service ownership and enhance team engagement.
• Drive succession planning and engineering efficiency, focusing on quality and developer experience through data-driven approaches.
• Promote a growth mindset, understanding and driving organizational change.
• Actively seek opportunities for team growth and cross-functional collaboration.
• Lead cross-functional teams to design, develop, and deliver high-impact software projects on time and within budget.
• Actively seek opportunities for the team to grow within and across the organization.
• Work with and guide the team on how to operate in a DevOps model, taking ownership from working with product management on requirements through design, development, testing, and deployment, to maintaining the software in production.
Coaching and Development:
• Grow and develop the team technically and with a quality mindset, providing strong and actionable feedback.
• Provide technical mentorship and guidance to engineers at all levels, fostering a culture of learning, collaboration, and continuous improvement.
Execution Excellence:
• Manage team workload and capacity, setting priorities and managing risks and tradeoffs.
• Align team efforts with the strategic direction of the company, understanding the big picture and business needs.
• Demonstrate engineering excellence and service ownership, including cost and quality management of services and effective production management.
• Collaborate effectively with stakeholders across Product, TPM, Security, Cloud, and other teams.
• Make deployment decisions with appropriate risk mitigation.
Qualifications:
• Bachelor’s degree or equivalent practical experience.
• 8 years of experience in software development, with a couple of years in AI and ML.
• 5 years of experience in a technical leadership role overseeing projects, including 2 years of experience in a people management, supervision, or team leadership role.
• 5 years of experience building and developing infrastructure or distributed systems.
• Experience in SaaS enterprises.
• Proven experience in leading engineering teams and driving technical decisions.
• Strong understanding of engineering best practices and service-oriented architecture.
• Excellent communication and leadership skills, with a demonstrated ability to influence and drive change.
• Experience with cloud technologies (Azure, AWS, GCP) and CI/CD pipelines.
• Proven track record of managing high-impact projects and delivering results.
Where we’re going
UKG is on the cusp of something truly special. Worldwide, we already hold the #1 market share position for workforce management and the #2 position for human capital management. Tens of millions of frontline workers start and end their days with our software, with billions of shifts managed annually through UKG solutions today.
Yet it’s our AI-powered product portfolio, designed to support customers of all sizes, industries, and geographies, that will propel us into an even brighter tomorrow! UKG is proud to be an equal opportunity employer and is committed to promoting diversity and inclusion in the workplace, including the recruitment process.
Disability Accommodation in the Application and Interview Process
For individuals with disabilities who need additional assistance at any point in the application and interview process, please email UKGCareers@ukg.com

Posted 2 weeks ago

Apply

1.0 - 3.0 years

1 - 3 Lacs

Noida

On-site

Key Responsibilities:
• Plan, execute, and optimize paid campaigns across Google, Meta, Bing, and Yahoo platforms.
• Develop customized B2B advertising strategies to generate quality leads and increase client acquisition.
• Conduct keyword research, audience targeting, and competitor analysis to refine campaigns.
• Monitor daily performance metrics to understand effectiveness and identify growth opportunities.
• Manage budgets efficiently to maximize ROI and reduce cost-per-lead (CPL).
• Collaborate with sales and content teams to align ad messaging with the buyer’s journey.
• Create and present performance reports with actionable insights to internal teams or clients.
• Stay updated on platform changes, trends, and industry best practices.
Required Skills & Qualifications:
• Proven experience (1–3 years) in managing PPC campaigns across Google, Meta, Yahoo, and Bing.
• Deep understanding of B2B marketing funnels and audience targeting.
• Excellent communication and presentation skills; client-facing experience preferred.
• Hands-on experience with ad tools such as Google Ads Manager, Meta Business Suite, Microsoft Ads, etc.
• Strong analytical mindset with knowledge of Google Analytics, Looker Studio (formerly Data Studio), etc.
• Google Ads and Meta certifications (preferred).
Shift: 10am–7pm, Monday–Friday
Location: NSEZ, Noida Sector 81
Interested candidates can WhatsApp their resume to 9330458358
Job Type: Full-time
Pay: ₹10,000.00 - ₹25,000.00 per month
Schedule: Day shift
Ability to commute/relocate: Noida, Uttar Pradesh: Reliably commute or planning to relocate before starting work (Preferred)
Experience: running ads: 1 year (Preferred)
Work Location: In person
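The budget-management responsibility above is ultimately arithmetic: CPL is spend divided by leads generated, and ROI compares revenue attributed to a campaign against its cost. A quick sketch with made-up campaign figures:

```python
def cpl(spend: float, leads: int) -> float:
    """Cost per lead: total ad spend divided by leads generated."""
    return spend / leads

def roi(revenue: float, spend: float) -> float:
    """Return on investment as a ratio: (revenue - spend) / spend."""
    return (revenue - spend) / spend

# Hypothetical monthly figures for one campaign.
spend, leads, revenue = 50_000.0, 125, 180_000.0
print(cpl(spend, leads))    # 400.0 per lead
print(roi(revenue, spend))  # 2.6, i.e. 260% return
```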

Posted 2 weeks ago

Apply

7.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Department: Customer Success / Client Solutions
Reports to: Director/VP of Customer Success
About the Role:
We are looking for a Senior Technical Customer Success Manager to join our growing team. This is a client-facing role focused on ensuring successful adoption and value realization of our SaaS solutions. The ideal candidate will come from a strong analytics background, possess hands-on skills in SQL and Python or R, and have experience working with dashboarding tools. Prior experience in eCommerce or retail domains is a strong plus.
Responsibilities:
• Own the post-sale customer relationship and act as the primary technical point of contact
• Drive product adoption and usage through effective onboarding, training, and ongoing support
• Work closely with clients to understand business goals and align them with product capabilities
• Collaborate with internal product, engineering, and data teams to deliver solutions and enhancements tailored to client needs
• Analyze customer data and usage trends to proactively identify opportunities and risks
• Build dashboards or reports for customers using internal tools or integrations
• Lead business reviews, share insights, and communicate value delivered
• Support customers in configuring rules, data integrations, and troubleshooting issues
• Drive renewal and expansion by ensuring customer satisfaction and delivering measurable outcomes
Requirements:
• 7+ years of experience in a Customer Success, Technical Account Management, or Solution Consulting role in a SaaS or software product company
• Strong SQL skills and working experience with Python or R
• Experience with dashboarding tools such as Tableau, Power BI, Looker, or similar
• Understanding of data pipelines, APIs, and data modeling
• Excellent communication and stakeholder management skills
• Proven track record of managing mid-to-large enterprise clients
• Experience in eCommerce, retail, or consumer-facing businesses is highly desirable
• Ability to translate technical details into business context and vice versa
• Bachelor’s or Master’s degree in Computer Science, Analytics, Engineering, or a related field
Nice to Have:
• Exposure to machine learning workflows, recommendation systems, or pricing analytics
• Familiarity with cloud platforms (AWS/GCP/Azure)
• Experience working with cross-functional teams in Agile environments
Powered by JazzHR R7ZogzGRkQ
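As a rough sketch of the "analyze usage trends to proactively identify risks" responsibility above, one simple approach compares recent usage against a prior baseline. The data, window size, and flag threshold here are all hypothetical.

```python
def usage_drop(weekly_logins: list[int], window: int = 4) -> float:
    """Fractional drop of the recent window's mean vs the prior window's mean."""
    recent = weekly_logins[-window:]
    prior = weekly_logins[-2 * window:-window]
    prior_mean = sum(prior) / len(prior)
    recent_mean = sum(recent) / len(recent)
    return (prior_mean - recent_mean) / prior_mean

# Hypothetical weekly login counts for one account, oldest first.
logins = [40, 42, 38, 40, 30, 28, 25, 21]
drop = usage_drop(logins)
print(round(drop, 2))  # 0.35 -> worth flagging if, say, drop > 0.25
```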

Posted 2 weeks ago

Apply

17.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Our Mission
Healthcare should work for patients, but it doesn’t. In their time of need, they call down outdated insurance directories. Then wait on hold. Then wait weeks for the privilege of a visit. Then wait in a room solely designed for waiting. Then wait for a surprise bill. In any other consumer industry, the companies delivering such a poor customer experience would not survive. But in healthcare, patients lack market power. Which means they are expected to accept the unacceptable. Zocdoc’s mission is to give power to the patient. To do that, we’ve built the leading healthcare marketplace that makes it easy to find and book in-person or virtual care in all 50 states, across 200+ specialties and 12k+ insurance plans. By giving patients the ability to see and choose, we give them power. In doing so, we can make healthcare work like every other consumer sector, where businesses compete for customers, not the other way around. In time, this will drive quality up and prices down. We’re 17 years old and the leader in our space, but we are still just getting started. If you like solving important, complex problems alongside deeply thoughtful, driven, and collaborative teammates, read on.
Your Impact On Our Mission
As a Senior Marketing Operations Associate, you’ll enable the Lifecycle Marketing Team to reach the right B2B audiences with the right message at the right time, thereby increasing the number of practices using Zocdoc, adopting our full suite of products, and garnering value from Zocdoc’s platform.
You’ll enjoy this role if you...
• Enjoy independently untangling complex problems and finding efficient, scalable solutions
• Are curious and proactive, always looking for ways to make campaigns smarter, faster, or more personalized through data and automation
• Get satisfaction from bringing order to chaos, whether it’s cleaning up messy data or streamlining a process
• Thrive in a cross-functional environment and love being the connective tissue between data, tools, and people
• Find joy in the details and take pride in delivering polished, error-free work
• Are dynamic and adaptive, and prefer working in fast-paced, regularly changing environments
Your day to day is...
• Audience Sizing: Partnering with Lifecycle Marketing Managers, Commercial Analytics, Sales, and Product teams to determine campaign qualifiers, enabling us to predict potential impact and help inform campaign strategy.
• Audience Creation: Extracting, manipulating, and combining data from various sources such as Cistern, Salesforce, Marketo, Braze, and Amplitude to create campaign audiences. Assisting in audience segmentation based on shared characteristics such as job title, specialty, practice location, product selection, practice tenure, EHR, product login, and behavioral activity. Providing data for campaign personalization purposes; this will require data scrubbing to ensure the data can be used for customer-facing purposes.
• Campaign Production: Building and deploying email marketing campaigns end-to-end, including ad hoc sends and multi-step nurture or drip journeys.
• Creating and managing personalized email campaigns that dynamically adapt to customer data and behavior (to be included over time). Assisting with QAing all campaigns to ensure flawless execution and 100% accuracy across copy, links, tracking, and dynamic content.
• Data Hygiene: Helping maintain high data-quality standards by cleaning audience data for customer-facing use, identifying and removing stale or invalid contacts, and implementing best practices to support email deliverability and engagement.
• Performance Reporting: Helping build dashboards that provide clear, actionable reporting on campaign performance across channels (with an emphasis on email).
• Process Optimization: Analyzing current marketing workflows, particularly those related to audience creation, identifying inefficiencies, and recommending improvements to increase campaign velocity and impact.
• Cross-functional Collaboration: Working closely with Marketing, Data, and Sales Ops teams to ensure seamless campaign execution and accurate data flow across systems.
You’d be successful in this role if you have...
• 2–4 years of experience in data analysis and manipulation, ideally for marketing in a B2B environment
• Proven ability to define, size, and segment audiences using multiple data sources and customer attributes
• Strong analytical reasoning skills: the ability to understand the relevant data, ask the right questions, and maintain strong data literacy regarding data definitions
• Advanced proficiency in querying and manipulating data using tools like Google Sheets, Excel, and Snowflake; SQL preferred. Experience querying CRMs like Salesforce also preferred
• Demonstrated interest in learning email marketing principles and best practices (to ensure email messages are properly constructed to meet industry standards), marketing automation platforms like Adobe Marketo and Braze, and customer relationship management tools like Salesforce, CDPs, etc.
• Salesforce experience (preferred)
• Best practices for data hygiene, list management, and email deliverability
• Exceptional attention to detail, especially in QAing campaigns and validating data for customer-facing use
• Familiarity with reporting tools like Looker, Snowflake, or Amplitude, and experience designing reporting dashboards in Excel or a similar tool
• A Bachelor’s degree from a top-tier institution with a minimum of 60% in 10th/12th/graduation
• Strong project management skills
• Excellent verbal and written communication skills
• Independent problem-solving capabilities
Benefits
• An incredible team of smart and supportive people
• A competitive compensation package, including attractive medical insurance
• Amazing perks – think catered lunch every day, ping pong, etc.
• Daycare/creche facility for kids
• The chance to create a better healthcare experience for millions of patients!
• Corporate wellness programs with Headspace
• Cellphone and wifi reimbursement
• Competitive parental leave
• Sabbatical leave (over 5 years)
• Annual sponsored health check-ups
• Zocdoc is certified as a Great Place to Work 2022–2024
About Us
Zocdoc is the country’s leading digital health marketplace that helps patients easily find and book the care they need. Each month, millions of patients use our free service to find nearby, in-network providers, compare choices based on verified patient reviews, and instantly book in-person or video visits online. Providers participate in Zocdoc’s Marketplace to reach new patients, grow their practice, fill their last-minute openings, and deliver a better healthcare experience. Founded in 2007 with a mission to give power to the patient, our work each day in pursuit of that mission is guided by our six core values. Zocdoc is a private company backed by some of the world’s leading investors, and we believe we’re still only scratching the surface of what we plan to accomplish.
Zocdoc is a mission-driven organization dedicated to building teams as diverse as the patients and providers we aim to serve. In the spirit of one of our core values – Together, Not Alone – we are a company that prides itself on being highly collaborative, and we believe that diverse perspectives, experiences, and contributors make our community and our platform better. We’re an equal opportunity employer committed to providing employees with a work environment free of discrimination and harassment. Applicants are considered for employment regardless of race, color, ethnicity, ancestry, religion, national origin, gender, sex, gender identity, gender expression, sexual orientation, age, citizenship, marital or parental status, disability, veteran status, or any other class protected by applicable laws.
Job Applicant Privacy Notice
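The audience-creation work this role describes combines data scrubbing (removing invalid contacts before customer-facing use) with segmentation on shared attributes. A minimal sketch of both steps, with entirely hypothetical contact data and a deliberately simple email check:

```python
import re

# Simplistic validity check for illustration; real deliverability checks go further.
EMAIL_RE = re.compile(r"^[\w.+-]+@[\w-]+\.[\w.-]+$")

# Hypothetical CRM export; fields mirror typical segmentation attributes.
contacts = [
    {"email": "dr.rao@clinic.example", "specialty": "Dermatology", "tenure_months": 26},
    {"email": "not-an-email",          "specialty": "Dentistry",   "tenure_months": 3},
    {"email": "s.iyer@care.example",   "specialty": "Dermatology", "tenure_months": 4},
]

# Data scrubbing: keep only contacts with a valid-looking email address.
clean = [c for c in contacts if EMAIL_RE.match(c["email"])]

# Segmentation: split the scrubbed list by a shared attribute (practice tenure).
new_practices = [c["email"] for c in clean if c["tenure_months"] < 12]
established = [c["email"] for c in clean if c["tenure_months"] >= 12]

print(new_practices)  # ['s.iyer@care.example']
print(established)    # ['dr.rao@clinic.example']
```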

Posted 2 weeks ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies