
3228 Looker Jobs - Page 24

Set up a Job Alert
JobPe aggregates listings for easy access, but you apply directly on the original job portal.

3.0 years

0 Lacs

Mumbai, Maharashtra, India

On-site

Project Role: Data Engineer
Project Role Description: Design, develop, and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform, load) processes to migrate and deploy data across systems.
Must-have skills: Google BigQuery
Good-to-have skills: Google Cloud Data Services, Microsoft SQL Server
Minimum experience required: 3 years
Educational Qualification: 15 years of full-time education

Summary: As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to migrate and deploy data across various systems. You will collaborate with cross-functional teams to understand data requirements and deliver effective solutions that meet business needs, while also troubleshooting any issues that arise in the data flow and processing stages. Your role will be pivotal in enhancing the overall data infrastructure and ensuring that data is accessible and reliable for decision-making purposes.

Secondary Project Role: Analytics and Modeler
Project Role Description: Analyze and model client, market, and key performance data. Use analytical tools and techniques to develop business insights and improve decision-making.

Key Responsibilities: Dataproc, Pub/Sub, Dataflow, Kafka streaming, Looker, SQL (non-negotiable)
1. Proven track record of delivering data integration and data warehousing solutions
2. Strong hands-on SQL (non-negotiable)
3. Experience with data integration and migration projects
4. Proficiency in BigQuery SQL (non-negotiable)
5. Understanding of cloud-native services (bucket storage, BigQuery, Cloud Functions, Pub/Sub, Composer, Kubernetes); experience with cloud data platform services; GCP certifications
6. Experience in shell scripting, Python (non-negotiable), Oracle, and SQL

Technical Experience:
1. Expert in Python (non-negotiable): strong hands-on knowledge of SQL (non-negotiable) and of Python programming with Pandas and NumPy; deep understanding of data structures (dictionary, array, list, tree, etc.); experience with pytest and code coverage preferred
2. Strong hands-on experience building solutions with cloud-native services: bucket storage, BigQuery, Cloud Functions, Pub/Sub, Composer, Kubernetes, etc. (non-negotiable)
3. Proficiency with tools for automating Azure DevOps CI/CD pipelines, such as Control-M, GitHub, JIRA, and Confluence
4. Open mindset and the ability to quickly adopt new technologies
5. Performance tuning of BigQuery SQL scripts
6. GCP certification preferred
7. Experience working in an agile environment

Professional Attributes:
1. Good communication skills
2. Ability to collaborate with different teams and suggest solutions
3. Ability to work independently with little supervision, or as part of a team
4. Good analytical and problem-solving skills
5. Good team-handling skills

Additional Information: The candidate should be ready for Shift B and to work as an individual contributor.
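This posting repeatedly flags BigQuery SQL performance tuning. A minimal sketch of one common technique, partition pruning via the google-cloud-bigquery client, is shown below; the project, dataset, and table names are hypothetical.

```python
# A minimal sketch, assuming the google-cloud-bigquery library and a
# date-partitioned table; all project/dataset/table names are hypothetical.
from google.cloud import bigquery

client = bigquery.Client(project="my-project")  # hypothetical project id

# Filtering on the partition column prunes partitions, a common first step
# when tuning BigQuery SQL for cost and speed.
sql = """
    SELECT user_id, COUNT(*) AS events
    FROM `my-project.analytics.events`
    WHERE event_date BETWEEN '2025-01-01' AND '2025-01-07'
    GROUP BY user_id
    ORDER BY events DESC
    LIMIT 100
"""

for row in client.query(sql).result():  # runs the job and waits for it
    print(row.user_id, row.events)
```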

Posted 2 weeks ago

Apply

5.0 years

0 Lacs

Mumbai, Maharashtra, India

On-site

Project Role: Data Engineer
Project Role Description: Design, develop, and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform, load) processes to migrate and deploy data across systems.
Must-have skills: Google BigQuery
Good-to-have skills: Google Cloud Data Services, Microsoft SQL Server
Minimum experience required: 5 years
Educational Qualification: 15 years of full-time education

Summary: As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to migrate and deploy data across various systems. You will collaborate with cross-functional teams to understand data requirements and deliver effective solutions that meet business needs, while also troubleshooting any issues that arise in the data flow and processing stages. Your role will be pivotal in enhancing the overall data infrastructure and ensuring that data is accessible and reliable for decision-making purposes.

Secondary Project Role: Analytics and Modeler
Project Role Description: Analyze and model client, market, and key performance data. Use analytical tools and techniques to develop business insights and improve decision-making.

Key Responsibilities: Dataproc, Pub/Sub, Dataflow, Kafka streaming, Looker, SQL (non-negotiable)
1. Proven track record of delivering data integration and data warehousing solutions
2. Strong hands-on SQL (non-negotiable)
3. Experience with data integration and migration projects
4. Proficiency in BigQuery SQL (non-negotiable)
5. Understanding of cloud-native services (bucket storage, BigQuery, Cloud Functions, Pub/Sub, Composer, Kubernetes); experience with cloud data platform services; GCP certifications
6. Experience in shell scripting, Python (non-negotiable), Oracle, and SQL

Technical Experience:
1. Expert in Python (non-negotiable): strong hands-on knowledge of SQL (non-negotiable) and of Python programming with Pandas and NumPy; deep understanding of data structures (dictionary, array, list, tree, etc.); experience with pytest and code coverage preferred
2. Strong hands-on experience building solutions with cloud-native services: bucket storage, BigQuery, Cloud Functions, Pub/Sub, Composer, Kubernetes, etc. (non-negotiable)
3. Proficiency with tools for automating Azure DevOps CI/CD pipelines, such as Control-M, GitHub, JIRA, and Confluence
4. Open mindset and the ability to quickly adopt new technologies
5. Performance tuning of BigQuery SQL scripts
6. GCP certification preferred
7. Experience working in an agile environment

Professional Attributes:
1. Good communication skills
2. Ability to collaborate with different teams and suggest solutions
3. Ability to work independently with little supervision, or as part of a team
4. Good analytical and problem-solving skills
5. Good team-handling skills

Additional Information: The candidate should be ready for Shift B and to work as an individual contributor.
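This listing pairs Pandas/NumPy work with pytest and code coverage. A minimal sketch of that workflow, an illustrative ETL cleanup step with its unit test, follows; the function and data are invented for illustration.

```python
# A minimal sketch of the Pandas + pytest style of work the posting describes;
# the transform and its test are illustrative, not from the employer.
import pandas as pd


def deduplicate_latest(df: pd.DataFrame) -> pd.DataFrame:
    """Keep the most recent row per customer_id, a common ETL cleanup step."""
    return (
        df.sort_values("updated_at")
          .drop_duplicates("customer_id", keep="last")
          .reset_index(drop=True)
    )


def test_deduplicate_latest():
    df = pd.DataFrame({
        "customer_id": [1, 1, 2],
        "updated_at": pd.to_datetime(["2025-01-01", "2025-01-02", "2025-01-01"]),
    })
    out = deduplicate_latest(df)
    assert list(out["customer_id"]) == [1, 2]
    # The surviving row for customer 1 should be the later record.
    assert out.loc[out.customer_id == 1, "updated_at"].iloc[0] == pd.Timestamp("2025-01-02")
```

Running `pytest --cov` over a module of such transforms gives the code-coverage signal the posting asks about.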

Posted 2 weeks ago

Apply

3.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Project Role: Data Engineer
Project Role Description: Design, develop, and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform, load) processes to migrate and deploy data across systems.
Must-have skills: Google BigQuery
Good-to-have skills: Microsoft SQL Server, Google Cloud Data Services
Minimum experience required: 3 years
Educational Qualification: 15 years of full-time education

Summary: As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to migrate and deploy data across various systems. You will collaborate with cross-functional teams to understand data requirements and deliver effective solutions that meet business needs, while also troubleshooting any issues that arise in the data flow and processing stages. Your role will be pivotal in enhancing the overall data infrastructure and ensuring that data is accessible and reliable for decision-making purposes.

Secondary Project Role: Analytics and Modeler
Project Role Description: Analyze and model client, market, and key performance data. Use analytical tools and techniques to develop business insights and improve decision-making.

Key Responsibilities: Dataproc, Pub/Sub, Dataflow, Kafka streaming, Looker, SQL (non-negotiable)
1. Proven track record of delivering data integration and data warehousing solutions
2. Strong hands-on SQL (non-negotiable)
3. Experience with data integration and migration projects
4. Proficiency in BigQuery SQL (non-negotiable)
5. Understanding of cloud-native services (bucket storage, BigQuery, Cloud Functions, Pub/Sub, Composer, Kubernetes); experience with cloud data platform services; GCP certifications
6. Experience in shell scripting, Python (non-negotiable), Oracle, and SQL

Technical Experience:
1. Expert in Python (non-negotiable): strong hands-on knowledge of SQL (non-negotiable) and of Python programming with Pandas and NumPy; deep understanding of data structures (dictionary, array, list, tree, etc.); experience with pytest and code coverage preferred
2. Strong hands-on experience building solutions with cloud-native services: bucket storage, BigQuery, Cloud Functions, Pub/Sub, Composer, Kubernetes, etc. (non-negotiable)
3. Proficiency with tools for automating Azure DevOps CI/CD pipelines, such as Control-M, GitHub, JIRA, and Confluence
4. Open mindset and the ability to quickly adopt new technologies
5. Performance tuning of BigQuery SQL scripts
6. GCP certification preferred
7. Experience working in an agile environment

Professional Attributes:
1. Good communication skills
2. Ability to collaborate with different teams and suggest solutions
3. Ability to work independently with little supervision, or as part of a team
4. Good analytical and problem-solving skills
5. Good team-handling skills

Additional Information: The candidate should be ready for Shift B and to work as an individual contributor.
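Among the cloud-native services this posting names, Pub/Sub is central to the pipeline work described. A minimal sketch of publishing a message with the google-cloud-pubsub library follows; the project and topic names are placeholders.

```python
# A minimal sketch of publishing to Google Cloud Pub/Sub, one of the
# cloud-native services the posting lists; project and topic are hypothetical.
from google.cloud import pubsub_v1

publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path("my-project", "ingest-events")  # hypothetical

# publish() returns a future; result() blocks until the server acknowledges
# the message and returns its server-assigned id.
future = publisher.publish(topic_path, data=b'{"order_id": 123}')
print("published message id:", future.result())
```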

Posted 2 weeks ago

Apply

10.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

About Freshworks
Organizations everywhere struggle under the crushing costs and complexities of "solutions" that promise to simplify their lives, create a better experience for their customers and employees, and help them grow. Software is a choice that can make or break a business: it can create better or worse experiences, propel or throttle growth. Business software has become a blocker instead of a way to get work done. There's another option: Freshworks, with a fresh vision for how the world works.

At Freshworks, we build uncomplicated service software that delivers exceptional customer and employee experiences. Our enterprise-grade solutions are powerful, yet easy to use, and quick to deliver results. Our people-first approach to AI eliminates friction, making employees more effective and organizations more productive. Over 72,000 companies, including Bridgestone, New Balance, Nucor, S&P Global, and Sony Music, trust Freshworks' customer experience (CX) and employee experience (EX) software to fuel customer loyalty and service efficiency. And over 4,500 Freshworks employees make this possible, all around the world. Fresh vision. Real impact. Come build it with us.

Job Description

Overview
We're seeking a strategic and insights-driven Operations Manager to join our Talent Acquisition (TA) team. In this role, you will lead the design, implementation, and evolution of TA analytics, dashboards, and reporting frameworks. You'll partner with cross-functional teams across TA, HR, Finance, and Business to deliver data-driven insights that enhance hiring effectiveness, recruiter productivity, and long-term workforce planning. As the analytics lead for TA, you will drive data interpretation, streamline reporting processes, and help shape recruiting strategies aligned with business goals, while also influencing operational improvements with measurable outcomes.

Roles & Responsibilities

TA Metrics & Reporting
- Build and manage dashboards that track full-funnel recruiting performance: source-to-offer ratios, conversion rates, pipeline velocity, time-to-hire, position lifecycle tracking, and DEI metrics.
- Automate recurring reports and build self-serve tools to empower recruiters, TA leaders, and business stakeholders.
- Standardize metric definitions and data governance to ensure reporting consistency across regions and functions.
- Incorporate TA capacity and bandwidth models to help optimize team utilization and forecast recruitment support needs.

Data Interpretation & Storytelling
- Translate raw data into actionable insights for diverse audiences, including recruiters, TA leadership, and business heads.
- Communicate hiring trends, risks, and opportunities using effective visualizations and executive-ready narratives.
- Support ongoing evaluation of recruiter performance, quality-of-hire, and hiring velocity using data-driven methods.

System & Tool Integration
- Work across platforms like Greenhouse, Lever, SmartRecruiters, or Workday to extract, interpret, and visualize data.
- Partner with HRIS and TA Ops to ensure accurate ATS/HRIS data pipelines and overcome system limitations.
- Drive tool adoption by simplifying access to data and training end-users on dashboards and insights tools.

Project Management & Optimization
- Own end-to-end delivery of analytics projects, from stakeholder scoping to final delivery and adoption.
- Gather, document, and prioritize reporting needs from TA, HRBPs, and business leaders in a structured and scalable way.
- Use data to identify inefficiencies in the recruiting process and collaborate with TA Ops to design and implement solutions.
- Prepare QBR resources and performance summaries for TA Leads aligned to their respective management team members.

Strategy Alignment & Change Management
- Ensure analytics frameworks support broader TA goals like headcount planning, DEI tracking, and recruiter capacity modeling.
- Communicate changes in metrics or reporting methodologies clearly to ensure buy-in and accurate usage.
- Support global reporting efforts while adhering to data privacy standards (e.g., GDPR, EEOC).

Qualifications
- 6-10 years of experience in data analytics or business intelligence, with at least 3 years in Talent Acquisition or People Analytics.
- Proven ability to manage multiple stakeholders and deliver analytics solutions in a high-growth or fast-paced environment.
- Experience working with recruiting metrics, reporting tools, and applicant tracking systems.
- Bachelor's or Master's degree in Data Science, Statistics, Business Analytics, Engineering, or related fields.

Key Skills

Technical & Analytical
- Expertise in TA analytics, including source effectiveness, funnel conversion, pipeline velocity, and recruiter efficiency.
- Strong skills in Excel/Google Sheets (advanced formulas, pivot tables, modeling).
- Proficiency with BI tools (e.g., Tableau, Power BI, Looker) for dashboard development and data visualization.
- Working knowledge of SQL for querying ATS/HRIS data (preferred but not mandatory).

Systems & Integration
- Familiarity with ATS/HRIS platforms like Greenhouse, Lever, SmartRecruiters, or Workday.
- Understanding of reporting limitations and data structures within recruiting systems.
- Ability to drive dashboard automation and build scalable, self-serve tools for recruiting teams.

Project & Process Management
- Skilled in managing cross-functional analytics projects end-to-end.
- Experience in requirements gathering, timeline setting, and prioritization in a fast-paced environment.
- Ability to identify bottlenecks in TA workflows and implement data-backed improvements.

Communication & Business Acumen
- Strong storytelling and stakeholder engagement skills; comfortable presenting to recruiters, hiring managers, and executives.
- Understanding of how analytics drives TA goals such as DEI, recruiter productivity, and headcount planning.
- Ability to work independently in ambiguous environments and translate business challenges into data solutions.

Bonus / Differentiators
- Experience with predictive analytics (e.g., hiring forecasts, attrition modeling).
- Exposure to global hiring data and cross-regional reporting frameworks.
- Knowledge of data privacy regulations in recruitment analytics (e.g., GDPR, EEOC).

Additional Information
At Freshworks, we are creating a global workplace that enables everyone to find their true potential, purpose, and passion, irrespective of their background, gender, race, sexual orientation, religion, and ethnicity. We are committed to providing equal opportunity for all and believe that diversity in the workplace creates a more vibrant, richer work environment that advances the goals of our employees, communities, and the business.
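To make the full-funnel metrics above concrete, here is a minimal sketch computing stage conversion and average time-to-hire from a hypothetical ATS export with pandas; the data and stage names are invented.

```python
# A minimal sketch of the funnel metrics named above (conversion rates,
# time-to-hire), computed from a hypothetical ATS export with pandas.
import pandas as pd

candidates = pd.DataFrame({
    "stage":   ["sourced", "screen", "onsite", "offer", "hired",
                "sourced", "screen", "sourced"],
    "applied": pd.to_datetime(["2025-01-02"] * 8),
    "hired":   pd.to_datetime([None, None, None, None, "2025-02-10",
                               None, None, None]),
})

# Stage-to-stage conversion: share of candidates at or beyond each stage.
order = ["sourced", "screen", "onsite", "offer", "hired"]
reached = candidates["stage"].value_counts().reindex(order, fill_value=0)
funnel = reached[::-1].cumsum()[::-1]   # candidates at-or-beyond each stage
print(funnel / funnel.iloc[0])          # conversion from top of funnel

# Average time-to-hire in days for hired candidates.
hired = candidates.dropna(subset=["hired"])
print((hired["hired"] - hired["applied"]).dt.days.mean())
```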

Posted 2 weeks ago

Apply

3.0 - 5.0 years

0 Lacs

Mumbai, Maharashtra, India

On-site

Job Role & Responsibilities:
- Execute and operate high-return campaigns on Google, Meta, LinkedIn, and more, with proficiency in both ecommerce and lead-generation campaigns.
- Proven track record of success while managing monthly spends above INR 2 lakhs per client.
- Experience and ability to multi-task and manage 4-5 accounts at the same time.
- Ability to analyze and report on data over various time periods, both within the team and for clients.
- Insight into performance and the ability to substantiate optimizations and improvements.
- A creative approach to ad creatives and copy for clients.
- A good grasp of consumer behavior, trends, and best practices.
- Versatility in managing diverse industries, including but not limited to D2C, fashion, and real estate.

Skills Requirement
- 3-5 years of experience in performance marketing, preferably in a digital marketing agency
- Knowledge of dashboards and conversion-tracking tools such as Google Tag Manager, GA4, Meta Pixel, and Meta Events Manager (a plus)
- Knowledge of tools like SEMrush, Moz, and Looker Studio (a plus)
- Leadership qualities and practicality
- Initiative and eagerness to work

If you are passionate about joining us and eager to contribute your skills, we'd love to hear from you. For more information, visit: https://bregobusiness.com/
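As a concrete illustration of the period-over-period campaign reporting this role involves, the sketch below computes return on ad spend (ROAS) by month and channel with pandas; all figures are invented.

```python
# A minimal sketch of period-over-period campaign reporting; the spend and
# revenue figures are invented for illustration.
import pandas as pd

df = pd.DataFrame({
    "month":   ["2025-05", "2025-05", "2025-06", "2025-06"],
    "channel": ["Google", "Meta", "Google", "Meta"],
    "spend":   [120000, 90000, 150000, 110000],   # INR
    "revenue": [540000, 310000, 690000, 420000],  # INR
})

report = df.groupby(["month", "channel"]).sum()
report["roas"] = report["revenue"] / report["spend"]  # return on ad spend
print(report)
```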

Posted 2 weeks ago

Apply

0.0 - 1.0 years

0 Lacs

Thane, Maharashtra

On-site

Job Information
Date Opened: 07/21/2025
Job Type: Full time
Industry: FMCG/Foods/Beverage
Work Experience: 0-1 year
City: Thane
State/Province: Maharashtra
Country: India
Zip/Postal Code: 400601

About Us
Plum is one of India's science-first, vegan, premium beauty brands, with a strong portfolio of skincare and haircare products. By smartly combining research-backed actives with the chemistry of botanical ingredients, Plum creates formulas that truly resonate. At the heart of everything, Plum has a simple message: "We have chemistry - with each other, with our product, and especially with our customer." The brand is driven by a strong leadership and investor team, focused on building value for people, the planet, and profit-sharing participants.

Job Description

Purpose – Why does it exist?
The main purpose of this role is to maintain good data health and analytics practices in the company. This role will help the company adopt data-driven decision-making in all its functions.

Key Performance Indicators
Standardizing data practices for the company, creating dashboards, and providing insights.

Key Responsibilities
- Gathering unstructured data from different departments in the company, and collating and maintaining a data warehouse
- Building comprehensive reports and guiding departments such as sales, marketing, supply chain, and finance by identifying trends, formulating and testing hypotheses from data, and providing actionable insights
- End-to-end problem solving for the business, from identifying gaps and opportunities to proposing innovative changes
- Supporting key functions by responding to ad-hoc data and dashboarding requests
- Potential future responsibilities: using sophisticated statistical techniques to solve business problems (predictive modelling, optimization algorithms, etc.)

Experience & Qualification
- Preferred: 0-1 years of experience in data manipulation (R, Python, or similar) and data visualisation (Power BI, Looker, etc.)
- Mandatory: MS Excel, SQL
- Strong problem solving and analytical thinking

Location: Thane, Mumbai (WFO)
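For the "formulating and testing hypotheses from data" responsibility above, a minimal sketch using a Welch t-test from SciPy follows; the order-value data and the 0.05 threshold are illustrative assumptions.

```python
# A minimal sketch of hypothesis testing: comparing average order value
# across two channels. Data and the 0.05 threshold are illustrative.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
orders_online = rng.normal(650, 120, size=200)   # hypothetical AOVs, INR
orders_retail = rng.normal(610, 130, size=180)

# Welch's t-test does not assume equal variances between the groups.
t_stat, p_value = stats.ttest_ind(orders_online, orders_retail, equal_var=False)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
if p_value < 0.05:
    print("Channels differ significantly in average order value.")
```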

Posted 2 weeks ago

Apply

2.0 years

0 Lacs

Dehradun, Uttarakhand, India

On-site

About Us: Naturally Pahadi is a Dehradun-based D2C food brand. We are looking for a digital growth expert who can build our digital presence and boost our online sales.

Experience: 2+ years
Salary: Based on experience
Placement Type: Full-time, permanent position
Location: Dehradun

What do you need for this opportunity?
Must-have skills: Google Ads, Meta Ads Manager, attention to detail, digital advertising, Google Analytics, LinkedIn Ads

About The Role
We are looking for an experienced, data-driven digital marketing/advertising specialist responsible for developing and implementing digital advertising strategies. The ideal candidate will have a deep understanding of the customer journey and extensive experience with paid advertising channels, and should be confident in developing and implementing strategies and tactics that involve channels targeting users at various stages of their journey. This role requires working in a small team with tight budgets, handling everything from creative development to implementation and evaluation.

Requirements
- 2+ years' experience in a digital advertising role
- Agency experience in paid digital advertising
- Experience working in Google Analytics 4, Looker Studio, and other dashboard/reporting tools
- Ability to develop and work with ecommerce platforms such as Shopify
- Experience managing multiple paid campaigns simultaneously in Google Ads, Meta Ads Manager, LinkedIn Ads, and other ad platforms
- Solid understanding of concepts like measurement planning, customer journeys, digital advertising strategies, and lead generation
- Data-driven, with the ability to interpret data and turn it into understandable, actionable insights
- Proven track record of working with teams in creative, design, and development
- Passion for the digital advertising industry and evolving ad platforms, policies, and opportunities
- Excellent written and verbal communication skills

Roles
- Digital campaign strategy planner
- Digital campaign implementer
- Digital subject matter expert
- Digital campaign optimizer
- Google Analytics advisor

Key Accountabilities
- Development of paid-campaign digital strategies
- Identify target audience segments on paid channels
- Recommend digital media budgets
- Configure campaign conversion tracking
- Work with the design team to optimize campaign landing pages
- Set up digital campaigns on all major paid channels, including Google Ads, Facebook, Instagram, LinkedIn, and other ad platforms
- Optimize campaigns on all paid channels to achieve maximum ROI
- Provide marketing/campaign performance reporting using Google Analytics and Looker Studio
- Provide insight and analyse data as a digital subject matter expert

Competencies
- Detail-oriented: doesn't miss details, follows plans, uses checklists
- Organised: maintains Google Drive folder structure and ad campaign organization
- Collaborative: works well with other people; no egos, just ideas
- Strategic: looks at problems from multiple angles to determine the best approach
- Pragmatic: finds the best way, all things considered
- Communication: speaks and writes clearly and articulately without being overly verbose; maintains this standard in all forms of written communication, including email
- Teamwork: reaches out to peers and cooperates with the team to establish an overall collaborative working relationship
- Calm under pressure: maintains stable performance when under pressure or stress
- Flexibility/adaptability: adjusts quickly to changing priorities and conditions; copes effectively with complexity and change; learns quickly
- Proactive: motivated self-starter; doesn't wait for others; acts without being told what to do
- Assertive: confident; takes charge
- Work ethic: possesses a strong willingness to work hard and get the job done
- Enthusiasm: exhibits passion and excitement over work; has a can-do attitude

Posted 2 weeks ago

Apply

3.0 years

0 Lacs

Kochi, Kerala, India

On-site

Data Analyst (Data Visualization & Reporting)

Key Responsibilities
- Work with large datasets from various inspection sources (LiDAR, drones, thermal imaging).
- Build insightful dashboards and reports using tools like Power BI, Tableau, or Looker.
- Develop and deploy predictive models and statistical analyses to detect anomalies and prevent failures.
- Collaborate with engineering and operations teams to translate complex data into operational insights.
- Ensure high-quality, clean, and consistent data by implementing validation pipelines.
- Apply basic electrical domain knowledge (fault detection, insulator/conductor analysis, etc.) for enriched interpretations.
- Continuously improve analysis workflows and automate repetitive data processes.

Required Skills & Experience
- 3+ years of hands-on experience as a Data Analyst/Data Scientist.
- Strong skills in SQL and Python (Pandas, NumPy) or R for data manipulation.
- Proficiency in data visualization tools: Power BI, Tableau, Looker, etc.
- Experience working with time-series or sensor-based data from industrial sources.
- Exposure to predictive analytics, ML algorithms, or data modeling techniques.
- Solid understanding of data pipelines and best practices in data management.
- Familiarity with AWS/Azure/GCP for data processing is a plus.
- Background or familiarity with geospatial data or tools like QGIS is a bonus.

Preferred Qualifications
- Degree in Data Science, Engineering, Computer Science, or a related field.
- Prior experience with inspection data, IoT, or utilities/power transmission systems.
- Knowledge of domain-specific platforms used for power line inspections.
- Certification in data analysis/ML platforms (Google Data Analytics, Microsoft DA, etc.).

Soft Skills
- Strong analytical thinking and attention to detail.
- Ability to convert technical findings into business-focused insights.
- Team player with cross-functional collaboration experience.
- Effective written and verbal communication skills.
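To illustrate the anomaly-detection work on sensor data described above, here is a minimal sketch using a rolling z-score in pandas; the readings are synthetic and the threshold is a tunable assumption.

```python
# A minimal sketch of anomaly detection on sensor-style time-series data
# using a rolling z-score; the readings are synthetic.
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
temps = pd.Series(rng.normal(40.0, 1.5, size=500))  # e.g. conductor temperature
temps.iloc[300] += 12.0                             # inject a fault-like spike

rolling_mean = temps.rolling(window=50, min_periods=50).mean()
rolling_std = temps.rolling(window=50, min_periods=50).std()
z = (temps - rolling_mean) / rolling_std

anomalies = temps[z.abs() > 4]   # the threshold is a tunable assumption
print(anomalies)                 # flags the injected spike at index 300
```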

Posted 2 weeks ago

Apply

8.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Job Title: Ecommerce SME Analyst

Summary
We are seeking an experienced and driven Ecommerce SME Analyst with 8 years of expertise in digital analytics and ecommerce data. In this role, you will analyze clickstream and user behavior data to uncover actionable insights that enhance user experience, optimize conversion funnels, and inform strategic product decisions. You will work extensively with Adobe Analytics, Python, SQL, and BigQuery, and collaborate with cross-functional teams to drive data-informed growth across our ecommerce platform.

Key Responsibilities
- Clickstream & Ecommerce Analysis: Analyze ecommerce clickstream data using Adobe Analytics to understand user journeys, identify drop-off points, and recommend optimizations for improved engagement and conversion.
- User Behavior Insights: Segment and analyze user behavior to uncover patterns, preferences, and opportunities for personalization and targeting.
- Data Extraction & Transformation: Use SQL and Python to query, clean, and transform large datasets from BigQuery and other data sources.
- Visualization & Reporting: Build dashboards and reports using visualization tools (e.g., Tableau, Looker, Power BI) to communicate insights clearly to stakeholders.
- Product Strategy Support: Partner with product and analytics teams to translate data insights into actionable recommendations that shape the product roadmap.
- KPI Definition & Tracking: Define and monitor key performance indicators (KPIs) to evaluate the impact of product features and site changes.
- A/B Testing Analysis: Design and analyze A/B tests to assess the effectiveness of new features and user experience improvements.
- Cross-Functional Collaboration: Work closely with product managers, marketers, and engineers to understand data needs and deliver timely, relevant insights.
- Data Quality Assurance: Ensure data accuracy and integrity through validation checks and collaboration with data engineering teams.
- Continuous Learning: Stay current with industry trends, tools, and best practices in ecommerce analytics and data science.

Required Qualifications
- Bachelor's degree in Statistics, Mathematics, Computer Science, Economics, or a related field.
- Minimum 2 years of experience in ecommerce analytics.
- Strong hands-on experience with Adobe Analytics for tracking and analyzing user behavior.
- Proficiency in SQL and Python (including libraries like Pandas, NumPy, Matplotlib, Seaborn).
- Experience working with Google BigQuery or similar cloud-based data warehouses.
- Familiarity with data visualization tools (e.g., Tableau, Looker, Power BI).
- Strong analytical and problem-solving skills.
- Excellent communication skills to present findings to technical and non-technical audiences.
- Ability to work independently and collaboratively in a fast-paced environment.

Key Skills
Adobe Analytics; Python (Pandas, NumPy, Matplotlib, Seaborn); SQL; Google BigQuery; ecommerce analytics; clickstream and user behavior analysis; data visualization and reporting; A/B testing; product strategy and KPI tracking; communication and collaboration; data quality and validation

Key Details
Job Function: IT Software: Software Products & Services
Industry: IT-Software
Specialization: Information Systems
Employment Type: Full Time
Mandatory Skills: Adobe Analytics, Python, SQL, BigQuery, ecommerce domain

About Company
Company: LTIMindtree (job posted by LTIMindtree Ltd.)
LTIMindtree is a global technology consulting and digital solutions company that enables enterprises across industries to reimagine business models. As a digital transformation partner to more than 750 clients, LTIMindtree brings extensive domain and technology expertise to help drive superior competitive advantage. Powered by nearly 90,000 talented and entrepreneurial professionals across more than 30 countries, LTIMindtree, a Larsen & Toubro Group company, combines the industry-acclaimed strengths of erstwhile Larsen & Toubro Infotech and Mindtree in solving the most complex business challenges and delivering transformation at scale.
Job Id: 71587467
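For the A/B testing analysis this listing calls for, a minimal sketch of a two-proportion z-test with statsmodels follows; the conversion counts are invented for illustration.

```python
# A minimal sketch of A/B test analysis: comparing conversion rates of two
# variants with a two-proportion z-test. Counts are illustrative, and
# statsmodels is an assumed dependency.
from statsmodels.stats.proportion import proportions_ztest

conversions = [412, 475]   # variant A, variant B
visitors = [10000, 10050]

z_stat, p_value = proportions_ztest(count=conversions, nobs=visitors)
print(f"z = {z_stat:.2f}, p = {p_value:.4f}")
# A small p-value suggests the difference in conversion rate is unlikely to
# be chance; pair it with effect size before shipping the variant.
```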

Posted 2 weeks ago

Apply

5.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

About Highspot
Highspot is a software product development company and a recognized global leader in the sales enablement category, leveraging cutting-edge AI and GenAI technologies at the core of its robust Software-as-a-Service (SaaS) platform. Highspot is revolutionizing how millions of individuals work worldwide. Through its AI-powered platform, Highspot drives enterprise transformation to empower sales teams through intelligent content management, training, contextual guidance, customer engagement, meeting intelligence, and actionable analytics. The Highspot platform delivers advanced features tailored to business needs, in a modern design that sales and marketing executives appreciate, and is the #1 rated sales enablement platform on G2 Crowd. While headquartered in Seattle, Highspot has expanded its footprint across America, Canada, the UK, Germany, Australia, and now India, solidifying its presence in the Asia Pacific markets.

About The Role
As a Senior Product Researcher at Highspot, you will influence decision-making at all levels to ensure we meet user needs, market expectations, and business goals.

Responsibilities
- Foster a user-centric and data-centric culture, ensuring that product development is grounded in a deep understanding of user needs, preferences, and behaviors.
- Scope and drive end-to-end research projects, from research roadmapping, study planning, execution, and analysis to socializing insights that influence decisions.
- Leverage analytics dashboards to understand user behavior and drive business value.
- Proactively partner with leadership to influence our roadmap.

Required Qualifications
- 5+ years conducting research in user experience, product design, or product marketing.
- Expert understanding of multiple qualitative and quantitative research methodologies.
- Strong ability to lead complex projects with positive business outcomes.
- Proven record of championing a user- and data-centric culture across partners and stakeholders.
- Excellent communication, presentation, and collaboration skills.
- Highly desired: experience leveraging data analytics/behavioral data tools (e.g., Tableau, Looker, Amplitude) to craft holistic and contextual insights.

Equal Opportunity Statement
We are an equal opportunity employer and value diversity at our company. We do not discriminate on the basis of age, ancestry, citizenship, color, ethnicity, family or medical care leave, gender identity or expression, genetic information, marital status, medical condition, national origin, physical or invisible disability status, political affiliation, veteran status, race, religion, or sexual orientation. Did you read the requirements as a checklist and not tick every box? Don't rule yourself out! If this role resonates with you, hit the 'apply' button.

Posted 2 weeks ago

Apply

1.0 - 31.0 years

2 - 2 Lacs

Bara Bazar, Kolkata/Calcutta

On-site

Job Opening: Data Management Executive

We're seeking a detail-oriented and analytical Data Management Executive to join our team!

Key Responsibilities:
- Develop and implement effective data management procedures
- Create and monitor flow chart and FMS systems for office processes
- Support executives/departments in daily data system usage
- Monitor and evaluate information/data systems for analytical results
- Manage incoming data files and develop data management strategies
- Assist management decision-making with employee performance reports
- Create Google Forms to track processes and extract data when needed

Requirements:
- Proficiency in Google Forms, Apps Script, and Looker Studio (hands-on experience essential)
- Strong analytical and problem-solving skills
- Excellent communication and organizational skills

What We Offer:
- Competitive salary: ₹22,000 per month
- Opportunity to work with a dynamic team
- Professional growth and development opportunities

Posted 2 weeks ago

Apply

0.0 - 1.0 years

0 Lacs

Thane, Maharashtra, India

On-site

Purpose – Why does it exist?
The main purpose of this role is to maintain good data health and analytics practices in the company. This role will help the company adopt data-driven decision-making in all its functions.

Key Performance Indicators
Standardizing data practices for the company, creating dashboards, and providing insights.

Key Responsibilities
- Gathering unstructured data from different departments in the company, and collating and maintaining a data warehouse
- Building comprehensive reports and guiding departments such as sales, marketing, supply chain, and finance by identifying trends, formulating and testing hypotheses from data, and providing actionable insights
- End-to-end problem solving for the business, from identifying gaps and opportunities to proposing innovative changes
- Supporting key functions by responding to ad-hoc data and dashboarding requests
- Potential future responsibilities: using sophisticated statistical techniques to solve business problems (predictive modelling, optimization algorithms, etc.)

Experience & Qualification
- Preferred: 0-1 years of experience in data manipulation (R, Python, or similar) and data visualisation (Power BI, Looker, etc.)
- Mandatory: MS Excel, SQL
- Strong problem solving and analytical thinking

Location: Thane, Mumbai (WFO)

Posted 2 weeks ago

Apply

5.0 years

0 Lacs

Mumbai Metropolitan Region

On-site

Who We Are
Alpaca is a California (US) headquartered brokerage infrastructure technology company and self-clearing broker-dealer, delivering execution and custody solutions for stocks, ETFs, options, cryptocurrencies, and more, and has raised over $170 million in funding. Through its subsidiaries, Alpaca is a licensed financial services company in multiple countries, and we serve hundreds of financial institutions globally, such as broker-dealers, investment advisors, hedge funds, and crypto exchanges. Alpaca's globally distributed team members bring diverse experiences as engineers, traders, and brokerage professionals to achieve our mission of opening financial services to everyone on the planet. We are also deeply committed to open-source contributions and fostering a vibrant community. We will continue to enhance and improve our award-winning, developer-friendly API and the infrastructure behind it.

Our Team Members
We're a team of 200+ globally distributed members who love working from our favorite places worldwide. Our team spans the USA, Canada, Japan, Hungary, Nigeria, Brazil, the United Kingdom, and more! We're looking for candidates eager to join Alpaca's growing organization who are excited about our mission of "Open financial services to everyone on the planet" and share our values of "Stay Curious," "Have Empathy," and "Be Accountable."

Your Role
Our Clearing Operations team at Alpaca is seeking a Reconciliations Specialist to lead the end-to-end process of reconciling the firm's back-office accounts, including suspense, wash, bank, custodian, and other proprietary accounts. In this role, you will be responsible for preparing, reviewing, resolving exceptions, and ensuring timely follow-up across all reconciliation activities. You'll work extensively in Excel or Google Sheets to verify that the firm's books and records are accurate, complete, and in balance at all times.

Things You Get To Do:
- Own daily, weekly, and monthly reconciliations across suspense, wash, bank, and proprietary accounts.
- Investigate and resolve breaks with urgency and accuracy, collaborating across internal teams and external partners.
- Reconcile firm books and records against clearing systems, custodian records, bank feeds, and external files to ensure zero-dollar variances.
- Drive the cleanup and aging analysis of historical reconciliation items, ensuring root-cause resolution and documentation.
- Partner with engineering and product teams to improve reconciliation tools, reporting, and workflow automation.
- Maintain and enhance reconciliation policies, procedures, and controls as the business scales.
- Support audit, regulatory, and operational due diligence requests related to firm balances and reconciliation controls.

Who You Are (Must-Haves):
- 2-5 years of experience in financial operations, preferably in a broker-dealer, clearing firm, or fintech environment.
- Deep understanding of reconciliation processes, with hands-on experience in bank, custodian, suspense, and internal accounts.
- Expert-level Excel or Google Sheets skills, including pivots, lookups, and complex formulas.
- Strong SQL background, with the ability to build join-based (inner/left/right) reconciliations across distinct tables.
- Strong attention to detail and a relentless focus on getting to zero breaks.
- Comfortable navigating ambiguity and solving problems proactively.
- Clear communicator with the ability to summarize findings for technical and non-technical stakeholders.
- Able to prioritize and own multiple deliverables in a fast-paced, high-growth environment.
- Possess a FINRA SIE, or obtain one within the first 180 days of employment.
- Must be able to follow the company tenets: Stay Curious, Have Empathy, Be Accountable.

Who You Might Be (Nice-to-Haves):
- FINRA Series 7.
- Bachelor's degree in Computer Science, Finance, Data Analytics, or a related field.
- Experience writing and optimizing SQL queries to support reconciliation and exception reporting.
- Expertise with data visualization tools such as Power BI, Tableau, or Looker for tracking and presenting reconciliation metrics.
- Prior exposure to reconciliation or exception management tools (e.g., SmartStream TLM, Duco, FIS IntelliMatch).
- Knowledge of broker-dealer operations, clearing, or custody workflows.
- Experience working with APIs or flat-file data sources to automate reconciliation tasks.
- Comfort working in a startup or fast-paced fintech environment where priorities shift quickly.

How We Take Care of You:
- Competitive salary and stock options
- Health benefits
- New-hire home-office setup: one-time USD $500
- Monthly stipend: USD $150 per month via a Brex Card

Alpaca is proud to be an equal opportunity workplace dedicated to pursuing and hiring a diverse workforce.

Recruitment Privacy Policy
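To make the join-based SQL reconciliation requirement concrete, here is a minimal sketch using sqlite3 so it runs anywhere; the table names and amounts are hypothetical.

```python
# A minimal sketch of a join-based reconciliation, using sqlite3 so it runs
# anywhere; table names and amounts are hypothetical.
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE firm_books (trade_id TEXT PRIMARY KEY, amount NUMERIC);
    CREATE TABLE custodian  (trade_id TEXT PRIMARY KEY, amount NUMERIC);
    INSERT INTO firm_books VALUES ('T1', 100.00), ('T2', 250.50), ('T3', 75.25);
    INSERT INTO custodian  VALUES ('T1', 100.00), ('T2', 250.00);
""")

# LEFT JOIN keeps every firm-side row, surfacing missing or mismatched
# custodian records as breaks to investigate.
breaks = con.execute("""
    SELECT b.trade_id, b.amount AS firm_amt, c.amount AS cust_amt
    FROM firm_books b
    LEFT JOIN custodian c ON c.trade_id = b.trade_id
    WHERE c.trade_id IS NULL OR c.amount != b.amount
""").fetchall()
print(breaks)   # T2 is a quantity break; T3 is missing on the custodian side
```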

Posted 2 weeks ago

Apply

1.0 years

3 - 4 Lacs

Jaipur

On-site

Location: Raniwala Jewellers, Hawa Sadak, 22 Godam, Jaipur
Experience: 1+ years

Roles & Responsibilities:
- Administrative Support: Manage calendars, schedule meetings, and coordinate travel arrangements. Prepare and review correspondence, reports, and presentations.
- Meeting Coordination: Organize and facilitate meetings, including preparing agendas, taking minutes, and following up on action items.
- Communication Management: Act as a point of contact between executives and internal/external stakeholders. Handle confidential information with discretion.
- Office Management: Oversee office supplies and equipment. Maintain a well-organized and efficient office environment.
- Project Assistance: Support various projects by providing research, coordination, and administrative assistance as needed.
- Event Planning: Assist in planning and executing company events, conferences, and other special activities.
- Task Prioritization: Handle ad-hoc tasks and special projects as requested by executives, ensuring timely completion.

Soft skills: Advanced Excel (for reporting and analysis), Google Sheets (for reports and analysis), Google Workspace (Calendar, Keep, maintaining Drive, and other tools)

Requirements:
1. Bachelor's degree in Management Information Systems, Computer Science, or a related field.
2. Proven experience in MIS reporting and data analysis, with a minimum of 1 year in a similar role.
3. Advanced Excel knowledge, including pivot tables, VLOOKUP, and macros.
4. Proficiency in Google Sheets formulas and functions for data manipulation and analysis.
5. Familiarity with Google Workspace applications such as Google Docs, Sheets, and Drive.
6. Basic understanding of Looker Studio for data visualization and analytics.
7. Strong analytical skills with the ability to translate data into actionable insights.
8. Excellent communication and presentation skills.
9. Detail-oriented with a focus on data accuracy and quality.
10. Ability to work independently and collaborate effectively with cross-functional teams.

Job Type: Full-time
Pay: ₹25,000.00 - ₹35,000.00 per month

Application Questions:
- Mention current salary
- Mention expected salary
- How soon can you join?

Language: English (Preferred)
Work Location: In person

Posted 2 weeks ago

Apply

10.0 years

0 Lacs

Chennai, Tamil Nadu, India

Remote

Job Description
Are you ready to make an impact at DTCC? Do you want to work on innovative projects, collaborate with a dynamic and supportive team, and receive investment in your professional development? At DTCC, we are at the forefront of innovation in the financial markets. We're committed to helping our employees grow and succeed. We believe that you have the skills and drive to make a real impact. We foster a thriving internal community and are committed to creating a workplace that looks like the world that we serve.

Pay And Benefits
- Competitive compensation, including base pay and annual incentive
- Comprehensive health and life insurance and well-being benefits, based on location
- Pension/retirement benefits
- Paid time off and personal/family care, and other leaves of absence when needed to support your physical, financial, and emotional well-being
DTCC offers a flexible/hybrid model of 3 days onsite and 2 days remote (onsite Tuesdays, Wednesdays, and a third day unique to each team or employee).

The Impact You Will Have In This Role
Being a member of the Data Services Platform Delivery team means being part of a technology team with a rich, diverse skill set and a phenomenal, hard-working, committed group of colleagues. Whether the project calls for Snowflake, Java, the Spring suite, Python, data analytics, Unix, cloud computing, or database skills, we are there for each other, collaborating to achieve the common goal. We are embarking on an incredible multi-year data transformation journey, and we are looking for best-of-breed software engineers to join us. We're looking for a passionate engineer to help design and build platforms that power the next generation of data products. In this role you will be responsible for building platforms for next-generation data products. You'll work within the Data Platform Squad to develop secure, resilient, scalable solutions in Snowflake, Java, or Python, delivered to the marketplace via multiple delivery mechanisms. The solution will be built with the latest cloud tools and industry standards. This role offers strong opportunities for growth driven by your performance and contributions to our strategic goals.

Qualifications
- Minimum 10 years of related experience
- Bachelor's degree (preferred) or equivalent experience

Primary Responsibilities
- Act as a technical expert on the development of one or more applications, including designing and developing robust, scalable platforms that enable transformation of data into a useful format for analysis, enhance data flow, and enable efficient consumption and analysis of data.
- Partner with enterprise teams to identify and deploy efficient hosting environments.
- Research and evaluate technical solutions consistent with DTCC technology standards.
- Contribute expertise to the design of components or individual programs, and participate in unit and functional testing.
- Collaborate with teams across the software development lifecycle, including those responsible for testing, troubleshooting, operations, and production support.
- Align risk and control processes into day-to-day responsibilities to monitor and mitigate risk; escalate appropriately.
- Write complex, performance-optimal SQL queries against Snowflake.
- Convert logical data models to physical data models, DDL, roles, and views, and enhance them as required.
- Participate in daily scrums, project-related meetings, backlog grooming, sprint planning, and retrospective sessions.
- Ensure operational readiness of the services and meet commitments to our customers regarding reliability, availability, and performance.
- Be responsible for the technical quality of projects by ensuring that key technical procedures, standards, quality-control mechanisms, and tools are properly used, including performing root-cause analyses for technical problems and conducting quality reviews.
- Work across functions and across teams; we don't only work on code that we own, we work with other parts of the organization on the successful delivery of data products every day.

Talents Needed For Success
We recognize that expertise in software development can be gained through many different paths. Below are the key skills we value for this role; not all are required, but the ones you bring should be demonstrated at an exceptional level to succeed in this position.
- Application development in Java and related technologies: Java, J2EE, Spring (Boot, Batch, Core, MVC, JDBC), JUnit, AWS SDKs; and/or Python, Polars/Pandas, Snowpark, NumPy, SciPy, AWS SDKs, pytest.
- Static analyzers such as Sonar/Fortify, with gating for code quality.
- Hands-on experience with database architecture, import/export, performance techniques, data modeling, database table design, and writing complex SQL queries.
- Solid understanding of Unix/Linux OS, including shell scripting, Perl, and/or Python.
- Solid understanding of Agile, CI/CD, Jenkins, and DevOps practices and tools like Maven, Jenkins, Nexus, Fortify, Liquibase, etc.
- Exposure to design and architecture is a plus.
- Strong analytical and interpersonal skills.
- Experience working with a geographically separated (onshore + offshore) team.
- Understanding of the Agile development process and commitment to delivering assignments as planned and agreed.
- Ability to collaborate effectively with other developers and co-workers, including distributed team members.
- Strong communication skills, a desire to learn and contribute, a self-starter, and a phenomenal teammate.

Nice To Have
- Proven background in database concepts: data management, governance, modeling, and development.
- Snowflake architecture, SnowSQL, Snowpark, Snowpipe, tasks, streams, dynamic tables, time travel, the optimizer, data sharing, and stored procedures.
- Design patterns in Java/Python; cloud design patterns.
- Time-series analysis for financial data.
- Experience with BI tools such as QuickSight, Looker, or Power BI.
- Familiarity with container technologies like Docker, Kubernetes, or OpenShift.
- AWS experience.
- Excellent oral and written English.

Actual salary is determined based on the role, location, individual experience, skills, and other considerations. We are an equal opportunity employer and value diversity at our company. We do not discriminate on the basis of race, religion, color, national origin, sex, gender, gender expression, sexual orientation, age, marital status, veteran status, or disability status. We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.

About Us
With over 50 years of experience, DTCC is the premier post-trade market infrastructure for the global financial services industry. From 20 locations around the world, DTCC, through its subsidiaries, automates, centralizes, and standardizes the processing of financial transactions, mitigating risk, increasing transparency, enhancing performance, and driving efficiency for thousands of broker-dealers, custodian banks, and asset managers. Industry owned and governed, the firm innovates purposefully, simplifying the complexities of clearing, settlement, asset servicing, transaction processing, trade reporting, and data services across asset classes, bringing enhanced resilience and soundness to existing financial markets while advancing the digital asset ecosystem. In 2024, DTCC's subsidiaries processed securities transactions valued at US $3.7 quadrillion, and its depository subsidiary provided custody and asset servicing for securities issues from over 150 countries and territories valued at US $99 trillion. DTCC's Global Trade Repository service, through locally registered, licensed, or approved trade repositories, processes more than 25 billion messages annually. To learn more, please visit us at www.dtcc.com or connect with us on LinkedIn, X, YouTube, Facebook, and Instagram.

DTCC proudly supports flexible work arrangements favoring openness, and gives people the freedom to do their jobs well by encouraging diverse opinions and emphasizing teamwork. When you join our team, you'll have an opportunity to make meaningful contributions at a company that is recognized as a thought leader in both the financial services and technology industries. A DTCC career is more than a good way to earn a living. It's the chance to make a difference at a company that's truly one of a kind. Learn more about Clearance and Settlement by clicking here.

About The Team
IT Architecture and Enterprise Services are responsible for enabling the digital transformation of DTCC. The group manages the complexity of the technology landscape within DTCC and enhances the agility, robustness, and security of the technology footprint. It does so by serving as the focal point for all technology architectural activities in the organization, as well as by engineering a portfolio of foundational technology assets to enable our digital transformation.
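As a small illustration of the Snowflake-plus-Python stack this role names, the sketch below opens a Snowpark session and runs a windowed SQL query; the connection parameters and table are placeholders, not DTCC systems.

```python
# A minimal sketch of querying Snowflake from Python via Snowpark; all
# connection parameters and the table name are placeholders.
from snowflake.snowpark import Session

params = {
    "account": "my_account",     # placeholder credentials
    "user": "my_user",
    "password": "...",
    "warehouse": "ANALYTICS_WH",
    "database": "PRODUCTS",
    "schema": "PUBLIC",
}

session = Session.builder.configs(params).create()

# Window functions are a typical ingredient of the complex, performance-
# optimal SQL the role calls for; here, a 7-row moving average per product.
rows = session.sql("""
    SELECT product_id,
           trade_date,
           AVG(price) OVER (PARTITION BY product_id
                            ORDER BY trade_date
                            ROWS BETWEEN 6 PRECEDING AND CURRENT ROW) AS avg_7d
    FROM daily_prices
""").collect()
session.close()
```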

Posted 2 weeks ago

Apply

0 years

0 Lacs

India

On-site

We’re looking for a hands-on Business Analyst who’s comfortable working with data, people, and a bit of ambiguity. The role is about understanding business problems, diving into numbers, and helping teams make better decisions with clear, actionable insights. You’ll work closely with different teams to figure out what’s needed, identify gaps, and help shape solutions. It’s a mix of asking the right questions, pulling the right data, and presenting it in a way that actually moves things forward. What you’ll mostly be doing is analyzing datasets, building reports or dashboards, and communicating what the numbers really mean. You’ll also play a key role in improving internal processes, documenting what’s working, and suggesting better ways to get things done. We’re not too focused on degrees; what matters is how you think and whether you can get things done. You should be comfortable with Excel or Google Sheets, know your way around SQL, and have worked with at least one BI tool like Power BI, Tableau, or Looker. If you’ve used Notion, Confluence, or similar documentation tools, even better. Basic Python or R skills for cleaning or automating data tasks are a plus, but not a must-have. Experience with CRMs, ERPs, or working in Agile teams would be helpful, but we’re open as long as you’re curious and proactive. If this sounds like a good fit, send over something you’ve worked on: a dashboard, a case study, or even a quick write-up of how you solved a problem using data. We care more about how you approach problems than what’s on your resume.
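For the "basic Python for cleaning or automating data tasks" mentioned above, a minimal pandas sketch follows; the messy input is invented for illustration.

```python
# A minimal sketch of a routine data-cleaning task; the messy input
# (stray whitespace, a null, a duplicate) is invented for illustration.
import pandas as pd

raw = pd.DataFrame({
    "Customer Name": ["  Asha ", "Ravi", "Ravi", None],
    "Signup Date": ["2025-01-05", "2025-01-06", "2025-01-06", "2025-02-01"],
})

clean = (
    raw.dropna(subset=["Customer Name"])                       # drop null names
       .assign(customer_name=lambda d: d["Customer Name"].str.strip(),
               signup_date=lambda d: pd.to_datetime(d["Signup Date"]))
       .drop(columns=["Customer Name", "Signup Date"])
       .drop_duplicates()                                      # remove repeats
)
print(clean)
```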

Posted 2 weeks ago

Apply

5.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

At myKaarma, we’re not just leading the way in fixed ops solutions for the automotive industry—we’re redefining what’s possible for dealership service centers. Headquartered in Long Beach, California, and powered by a global team, our industry-leading SaaS platform combines communication, scheduling, and payment tools in one seamless solution that keeps dealerships and vehicle owners connected. With myKaarma, every service interaction flows effortlessly, bringing good karma to customers and service teams. Rooted in the principles of the Toyota Production System, we operate with precision, efficiency, and a relentless focus on continuous improvement to deliver a better experience for all. We’re looking for innovators, problem-solvers, and tech enthusiasts passionate about building solutions that people love to use. If you’re ready to make an impact in an industry ripe for change, join us at myKaarma and help shape the future of automotive service. Role Description We are building a modern data lake architecture centered around BigQuery and Looker, and we’re looking for a hands-on Looker Data Engineer/Architect to help us shape and scale our data platform. In this role, you’ll own the design and implementation of Looker Views, Explores, and Dashboards, working closely with data stakeholders to ensure accurate, efficient, and business-relevant insights. You’ll play a critical role in modelling our existing data architecture into LookML, and driving modelling and visualization best practices across the organization. This will also include reviewing our existing data lake models and identifying inefficiencies/areas of improvement. This role also offers the opportunity to integrate AI/ML in our data lake and provide intelligent insights and recommendations to our internal as well as external customers. Key Responsibilities Design and develop LookML models, views, and explores based on our legacy data warehouse in MariaDB Create and maintain high-quality dashboards and visualizations in Looker that deliver actionable business insights Collaborate with engineers, product managers, and business stakeholders to gather requirements and translate them into scalable data models Guide other engineers and non-technical staff on how to build and maintain Looker dashboards and models. 
Ensure data accuracy, performance, and efficiency across our Looker and BigQuery resources Maintain strong ownership over the Looker platform, proactively improving structure, documentation, and data usability Monitor and troubleshoot data issues in Looker and BigQuery Required Skills And Qualifications 5+ years of experience in data engineering and 2+ years of hands-on experience with Looker, including LookML modeling and dashboard development Strong experience with Google BigQuery, including writing and optimizing complex SQL queries, and managing BigQuery costs Experience with building and maintaining projects in Google Cloud Experience implementing row-level security, access controls, or data governance in Looker Proven ability to manage and own end-to-end Looker projects with minimal supervision Experience with source control systems, preferably git Excellent communication skills and a strong sense of ownership and accountability Comfortable working in a fast-paced, collaborative environment Nice To Have Skills & Qualifications Familiarity with batch processing, stream processing and real-time analytics Familiarity with MySQL queries and syntax Being able to understand and write java code We value diverse experiences and backgrounds, so we encourage you to apply if you meet some but not all of the listed qualifications. Total Rewards at myKaarma Benefits At myKaarma, we offer a comprehensive Total Rewards package that extends beyond the base salary. Our commitment to competitive compensation includes bonuses and benefits that support both personal and professional well-being: Flexible Work Environment: We embrace a high-performance, flexible structure that values freedom and responsibility. Our “Highly Aligned, Loosely Coupled” model empowers teams to innovate and continuously improve using data-driven insights. Health and Wellness: Comprehensive medical, life, and disability benefits. Time Off: Generous vacation time to recharge and balance life outside work. In-Office Perks: Work in an agile office space with perks like ping pong and foosball to unwind and connect and unlimited lunch, snacks or refreshments onsite. Our Commitment to Inclusion At myKaarma, diverse perspectives drive innovation and success. We are committed to creating a safe, welcoming, and inclusive workplace where every employee feels valued and empowered and can do meaningful work. Our mission to deliver exceptional solutions to our clients is strengthened by the unique contributions and perspectives of our team members from all backgrounds. As an equal opportunity employer, myKaarma prohibits any form of unlawful discrimination or harassment based on race, color, religion, gender, gender identity, gender expression, sexual orientation, national origin, family or parental status, disability, age, veteran status, or any other status protected by applicable laws in the regions where we operate. We adhere to all EEOC regulations and actively promote an environment that celebrates and supports diversity, equity, and inclusion for all. Applicants with disabilities may be entitled to reasonable accommodation under the terms of the Americans with Disabilities Act and certain state or local laws. Reasonable accommodation is a change in the way things are normally done, which will ensure an equal employment opportunity without imposing undue hardship on myKaarma. Please let us know if you require reasonable accommodations during the application or interview process by filling out this form. 
myKaarma participates in the E-Verify Program.

Posted 2 weeks ago

Apply

3.0 - 7.0 years

0 Lacs

Pune, Maharashtra

On-site

About Birlasoft: Birlasoft is a global leader in Cloud, AI, and Digital technologies, leveraging domain expertise to provide innovative enterprise solutions. With a consultative and design-thinking approach, Birlasoft empowers societies worldwide, enhancing business efficiency and productivity. As part of the multibillion-dollar diversified CKA Birla Group, Birlasoft, comprising 12,000+ professionals, is dedicated to upholding the Group's 170-year legacy of fostering sustainable communities.

Job Title: AWS Redshift Expert

Role Overview: We are looking for a highly skilled AWS Redshift Expert to join our data engineering team. This role is pivotal in supporting our AWS ProServe engagements and internal analytics initiatives. The ideal candidate should possess in-depth knowledge of Redshift architecture, performance tuning, and integration with BI tools like Looker. You will collaborate closely with cross-functional teams, including AWS tech leads, data analysts, and client stakeholders, to ensure the development of scalable, secure, and high-performing data solutions.

Key Responsibilities:
- Design, deploy, and manage AWS Redshift clusters for large-scale data warehousing.
- Optimize query performance using DISTKEY, SORTKEY, and materialized views (see the sketch below).
- Work with BI teams to enhance LookML models and boost dashboard performance.
- Conduct performance benchmarking and establish automated alerts for performance degradation.
- Lead data migration projects from platforms like BigQuery to Redshift.
- Ensure the implementation of data security, compliance, and backup/recovery protocols.
- Provide technical leadership in client interviews and solution discussions.

Required Skills & Experience:
- Minimum of 5 years of experience in data engineering, with at least 3 years specializing in AWS Redshift.
- Hands-on expertise in Redshift performance tuning and workload management.
- Familiarity with BI tools such as Looker, Power BI, and semantic layer optimization.
- Proficiency in cloud architecture and AWS services like EC2, S3, IAM, and VPC.
- Excellent communication skills for effective interaction with clients and internal leadership.
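To illustrate the DISTKEY/SORTKEY tuning mentioned above, here is a minimal sketch in Python using psycopg2, one common way to issue DDL against Redshift. The cluster endpoint, credentials, and table definition are all hypothetical, and a real deployment would source credentials from IAM or Secrets Manager.

```python
# Hypothetical sketch: choosing a distribution key and sort key in Redshift DDL.
import psycopg2

DDL = """
CREATE TABLE IF NOT EXISTS fact_orders (
    order_id    BIGINT,
    customer_id BIGINT,
    order_ts    TIMESTAMP,
    amount      DECIMAL(12, 2)
)
DISTSTYLE KEY
DISTKEY (customer_id)  -- co-locate rows that join on customer_id
SORTKEY (order_ts);    -- enable range-restricted scans on recent data
"""

# Placeholder connection details; never hard-code real credentials.
conn = psycopg2.connect(host="example-cluster.abc123.us-east-1.redshift.amazonaws.com",
                        port=5439, dbname="analytics",
                        user="etl_user", password="change-me")
with conn, conn.cursor() as cur:
    cur.execute(DDL)
conn.close()
```

The key design choice sketched here: distributing on the join column avoids data shuffling across nodes, while sorting on the timestamp lets Redshift skip blocks outside the queried date range.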

Posted 2 weeks ago

Apply

2.0 - 6.0 years

0 Lacs

Karnataka

On-site

As a Senior Frontend Data Visualization Engineer at Bidgely, you will play a crucial role in creating exceptional UI experiences for energy analytics applications. Leveraging your expertise in React.js and Looker, you will develop, optimize, and maintain interactive dashboards and web applications, ensuring seamless production support and deployment while turning data into actionable insights. If you are a problem-solver who thrives in a collaborative environment, we are looking for someone like you.

Your key responsibilities will span Frontend Development & Optimization, Post-Release Monitoring & Performance Analysis, Collaboration & Communication, and Documentation & Release Management. You will develop and maintain high-performance React.js applications, design and optimize Looker dashboards, implement advanced filtering and drill-down capabilities, and ensure cross-browser compatibility and responsiveness. Additionally, you will monitor the performance and stability of deployed applications (see the sketch below), troubleshoot production issues, collaborate with product teams, and provide technical solutions to stakeholders.

To excel in this role, you should have at least 2 years of experience in BI development and data analytics on cloud platforms. Proficiency in React.js and Looker, strong SQL skills, experience with REST APIs, and familiarity with CI/CD pipelines are essential. You should also possess excellent collaboration, communication, and problem-solving skills, along with a strong understanding of non-functional requirements related to security, performance, and scale. Experience with Git, Confluence, and Notion for version control and documentation is preferred.

In return, Bidgely offers growth potential with a startup, a collaborative environment, unique tools for your role, group health insurance, internet/telephone reimbursement, a professional development allowance, gratuity, mentorship programs from industry experts, and flexible work arrangements. Bidgely is an equal-opportunity employer that values diversity and equal opportunity. Your hiring will be based on your skills, talent, and passion, without any bias toward your background, gender, race, or age. Join us in building a better future and a better workforce at Bidgely, an E-Verify employer.
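As one hedged illustration of the post-release monitoring mentioned above, the official looker-sdk Python package can inventory the dashboards a release touches. This sketch assumes API credentials in a standard looker.ini file or environment variables; it is not Bidgely's actual tooling.

```python
# Hypothetical sketch: list Looker dashboards as a baseline for release monitoring.
# Assumes the looker-sdk package with credentials in looker.ini or env vars.
import looker_sdk

sdk = looker_sdk.init40()  # Looker API 4.0 client
for dash in sdk.all_dashboards(fields="id,title"):
    print(dash.id, dash.title)
```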

Posted 2 weeks ago

Apply

6.0 - 10.0 years

0 - 0 Lacs

Maharashtra

On-site

Job Description

About the Job: WonDRx (pronounced as Wonder-Rx) is an innovative and disruptive technology platform in healthcare, aiming to connect patients, doctors, and the entire healthcare ecosystem on a single platform. We are looking for a Data Analytics and Research Manager (AI-driven) to lead an analytics and insights strategy aligned with our fast-growing product and business goals. This person will manage data pipelines, apply AI/ML models, perform healthcare research, and build a small but high-performing analytics team.

Key Responsibilities:
- Define and lead the data and analytics roadmap.
- Design and manage health data pipelines, dashboards, and KPIs.
- Apply ML/NLP for patient behavior prediction and analytics automation (see the sketch below).
- Conduct market and competitor research to support business strategies.
- Collaborate across teams and present insights to CXOs.
- Mentor a data analytics team, ensuring accuracy and impact.

Tools & Technologies:
- Languages: SQL, Python/R
- AI/ML: scikit-learn, TensorFlow
- BI Tools: Power BI, Tableau, Looker
- Cloud Stack: BigQuery, Snowflake, AWS, Databricks
- GenAI Tools: ChatGPT, Copilot, custom LLMs

Qualifications:
- Bachelor's/Master's in Data Science, Statistics, Engineering, or a related field.
- 6-10 years in analytics, with at least 2+ years in a leadership role.
- Strong business acumen, preferably in healthcare/life sciences.
- Hands-on AI/ML experience.
- Excellent communication and storytelling skills.

Join us to transform the healthcare experience for millions.
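As a small illustration of the patient-behavior prediction listed above, here is a minimal scikit-learn sketch in the spirit of, say, an appointment no-show model. The features and data are synthetic placeholders, not a real clinical dataset.

```python
# Hypothetical sketch: a toy patient-behavior classifier on synthetic data.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
# Imagined features: days_since_last_visit, age_scaled, prior_no_shows.
X = rng.normal(size=(1000, 3))
# Synthetic label loosely driven by two of the features plus noise.
y = (X[:, 0] + 0.5 * X[:, 2] + rng.normal(size=1000) > 0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42)
model = LogisticRegression().fit(X_train, y_train)
print("AUC:", roc_auc_score(y_test, model.predict_proba(X_test)[:, 1]))
```

In practice the hard work is feature engineering from clinical and scheduling data, plus the privacy and compliance handling that healthcare data demands.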

Posted 2 weeks ago

Apply

4.0 - 8.0 years

0 Lacs

Haryana

On-site

As a Talent Acquisition Executive/Lead at Saarthee, you will play a pivotal role in driving talent acquisition strategies to support the company's growth objectives. Your primary responsibility will be to collaborate with the HR department, business leaders, and hiring managers to identify, attract, and hire top talent in the data analytics industry and related fields. If you are passionate about building high-performing teams and have a proven track record in sourcing, hiring, and retaining top talent, this exciting opportunity is for you.

Your key responsibilities will include leading the end-to-end recruitment process for technical roles in Data Engineering, Data Science, and Data Analytics. This will involve assessing candidates' proficiency in programming languages such as Python, Java, and Scala, data pipelines such as ETL and Kafka, cloud platforms including AWS, Azure, and GCP, and big data technologies like Hadoop and Spark. You will design and implement technical assessment processes to ensure candidates meet the high technical standards required for projects, and collaborate with stakeholders to understand the specific technical requirements for each role.

Furthermore, you will be responsible for building and maintaining a robust pipeline of highly qualified candidates using various sourcing techniques and staying updated on industry trends in Data Engineering, Data Science, and Analytics. Your role will also involve implementing strategies to ensure diverse and inclusive hiring practices, focusing on underrepresented groups in technology, and working on talent development and retention initiatives within the company.

To be successful in this role, you should have at least 4 years of experience in talent acquisition, with a strong background in recruiting for Data Engineering, Data Science, and technology roles. You should possess technical knowledge of AI/ML, programming languages like Python, R, and Java, big data technologies such as Hadoop and Spark, cloud platforms like AWS, Azure, and GCP, and analytics tools like Tableau and Power BI. Leadership skills, analytical thinking, excellent communication, and a commitment to excellence are essential qualities for this position.

In addition to technical skills, soft skills such as problem-solving, collaboration, adaptability, attention to detail, continuous learning, and excellent verbal and writing skills are also crucial for success in this role. If you are ready to shape the talent acquisition strategy for a dynamic and innovative company like Saarthee, we encourage you to apply for this role and be a part of our journey toward success.

Posted 2 weeks ago

Apply

4.0 - 8.0 years

0 Lacs

Haryana

On-site

As the Lead Data Consultant at Eucloid, your high energy and collaborative skills will be pivotal in engaging with client stakeholders across various projects within the Marketing Analytics domain. Your primary responsibilities will include partnering with eCommerce and Digital Marketing leaders to provide key marketing insights, ensuring the scalability and effectiveness of marketing solutions, and connecting the dots across multiple data sources to drive analyses that lead to long-term solutions. Additionally, you will develop data-driven hypotheses, enhance campaign performance, and deliver technical solutions tailored to email marketing operations and analytics.

To qualify for this role, you should hold an undergraduate degree in a quantitative discipline from a top-tier institution; an MBA is a desirable asset. A minimum of 4 years of experience in Data Analytics/Marketing Analytics or client-facing roles is required, along with proficiency in SQL, Python, and data visualization tools, and familiarity with various analytics, CRM, attribution, and planning tools.

Your skills will be put to the test as you demonstrate excellent analytical, troubleshooting, decision-making, and time-management abilities with keen attention to detail. You should also show expertise in understanding technology stacks, application dependencies, and effective project management, alongside a proven track record of leadership and team development.

In summary, as the Lead Data Consultant at Eucloid, you will play a crucial role in influencing strategic decisions, driving marketing insights, and delivering scalable technical solutions that enhance campaign performance and automation. Your expertise in data analytics, proficiency in relevant tools, and exceptional communication and leadership skills will be key to navigating complex projects effectively and ensuring impactful outcomes. If you are ready to take on this challenging yet rewarding opportunity, join the Eucloid team and make a significant impact in the world of Marketing Analytics.

Posted 2 weeks ago

Apply

3.0 - 7.0 years

0 Lacs

Pune, Maharashtra

On-site

You are an experienced Data Engineer who will lead the end-to-end migration of the data analytics and reporting environment to Looker at Frequence. Your role will involve designing scalable data models, translating business logic into LookML, and empowering teams across the organization with self-service analytics and actionable insights. You will collaborate closely with stakeholders from data, engineering, and business teams to ensure a smooth transition to Looker and establish best practices for data modeling, governance, and dashboard development.

Your responsibilities will include:
- Leading the migration of existing BI tools, dashboards, and reporting infrastructure to Looker
- Designing, developing, and maintaining scalable LookML data models, dimensions, measures, and explores
- Creating intuitive, actionable, and visually compelling Looker dashboards and reports
- Collaborating with data engineers and analysts to ensure consistency across data sources
- Translating business requirements into technical specifications and LookML implementations
- Optimizing SQL queries and LookML models for performance and scalability (see the sketch below)
- Implementing and managing Looker's security settings, permissions, and user roles in alignment with data governance standards
- Troubleshooting issues and supporting end users in their Looker adoption
- Maintaining version control of LookML projects using Git
- Advocating for best practices in BI development, testing, and documentation

You should have:
- Proven experience with Looker and deep expertise in LookML syntax and functionality
- Hands-on experience building and maintaining LookML data models, explores, dimensions, and measures
- Strong SQL skills, including complex joins, aggregations, and performance tuning
- Experience working with semantic layers and data modeling for analytics
- A solid understanding of data analysis and visualization best practices
- The ability to create clear, concise, and impactful dashboards and visualizations
- Strong problem-solving skills and attention to detail in debugging Looker models and queries
- Familiarity with Looker's security features and data governance principles
- Experience using version control systems, preferably Git
- Excellent communication skills and the ability to work cross-functionally
- Familiarity with modern data warehousing platforms (e.g., Snowflake, BigQuery, Redshift)
- Experience migrating from legacy BI tools (e.g., Tableau, Power BI) to Looker
- Experience working in agile data teams and managing BI projects
- Familiarity with dbt or other data transformation frameworks

At Frequence, you will be part of a dynamic, diverse, innovative, and friendly work environment that values creativity and collaboration. The company embraces differences and believes they drive creativity and innovation. The team consists of individuals from varied backgrounds who are all trail-blazing team players, thinking big and aiming to make a significant impact.

Please note that third-party recruiting agencies will not be involved in this search.
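One concrete way to guard query performance during a migration like this is to dry-run candidate SQL against BigQuery before wiring it into a LookML explore. A minimal sketch follows, assuming BigQuery is the warehouse; the project, dataset, and query are hypothetical.

```python
# Hypothetical sketch: estimate scan cost of a query before it backs an explore.
from google.cloud import bigquery

client = bigquery.Client(project="my-project")  # placeholder project
sql = """
SELECT user_id, COUNT(*) AS orders
FROM `my-project.shop.orders`
GROUP BY user_id
"""
# dry_run validates the SQL and reports bytes scanned without running it.
job_config = bigquery.QueryJobConfig(dry_run=True, use_query_cache=False)
job = client.query(sql, job_config=job_config)
print(f"Would scan {job.total_bytes_processed / 1e9:.2f} GB")
```

Running this in CI against every changed model is one cheap way to catch queries that would scan far more data than expected.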

Posted 2 weeks ago

Apply

5.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

We’re Hiring: SEO & AIO Specialist

Location: Bangalore
Experience: 2–5 Years

We’re looking for a sharp, data-driven SEO Specialist who can go beyond keywords and rankings: someone who understands how search is evolving in the era of AI and voice-first discovery. You’ll work across our client accounts and internal brand to optimize for Google, ChatGPT, Perplexity, Gemini, and beyond.

Key Responsibilities:
• Develop and execute end-to-end SEO strategies (on-page, off-page, technical) to drive organic traffic and conversions.
• Implement AISEO best practices for optimizing content for ChatGPT, Bing Copilot, Gemini, Perplexity, etc.
• Build Answer Engine Optimization (AEO) frameworks to rank for voice queries, featured snippets, and direct answers.
• Optimize for Generative Engine Optimization (GEO): prompt-ready content structures, structured data, and brand visibility in AI-generated responses (see the sketch below).
• Execute AIO strategies, combining SEO with AI tools for auto-generated content, keyword clustering, and entity optimization.
• Conduct regular SEO audits, competitor benchmarking, and algorithm-proof optimization.
• Work closely with design/content/dev teams to ensure site speed, mobile-first structure, and schema markup.
• Track performance via GA4, GSC, Looker Studio, and other reporting dashboards.
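As a small illustration of the structured-data work listed above, here is a sketch in Python that emits schema.org FAQPage JSON-LD, one common tactic for featured snippets and AI-quoted answers. The question and answer text are invented placeholders.

```python
# Hypothetical sketch: emit schema.org FAQPage JSON-LD for embedding in a page.
import json

faq = {
    "@context": "https://schema.org",
    "@type": "FAQPage",
    "mainEntity": [{
        "@type": "Question",
        "name": "What is Answer Engine Optimization?",
        "acceptedAnswer": {
            "@type": "Answer",
            "text": "Structuring content so search and AI engines can quote it directly.",
        },
    }],
}
# The script tag is what actually goes into the page's <head> or <body>.
print(f'<script type="application/ld+json">{json.dumps(faq, indent=2)}</script>')
```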

Posted 2 weeks ago

Apply

5.0 years

0 Lacs

Bengaluru, Karnataka, India

Remote

Job Description

Sr Data Engineer (Python) - Bangalore

We are seeking an experienced Senior Data Engineer with a strong background in business intelligence and data engineering, and a passion for turning complex data into valuable insights that drive business decisions. If you are interested in applying AI and have experience mentoring junior members and collaborating with peers, we invite you to join our Bangalore office as a Senior Data Engineer.

The Opportunity

As a Senior Data Engineer, you'll develop data pipeline solutions that address business data needs, which requires an understanding of the business context as well as the technical skills to create reliable solutions. Beyond data pipelines, this role calls for a data storyteller with exposure to building and using AI. You'll also mentor junior members and collaborate with peers.

What You'll Do

Engineering - design, implement, and maintain:
- Structured data models, typically in cloud databases
- Semi-structured data models on storage buckets
- Python and SQL for collecting, enriching, cleansing, and otherwise transforming data
- Data APIs in Python Flask containers (see the sketch below)
- AI-assisted analytics and AI-accelerated development
- Data visualizations and dashboards using Tableau
- Optimizations for cost and performance
- Infrastructure as code (Terraform)
- Automated deployment processes on Jenkins or GitHub Actions

Technical Specification
- Assist business analysts with collecting stakeholders' requirements.
- Translate business requirements into detailed technical specifications.
- Create as-built documentation and runbooks for operations.

Continuously Learn
- Stay updated on the latest technical advancements, especially within GenAI.
- Recommend changes based on advancements in data engineering and AI.
- Embrace change and collaborate with team members by sharing and learning knowledge.

What You'll Bring

Experience:
- 5+ years of experience in data engineering, with a strong focus on Python programming, data pipeline development, and API design
- Proven experience working with container technologies such as Docker
- Hands-on experience leveraging AI
- Proficiency in SQL and experience working with various relational and NoSQL databases
- Strong knowledge of data warehousing concepts, ETL processes, and data modeling techniques
- Excellent problem-solving skills, attention to detail, and the ability to work independently and as part of a team
- Experience with cloud-based (AWS, GCP, Azure) data storage and processing platforms

Bonus Skills:
- GenAI prompt engineering; machine learning (TensorFlow, PyTorch, or similar)
- Knowledge of big data technologies
- Experience with Pandas, spaCy, and NLP libraries
- Familiarity with data visualization tools such as Tableau, Power BI, or Looker
- Experience in agile development methodologies
- Optimization of data pipelines for cost and performance

Communication
- Strong communication and collaboration skills in English, with the ability to work effectively with both technical and non-technical stakeholders
- Experience translating complex ideas into simple examples

Education & Certifications
- Bachelor's degree in computer science, IT, engineering, or a related field
- Relevant certifications in BI, AI, data engineering, or data visualization tools are highly desirable

The Location: This role is based out of The Leela Office, 4th Floor, Airport Road, Kodihalli, Bangalore 560008. Our expectation at this time is that you would work hybrid: from our office on Tuesdays, Wednesdays, and Thursdays, with flexibility to work from home on Mondays and Fridays. Work timing is 2 pm to 11 pm IST (cab pickup and drop available).
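To ground the "Data APIs in Python Flask containers" item above, here is a minimal, hypothetical sketch of such an endpoint. The route name and payload are invented; a real service would query the warehouse and add auth, validation, and error handling.

```python
# Minimal sketch of a containerised data API endpoint using Flask.
from flask import Flask, jsonify

app = Flask(__name__)

@app.route("/metrics/daily-orders")
def daily_orders():
    # Static data keeps the sketch self-contained and runnable;
    # a real implementation would query the warehouse here.
    return jsonify([{"date": "2024-01-01", "orders": 120},
                    {"date": "2024-01-02", "orders": 134}])

if __name__ == "__main__":
    # Bind to 0.0.0.0 so the endpoint is reachable from outside a container.
    app.run(host="0.0.0.0", port=8080)
```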

Posted 2 weeks ago

Apply