
1459 Looker Jobs

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

3.0 - 10.0 years

5 - 18 Lacs

India

On-site

Overview: We are looking for a skilled GCP Data Engineer with 3 to 10 years of hands-on experience in data ingestion, data engineering, data quality, data governance, and cloud data warehouse implementations using GCP data services. The ideal candidate will be responsible for designing and developing data pipelines, participating in architectural discussions, and implementing data solutions in a cloud environment.

Key Responsibilities:
- Collaborate with stakeholders to gather requirements and create high-level and detailed technical designs.
- Develop and maintain data ingestion frameworks and pipelines from various data sources using GCP services.
- Participate in architectural discussions, conduct system analysis, and suggest optimal solutions that are scalable, future-proof, and aligned with business requirements.
- Design data models suitable for both transactional and big data environments, supporting Machine Learning workflows.
- Build and optimize ETL/ELT infrastructure using a variety of data sources and GCP services.
- Develop and implement data and semantic interoperability specifications.
- Work closely with business teams to define and scope requirements.
- Analyze existing systems to identify appropriate data sources and drive continuous improvement.
- Implement and continuously enhance automation processes for data ingestion and data transformation.
- Support DevOps automation efforts to ensure smooth integration and deployment of data pipelines.
- Provide design expertise in Master Data Management (MDM), Data Quality, and Metadata Management.

Skills and Qualifications:
- Overall 3-10 years of hands-on experience as a Data Engineer, with at least 2-3 years of direct GCP Data Engineering experience.
- Strong SQL and Python development skills are mandatory.
- Solid experience in data engineering, working with distributed architectures, ETL/ELT, and big data technologies.
- Demonstrated knowledge of and experience with Google Cloud BigQuery is a must.
- Experience with Dataproc and Dataflow is highly preferred.
- Strong understanding of serverless data warehousing on GCP and familiarity with DWBI modeling frameworks.
- Extensive experience in SQL across various database platforms.
- Experience with any BI tools is also preferred.
- Experience in data mapping and data modeling.
- Familiarity with data analytics tools and best practices.
- Hands-on experience with one or more programming/scripting languages such as Python, JavaScript, Java, R, or UNIX Shell.
- Practical experience with Google Cloud services including but not limited to:
  - BigQuery, Bigtable
  - Cloud Dataflow, Cloud Dataproc
  - Cloud Storage, Pub/Sub
  - Cloud Functions, Cloud Composer
  - Cloud Spanner, Cloud SQL
- Knowledge of modern data mining, cloud computing, and data management tools (such as Hadoop, HDFS, and Spark).
- Familiarity with GCP tools like Looker, Airflow DAGs, Data Studio, App Maker, etc.
- Hands-on experience implementing enterprise-wide cloud data lake and data warehouse solutions on GCP.
- GCP Data Engineer Certification is highly preferred.

Job Type: Full-time
Pay: ₹500,298.14 - ₹1,850,039.92 per year
Benefits: Health insurance
Schedule: Rotational shift
Work Location: In person
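The listing above centers on ETL/ELT pipeline work. As a rough sketch of the extract-transform-load pattern such roles build out, here is a minimal pure-Python version; the record shapes and values are invented for illustration, and an in-memory list stands in for a warehouse such as BigQuery:

```python
# Minimal ETL sketch: extract raw records, transform (clean and cast),
# load into an in-memory "warehouse". In a real GCP pipeline the load
# step would write to BigQuery; a list stands in here.

def extract():
    """Simulate pulling raw rows from a source system (invented data)."""
    return [
        {"id": 1, "amount": "120.50", "region": "south"},
        {"id": 2, "amount": None, "region": "north"},  # bad row
        {"id": 3, "amount": "75.00", "region": "north"},
    ]

def transform(rows):
    """Drop rows with missing amounts and cast string amounts to floats."""
    return [
        {"id": r["id"], "amount": float(r["amount"]), "region": r["region"]}
        for r in rows
        if r["amount"] is not None
    ]

def load(rows, warehouse):
    """Append cleaned rows to the target store; return the count loaded."""
    warehouse.extend(rows)
    return len(rows)

warehouse = []
loaded = load(transform(extract()), warehouse)
```

The same three stages appear in any of the Dataflow or Composer pipelines the posting mentions; only the sources and sinks change.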

Posted 11 hours ago

Apply

4.0 years

0 Lacs

Gurgaon

Remote

Job Title: Senior Digital Analyst (MCI)
Location: Chandigarh, India
Department: Data Analyst
Job Type: Full-Time

About Us: TRU IT is a global leader dedicated to leveraging cutting-edge technology to drive business innovation and growth. We specialize in crafting data-driven digital strategies, optimizing marketing performance, and delivering transformative insights that empower businesses. Our expertise spans multiple industries, combining advanced analytics, digital marketing, and emerging technologies to drive measurable results.

Position Overview: We are seeking an experienced Senior Digital Analyst with a strong background in Marketing Cloud Intelligence (MCI), BigQuery, Snowflake, etc. The ideal candidate will be responsible for managing and analyzing marketing data, optimizing performance, and developing data-driven strategies to enhance business growth. This role requires expertise in MCI (formerly Datorama) for reporting, visualization, and campaign performance tracking.

Job Location and Address: This is a full-time onsite role (no hybrid or remote option) at the following location: Plot No E 275, Industrial Area, Sector 75, Sahibzada Ajit Singh Nagar, Punjab 160071

Responsibilities:
1. Marketing Cloud Intelligence (MCI) & Digital Analytics:
- Manage and optimize MCI dashboards to track marketing performance, campaign effectiveness, and business KPIs.
- Develop custom data models within MCI to aggregate, clean, and transform marketing data from multiple sources.
- Automate data ingestion and transformation pipelines in MCI for seamless reporting.
- Perform advanced marketing analytics to identify trends, improve attribution models, and enhance campaign effectiveness.
2. Data Management & Integration:
- Develop and maintain data architectures using Snowflake, BigQuery, etc.
- Extract, process, and analyze large datasets from multiple marketing platforms.
- Integrate MCI with Google Analytics, Adobe Analytics, and Google Tag Manager to consolidate reporting and drive actionable insights.
- Optimize ETL pipelines to ensure efficient data processing and reporting.
3. Performance Reporting & Business Insights:
- Develop custom dashboards in MCI, Looker Studio, and Excel for marketing performance tracking.
- Analyze multi-channel marketing campaigns (PPC, social, programmatic) and provide optimization recommendations.
- Deliver monthly, quarterly, and ad-hoc reports to key stakeholders on marketing performance and ROI.
- Conduct cohort and segmentation analysis to improve customer retention and acquisition strategies.
4. Collaboration & Strategy:
- Work closely with marketing, product, and data teams to align data-driven insights with business goals.
- Provide recommendations to optimize budget allocation, audience targeting, and media spend efficiency.
- Stay updated on MCI enhancements, industry trends, and new analytics tools.

Requirements:
- Bachelor’s or Master’s degree in Data Science, Computer Science, Marketing, Business Analytics, or a related field.
- 4+ years of experience in digital marketing analytics, business intelligence, and data management.
- Proven expertise in MCI (Marketing Cloud Intelligence/Datorama), including dashboard development and data transformations.
- Strong hands-on experience with Snowflake, BigQuery, and SQL.
- Experience in Adobe Analytics, Google Analytics (GA4), etc.
- Experience in ETL processes, API integrations, and marketing data automation.
- Strong analytical, problem-solving, and communication skills.
- Ability to work in a fast-paced environment, managing multiple projects and deadlines.

What We Offer:
- Competitive salary and benefits package.
- Opportunities for professional growth and development.
- A collaborative and innovative work environment.
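The responsibilities above include cohort and segmentation analysis. As a small illustration of what a monthly retention cohort looks like in code, here is a pure-Python sketch; the users, field names, and months are all invented, and a real analysis would query the warehouse instead:

```python
# Cohort-retention sketch: group users by signup month, then measure what
# fraction of each cohort was still active in a later month.
from collections import defaultdict

users = [
    {"id": 1, "signup_month": "2025-01", "active_months": {"2025-01", "2025-02"}},
    {"id": 2, "signup_month": "2025-01", "active_months": {"2025-01"}},
    {"id": 3, "signup_month": "2025-02", "active_months": {"2025-02", "2025-03"}},
    {"id": 4, "signup_month": "2025-02", "active_months": {"2025-02", "2025-03"}},
]

def retention(users, month):
    """Share of each signup cohort active in `month`."""
    cohorts = defaultdict(list)
    for u in users:
        cohorts[u["signup_month"]].append(u)
    return {
        cohort: sum(month in u["active_months"] for u in members) / len(members)
        for cohort, members in cohorts.items()
    }

rates = retention(users, "2025-03")
```

The output maps each cohort to its retention rate for the chosen month, which is exactly the shape a dashboard's cohort grid is built from.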

Posted 11 hours ago

Apply

0 years

12 - 20 Lacs

Gurgaon

Remote

Position: GCP Data Engineer

Company Info: Prama (HQ: Chandler, AZ, USA). Prama specializes in AI-powered and Generative AI solutions for Data, Cloud, and APIs. We collaborate with businesses worldwide to develop platforms and AI-powered products that offer valuable insights and drive business growth. Our comprehensive services include architectural assessment, strategy development, and execution to create secure, reliable, and scalable systems. We are experts in creating innovative platforms for various industries and help clients overcome complex business challenges. Our team is dedicated to delivering cutting-edge solutions that elevate the digital experience for corporations. Prama is headquartered in Phoenix with offices in the USA, Canada, Mexico, Brazil, and India.

Location: Bengaluru | Gurugram | Hybrid
Benefits: 5-Day Working | Career Growth | Flexible Working | Potential On-site Opportunity
Kindly send your CV or resume to careers@prama.ai

Primary skills: GCP, PySpark, Python, SQL, ETL

Job Description: We are seeking a highly skilled and motivated GCP Data Engineer to join our team. As a GCP Data Engineer, you will play a crucial role in designing, developing, and maintaining robust data pipelines and data warehousing solutions on the Google Cloud Platform (GCP). You will work closely with data analysts, data scientists, and other stakeholders to ensure the efficient collection, transformation, and analysis of large datasets.

Responsibilities:
- Design, develop, and maintain scalable data pipelines using GCP tools such as Dataflow, Dataproc, and Cloud Functions.
- Implement ETL processes to extract, transform, and load data from various sources into BigQuery.
- Optimize data pipelines for performance, cost-efficiency, and reliability.
- Collaborate with data analysts and data scientists to understand their data needs and translate them into technical solutions.
- Design and implement data warehouses and data marts using BigQuery.
- Model and structure data for optimal performance and query efficiency.
- Develop and maintain data quality checks and monitoring processes.
- Use SQL and Python (PySpark) to analyze large datasets and generate insights.
- Create visualizations using tools like Data Studio or Looker to communicate data findings effectively.
- Manage and maintain GCP resources, including virtual machines, storage, and networking.
- Implement best practices for security, cost optimization, and scalability.
- Automate infrastructure provisioning and management using tools like Terraform.

Qualifications:
- Strong proficiency in SQL, Python, and PySpark.
- Hands-on experience with GCP services, including BigQuery, Dataflow, Dataproc, Cloud Storage, and Cloud Functions.
- Experience with data warehousing concepts and methodologies.
- Understanding of data modeling techniques and best practices.
- Strong analytical and problem-solving skills.
- Excellent communication and collaboration skills.
- Experience with data quality assurance and monitoring.
- Knowledge of cloud security best practices.
- A passion for data and a desire to learn new technologies.

Preferred Qualifications:
- Google Cloud Platform certification.
- Experience with machine learning and AI.
- Knowledge of data streaming technologies (Kafka, Pub/Sub).
- Experience with data visualization tools (Looker, Tableau, Data Studio).

Job Type: Full-time
Pay: ₹1,200,000.00 - ₹2,000,000.00 per year
Benefits: Flexible schedule, health insurance, leave encashment, paid sick time, Provident Fund, work from home
Ability to commute/relocate: Gurugram, Haryana: Reliably commute or planning to relocate before starting work (Required)
Application Question(s): CTC; Expected CTC; Notice Period (days); Experience in GCP; Total Experience
Work Location: Hybrid remote in Gurugram, Haryana
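Among the responsibilities above are data quality checks and monitoring. A minimal sketch of the kind of batch check such a pipeline might run is below; the column names, sample batch, and thresholds are assumptions for illustration, not anything specified in the posting:

```python
# Data-quality sketch: run simple checks (minimum row count, per-column
# null rate) on a batch of records and collect human-readable failures.

def quality_report(rows, required_columns, max_null_rate=0.1, min_rows=1):
    """Return a list of failed checks; empty list means the batch passed."""
    failures = []
    if len(rows) < min_rows:
        failures.append(f"row_count {len(rows)} below minimum {min_rows}")
    for col in required_columns:
        nulls = sum(1 for r in rows if r.get(col) is None)
        rate = nulls / len(rows) if rows else 1.0
        if rate > max_null_rate:
            failures.append(f"{col}: null rate {rate:.0%} exceeds {max_null_rate:.0%}")
    return failures

batch = [
    {"order_id": 1, "amount": 10.0},
    {"order_id": 2, "amount": None},
    {"order_id": 3, "amount": 12.5},
]
issues = quality_report(batch, ["order_id", "amount"])
```

In production the same checks would run against each ingested partition, with failures routed to monitoring rather than returned as a list.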

Posted 11 hours ago

Apply

3.0 years

0 Lacs

Mohali

On-site

We are looking for a proactive, results-driven Performance Marketing Manager who will own campaign success across clients, translate business objectives into marketing outcomes, and lead execution in collaboration with freelancers and in-house teams. This is a strategic, client-facing role ideal for someone who thrives on data, understands platforms deeply, and can manage multiple projects with high accountability.

Responsibilities:

Campaign Strategy & Execution:
- Plan and implement high-performance ad campaigns across Google Ads, Meta Ads, and other relevant platforms (LinkedIn, YouTube, etc.)
- Define full-funnel strategy across TOFU, MOFU, and BOFU
- Set clear goals for lead generation, ROAS, CAC, and lifetime value

Client Relationship & Consultation:
- Directly interface with clients to understand business goals, translate them into media plans, and guide ongoing improvements
- Own onboarding, expectation management, and ongoing reporting
- Regularly advise clients on campaign adjustments, CRO, creative optimization, and revenue opportunities

Performance & Optimization:
- Track and interpret key performance metrics (CTR, CPC, CVR, CPL, ROAS, etc.)
- Proactively identify and resolve performance bottlenecks
- Recommend A/B tests across ads, landing pages, and funnel structure

Cross-Functional Leadership:
- Collaborate with content, design, and development teams to execute strategy
- Manage and coordinate freelancers or junior resources to ensure campaign quality
- Set up internal SOPs and scalable workflows for campaign operations

Platform Expertise:
- Stay current on platform algorithm changes and apply best practices
- Handle pixel tracking, attribution, audience targeting, and retargeting strategies
- Implement conversion tracking across channels and troubleshoot discrepancies

Requirements:

Must-Haves:
- 3-5 years of experience in performance marketing with direct exposure to Google Ads and Meta Ads
- Proven success managing multiple client campaigns simultaneously
- Strong understanding of ad platforms, audience segmentation, tracking, and optimization
- Experience working in or with service businesses, B2B, or ecommerce (Shopify experience is a plus)
- Ability to independently manage strategy, client communication, and execution oversight

Technical & Analytical:
- Comfortable reading and interpreting analytics tools (GA4, Meta Ads Manager, Google Ads, Looker Studio)
- Hands-on knowledge of Pixel setup, UTM tracking, and attribution models
- Skilled in A/B testing, performance forecasting, and using performance data to make decisions

Soft Skills:
- Strong communication and stakeholder management skills
- Able to take ownership and drive projects end-to-end with minimal supervision
- Strategic mindset and problem-solving orientation

Bonus:
- Knowledge of CRO frameworks and tools (Hotjar, Clarity, etc.)
- Familiarity with AI tools, automation workflows, or marketing integrations (e.g., HubSpot, Zapier)
- Google Ads and/or Meta Blueprint certifications

Job Types: Full-time, Permanent
Pay: From ₹50,000.00 per month
Benefits: Health insurance, Provident Fund
Schedule: Day shift
Work Location: In person
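The KPIs the listing tracks (CTR, CPC, CVR, CPL, ROAS) are simple ratios over campaign totals. A quick sketch, using made-up campaign figures rather than anything from the posting:

```python
# Core paid-media ratios computed from raw campaign totals.

def campaign_metrics(impressions, clicks, leads, spend, revenue):
    return {
        "CTR": clicks / impressions,   # click-through rate
        "CPC": spend / clicks,         # cost per click
        "CVR": leads / clicks,         # conversion rate (click -> lead)
        "CPL": spend / leads,          # cost per lead
        "ROAS": revenue / spend,       # return on ad spend
    }

# Invented example: 100k impressions, 2k clicks, 100 leads,
# 50,000 spent, 200,000 in attributed revenue.
m = campaign_metrics(impressions=100_000, clicks=2_000, leads=100,
                     spend=50_000.0, revenue=200_000.0)
```

With these numbers the campaign runs a 2% CTR, a 5% click-to-lead conversion rate, and a 4x return on spend; the optimization work described above is about moving each of these ratios in the right direction.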

Posted 11 hours ago

Apply

1.0 years

0 - 0 Lacs

India

On-site

Job Description: We are seeking a Google Workspace & AI Tools Specialist who is proficient in utilizing advanced features of Google Workspace tools and modern AI platforms to optimize workflows, automate processes, and build smart solutions. This role is ideal for a highly adaptable and tech-savvy individual who thrives in a fast-paced, problem-solving environment and is eager to learn and implement cutting-edge technologies.

Key Responsibilities:
- Develop and manage complex Google Sheets using advanced formulas, data validation, pivot tables, and conditional formatting.
- Work with Google Docs for template creation, dynamic documentation, and collaboration.
- Have basic knowledge of Looker Studio.
- Create and maintain Google Apps Script for automation and custom tool development.
- Utilize Gmail efficiently for filters, templates, integrations, and productivity enhancements.
- Advanced Excel skills: handle datasets with advanced Excel techniques such as VLOOKUP/XLOOKUP, INDEX-MATCH, PivotTables, and Macros.
- AI & automation: create effective AI prompts to leverage tools like ChatGPT, Gemini, or similar for automation, research, content generation, and problem-solving; stay updated with the latest AI tools and identify areas to integrate AI for increased efficiency.
- Website & content management: manage or support basic website creation using Google Sites or similar tools.
- Learning & development: explore and adopt new technologies quickly with minimal guidance.

Requirements:
- Proven experience working with Google Workspace at an advanced level
- Strong knowledge of advanced Excel and analytical tools
- Experience in Looker Studio and Google Apps Script (JavaScript knowledge is a plus)
- Familiarity with using AI platforms like ChatGPT, Gemini, etc., to solve problems creatively
- Good written and verbal communication skills
- Knowledge of NotebookLM preferable
- Problem-solving mindset and proactive attitude
- Bachelor's degree in any discipline (IT/Computer Science/Data Analytics preferred but not mandatory)

Preferred Qualifications:
- Knowledge of automation platforms like Zapier or Make.
- Knowledge of APIs and data integration preferable.

Job Types: Full-time, Permanent
Pay: ₹20,000.00 - ₹35,000.00 per month
Benefits: Leave encashment
Schedule: Day shift, fixed shift, Monday to Friday
Supplemental Pay: Yearly bonus
Ability to commute/relocate: Andheri East, Mumbai, Maharashtra: Reliably commute or planning to relocate before starting work (Preferred)
Experience: Management Information System: 1 year (Required)
Work Location: In person
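The lookup formulas this role leans on (VLOOKUP/XLOOKUP, INDEX-MATCH) all implement the same exact-match idea: find the first row whose key matches, return another column from it, or a fallback. A pure-Python analogue of that semantics, with an invented price table:

```python
# XLOOKUP-style exact match: return the first matching row's result
# column, or `if_not_found` when no key matches. Roughly mirrors
# =XLOOKUP(value, key_range, result_range, "not found") in Sheets/Excel.

def xlookup(value, rows, key, result, if_not_found=None):
    for row in rows:
        if row[key] == value:
            return row[result]
    return if_not_found

price_table = [
    {"sku": "A-100", "price": 250},
    {"sku": "B-200", "price": 400},
]
p = xlookup("B-200", price_table, key="sku", result="price")
missing = xlookup("C-300", price_table, key="sku", result="price",
                  if_not_found="not found")
```

The fallback argument is what distinguishes XLOOKUP from classic VLOOKUP, which returns an error when the key is absent.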

Posted 11 hours ago

Apply

0 years

0 - 0 Lacs

Mumbai

On-site

JD for MIS Executive

Job Summary: We are seeking a detail-oriented and proactive MIS Executive to join our team. The ideal candidate will have experience in managing and analyzing data, generating reports, and providing insights to support business decision-making. You will be responsible for maintaining and improving our management information systems to ensure data accuracy and efficiency.

Key Responsibilities:
- Develop and manage MIS reports and dashboards, providing accurate and timely data to management and other stakeholders.
- Analyze complex data sets to identify trends, patterns, and insights that support strategic business decisions.
- Ensure the accuracy and integrity of data across various systems and databases.
- Collaborate with different departments to understand their reporting needs and develop customized solutions.
- Assist in the implementation and maintenance of MIS software and tools.
- Monitor system performance and troubleshoot issues as needed.
- Prepare and present reports and presentations to senior management.
- Stay updated with industry trends and best practices in MIS and data management.

Preferred Skills:
- A data management executive who is good with Excel and Google Sheets.
- Knowledge of VLOOKUP, Macros, and Pivot tables is desirable.
- A background in mathematics will help.
- Proven experience as an MIS Executive or in a similar role.
- Knowledge of Looker Studio.

Preference: BCI candidates only who are well trained in the Rahul Jain course; Mumbai-based candidates only.

Job Types: Full-time, Permanent
Pay: ₹40,000.00 - ₹50,000.00 per month
Application Question(s): How many years of total relevant experience do you have? What is your current CTC? What is your expected CTC? What is your notice period? Are you based out of Mumbai? Have you done the Rahul Jain course training?
Work Location: In person

Posted 11 hours ago

Apply

0 years

0 Lacs

Chennai

Remote

Chennai, India | Hyderabad, India | Bangalore, India
Job ID: R-1077091
Apply prior to the end date: June 28th, 2025

When you join Verizon: You want more out of a career. A place to share your ideas freely — even if they’re daring or different. Where the true you can learn, grow, and thrive. At Verizon, we power and empower how people live, work and play by connecting them to what brings them joy. We do what we love — driving innovation, creativity, and impact in the world. Our V Team is a community of people who anticipate, lead, and believe that listening is where learning begins. In crisis and in celebration, we come together — lifting our communities and building trust in how we show up, everywhere & always. Want in? Join the #VTeamLife.

What you’ll be doing...

Responsibilities:
- Publishing various insights and inferences for technical and senior leadership to make informed decisions.
- Collecting, processing, and performing statistical analysis on large datasets to discover useful information, suggest conclusions, and support decision-making.
- Identifying, defining, and scoping moderately complex data analytics problems in the Enterprise Cyber Security domain.
- Developing cross-domain strategies for increased network security and resiliency of critical infrastructure, working with researchers in other disciplines.
- Designing, developing, and maintaining applications and databases by evaluating business needs, analyzing requirements, and developing software systems.
- Researching, developing, designing, and implementing machine learning algorithms for cyber threat detection in Enterprise Security and IAM functions, and transforming data points into objectives.
- Executing the full software development life cycle (SDLC): concept, design, build, deploy, test, release, and support.
- Managing daily activities including, but not limited to, attending project calls to groom new user stories, acting as a liaison between business and technical teams, collecting, organizing, and interpreting data using statistical tools, developing user interface components using programming languages, and applying visualization techniques.
- Owning all aspects of a project, from analysis and testing through implementation and post-launch support.

What we’re looking for...
- Experience with SQL Server, Teradata, or DB2 databases.
- Experience with advanced analytics using R or Python for data analysis.
- Fundamental knowledge of and/or experience applying algorithms in one or more of the following Machine Learning areas: anomaly detection, one/few-shot learning, deep learning, unsupervised feature learning, ensemble methods, probabilistic graphical models, and/or reinforcement learning.
- Experience with visualization software like Tableau, Qlik, Looker, or ThoughtSpot to tell data-driven stories to business users at all levels.
- Broad knowledge of IT security, including endpoint, network, and cloud security.
- Developing user interface components and implementing them following well-known React.js workflows (such as Flux or Redux), ensuring that these components and the overall application are robust and easy to maintain, and coordinating with the rest of the team working on different layers of the infrastructure. Duties include designing software solutions to meet project requirements, maintaining and refactoring existing code, writing tests, and fixing bugs.
- Ability to communicate comprehensive knowledge effectively across multi-disciplinary teams and to non-cyber experts, as well as the interpersonal skills necessary to collaborate effectively in a team environment.
- Following appropriate systems life cycle methodologies, Agile and Waterfall, for quality and maintainability, and communicating status to IT management.
- Staying abreast of changes and advances in data warehousing technology; playing detective as you dig deep into the data warehouse to confirm whether new data requirements are already available for the business to access and, if not, how the new data will fit in, be ingested, and be exposed in a usable manner.

You’ll need to have...
- Bachelor's degree with two or more years of work experience.
- Two or more years of professional experience in data analytics, business analysis, or a comparable analytics position.
- Ability to write SQL against a relational database in order to analyze and test data.
- Two or more years of professional experience working in the IT security domain.
- Familiarity with RESTful APIs.
- Experience with popular React.js workflows (such as Flux or Redux).
- Exposure to Threat, Risk, and Vulnerability Management is an added advantage.
- Familiarity with application dev.

Even better if you have one or more of the following:
- Bachelor's degree in Computer Science/Information Systems or an equivalent combination of education and work experience.
- Strong verbal and written communication skills.
- Ability to work in a team environment.
- Familiarity with modern front-end build pipelines and tools.
- Knowledge of modern authorization mechanisms, such as JSON Web Tokens.

When you join Verizon, you’ll be doing work that matters alongside other talented people, transforming the way people, businesses, and things connect with each other. Beyond powering America’s fastest and most reliable network, we’re leading the way in broadband, cloud and security solutions, the Internet of Things, and innovating in areas such as video entertainment. Of course, we will offer you great pay and benefits, but we’re about more than that. Verizon is a place where you can craft your own path to greatness. Whether you think in code, words, pictures or numbers, find your future at Verizon.
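The listing above asks for applied ML in areas like anomaly detection for cyber threat hunting. The simplest instance of that family is a z-score detector; here is a minimal sketch in plain Python, where the sample readings and the threshold are illustrative assumptions rather than anything from the posting:

```python
# Z-score anomaly detection: flag values more than `threshold` population
# standard deviations from the mean. A toy stand-in for the detection
# techniques the listing mentions; real pipelines use richer features.
from statistics import mean, pstdev

def zscore_anomalies(values, threshold=3.0):
    mu = mean(values)
    sigma = pstdev(values)
    if sigma == 0:
        return []  # constant series: nothing stands out
    return [v for v in values if abs(v - mu) / sigma > threshold]

# Nine typical readings and one spike (e.g., requests/sec from one host):
readings = [10, 11, 9, 10, 12, 10, 11, 9, 10, 95]
outliers = zscore_anomalies(readings, threshold=2.5)
```

One caveat worth noting: a large outlier inflates the standard deviation it is measured against, which is why the spike here scores only about z = 3 despite being nearly ten times the typical value; robust variants use the median and MAD instead.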
Where you’ll be working: In this hybrid role, you'll have a defined work location that includes work from home and assigned office days set by your manager.

Scheduled Weekly Hours: 40

Equal Employment Opportunity: Verizon is an equal opportunity employer. We evaluate qualified applicants without regard to race, gender, disability or any other legally protected characteristics.

Posted 12 hours ago

Apply

10.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site


Key Responsibilities:
- Lead and architect end-to-end data migrations from on-premise and legacy systems to Snowflake, ensuring optimal performance, scalability, and cost-efficiency.
- Design and develop reusable data ingestion and transformation frameworks using Python.
- Build and optimize real-time ingestion pipelines using Kafka, Snowpipe, and the COPY command.
- Utilize SnowConvert to migrate and optimize legacy ETL and SQL logic for Snowflake.
- Design and implement high-performance Snowflake data models, including materialized views, clustering keys, and result caching strategies.
- Monitor resource usage and implement auto-suspend/auto-resume, query profiling, and cost-control measures to manage compute and storage effectively.
- Drive cost governance initiatives, providing insights into credit usage and optimizing workload distribution.
- Integrate Snowflake with AWS services such as S3, Lambda, Glue, and Step Functions to ensure a robust data ecosystem.
- Mentor junior engineers, enforce best practices in development and code quality, and champion agile data engineering practices.

Required Skills and Experience:
- 10+ years of experience in data engineering with a focus on enterprise ETL and cloud data platforms.
- 4+ years of hands-on experience in Snowflake development and architecture.
- Expertise in advanced Snowflake features such as Snowpark, Streams & Tasks, Secure Data Sharing, Data Masking, and Time Travel.
- Proven ability to architect enterprise-grade Snowflake solutions optimized for performance, governance, and scalability.
- Proficiency in Python for building orchestration tools, automation, and reusable data pipelines.
- Solid knowledge of AWS services, including S3, IAM, Lambda, Glue, and Step Functions.
- Hands-on experience with SnowConvert or similar tools for legacy code conversion.
- Familiarity with real-time data streaming technologies such as Kafka, Kinesis, or other event-based systems.
- Strong SQL skills with proven experience in query tuning, profiling, and performance optimization.
- Deep understanding of legacy ETL tools, with experience in Ab Initio preferred.
- Exposure to CI/CD pipelines, version control systems (e.g., Git), and automated deployment practices.

Preferred Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- Experience in migrating on-premises or mainframe data warehouses to Snowflake.
- Familiarity with BI/analytics tools such as Tableau, Power BI, or Looker.
- Knowledge of data security and compliance best practices, including data masking, RBAC, and OAuth integration.
- Snowflake certifications (Developer, Architect) are a strong plus.
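The cost-governance and auto-suspend duties above come down to simple credit arithmetic: Snowflake bills warehouse credits for time the warehouse is running, so suspending between queries is the main lever. A back-of-the-envelope sketch, where the credits-per-hour table is an assumption for illustration rather than quoted pricing:

```python
# Rough warehouse-credit math. A warehouse accrues credits while active,
# so an aggressive auto-suspend policy directly reduces spend.
# Credits-per-hour figures below are assumed for illustration only.

CREDITS_PER_HOUR = {"XS": 1, "S": 2, "M": 4, "L": 8}

def credits_used(size, active_seconds):
    """Credits consumed by a warehouse of `size` active for that long."""
    return CREDITS_PER_HOUR[size] * active_seconds / 3600

# 1,000 queries/day averaging 3 seconds each on a Medium warehouse:
always_on = credits_used("M", 24 * 3600)    # never suspends: full day billed
right_sized = credits_used("M", 1000 * 3)   # suspends between queries
```

Under these assumptions the always-on warehouse burns 96 credits a day against roughly 3.3 for the suspended one, which is the kind of gap credit-usage dashboards are built to surface. (Real billing also has a minimum charge per resume, so extremely short auto-suspend windows have their own trade-off.)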

Posted 12 hours ago

Apply

5.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site


Job Description At Ford, We move the world Forward; We are the movers of the world and the makers of the future. Every day, we roll up our sleeves and build a better world—together. At Ford, we’re all part of something bigger than ourselves, and we believe in creating data-driven solutions that power the next generation of mobility. Are you ready to change the way the world moves? The Data Engineering Technical Anchor for Integrated Services Data team acts as technical Subject Matter Expert on product functionalities, integrations, and anticipated roadmap and is responsible for designing solutions with architecture, organizing learning events for team, strictly adopting technology with standard application stack for development, leveraging development Security operations to write production ready software, takes complete responsibility of infrastructure required for the applications to work and enforces software craftmanship standards across the teams he/she works for. Responsibilities Requirements Gathering & Prioritization: Elicit and prioritize requirements from stakeholders across the teams, balancing business needs with technical feasibility and resource constraints. This includes actively engaging with data engineers, data scientists, and business users. Product Design & Development: Lead and collaborate with Architects and data engineers to design, develop, and launch new features and improvements to the data ingestion platform. This includes creating detailed product specifications, user stories, and acceptance criteria Deliver Data product as a Technical Anchor and mange cloud Data Engineer engineers, using Object Oriented software design, with Agile/Iterative development using PDO methodologies. Strictly adopts Technology and Architecture standards Work hands-on with the team and other stakeholders to deliver quality data products that meet our customer’s requirements and needs. 
Product Monitoring & Optimization: Monitor platform performance, user adoption, and identify areas for improvement. This includes analyzing usage data from post launch, conducting user surveys, and gathering feedback. Customer Discovery: Regularly engage with POs and PMs to understand their needs, pain points, and expectations. Work with product owners and product managers to define the features Leadership Reviews: Prepare and present regular product technical updates and performance reviews to leadership. Leverages logging tools such as Tekton, FOSSA, SonarQube, Checkmarx, Cycode to support DevOps and debug production issues Foster DevOps CI/CD infrastructure and an Automated Testing mentality and capability. Champion continuous technical improvement for the platform, pursue tech debt opportunities. 5+ Years of experience in guiding and mentoring the teams, grow technical capabilities / expertise and provide guidance to other members on the team Qualifications Bachelor's degree in Computer Science, Engineering, Business Administration, or a related field. Master's degree is a plus. 7+ years of experience in building, testing, and maintaining software applications using SQL, Python or any major programming language. Minimum 5+ years of hands-on experience with cloud-based data platforms (AWS, Azure, GCP). 5+ years of experience in designing, building, maintaining, and using GCP : BigQuery, Cloud Storage, Dataproc, Cloud Run, Artifact Registry, Vault, Secret Manager 3+ years of experience in architecting Data solution in cloud. Deep understanding of data ingestion principles, technologies, and best practices. Hands-on experience is a MUST. Minimum 5+ years of experience in building, configuring, maintaining, and decommissioning the infrastructure (on-prem or cloud). Minimum 3+ years of experience in building and maintaining CI/CD pipelines for automated application deployments using Jenkins or Tekton or any native cloud-based tool. 
DevSecOps scans such as SonarQube, FOSSA, Cycode, and Checkmarx. Experience with various data types (structured, unstructured, real-time, batch). Excellent communication, presentation, and interpersonal skills. Strong analytical and problem-solving skills. Experience with Agile development methodologies; experience with JIRA is a plus. Experience with data visualization (Looker, Tableau) and analytics tools is a plus. Experience guiding and mentoring teams to build production-ready applications.

Posted 12 hours ago

Apply

3.0 years

2 - 4 Lacs

Surat

On-site

Min. 3 to 5 years experience Surat (GJ), India About Us At OptimumBrew, we are a team of listeners, problem-solvers, and digital marketing experts driven by data and fueled by innovation. With a crew of 10+ certified digital marketing professionals, we believe in a process-driven, adaptive, and human-first approach. We value both professional and personal growth, and we’re committed to creating a collaborative, fulfilling work environment. Job Description We are looking for a performance-driven PPC Executive with proven expertise in Google Ads (Mobile App Campaigns). You will own the User Acquisition (UA) strategy across Google Ads, leveraging automation and AI tools to drive scalable and cost-effective mobile app installs. This role is ideal for someone who is data-led, highly analytical, and understands the nuances of mobile user behavior and event-based optimization. Primary Objectives Manage and optimize Google App Campaigns (UAC/ACe) for high-volume installs and in-app actions. Leverage bidding strategies like tCPI, tCPA, Maximize Conversions, and advanced automation tools. Integrate and optimize with platforms like Firebase, GA4, MMPs (AppsFlyer, Adjust). Use automation scripts, APIs, and AI-powered tools for creative testing and campaign efficiency. Handle budgets of $10,000+/month while maintaining strong ROAS and LTV:CPI ratios. Roles & Responsibilities Plan, launch, manage, and scale Google App Campaigns across Search, Display, YouTube, and Play Store. Optimize campaigns for installs, in-app events, retention, CPA, ROAS, and LTV. Build and manage campaign automation using Google Ads Scripts, API, and AI tools. Continuously A/B test creatives (video, static, HTML5), keyword clusters, and bidding strategies. Collaborate with analytics and product teams to ensure accurate tracking via Firebase, GTM, GA4, and SDKs. Use AI tools (e.g., ChatGPT, Midjourney, AdCreative.ai, Copy.ai) for creative development and performance insights. 
Monitor campaign health using LTV:CPI ratios, churn prediction, and event-based ROAS. Build and manage reports in Looker Studio, Google Sheets, Supermetrics, etc. Stay updated on platform changes, GAID deprecation, SKAN, and privacy-first UA trends. Technical Skills You Should Have Google Ads (UAC) – Setup & optimization Firebase & GA4 – Integration & event tracking MMPs (AppsFlyer, Adjust, Branch) – Attribution & reporting GTM, SDKs, Pixels – Conversion tracking Audience segmentation & targeting A/B Testing – Creatives (video, image, text) Data analysis – Excel, Google Ads Reports, Data Studio Bidding strategies – tCPA, tCPI, Max Conversions Google Ads Editor – Bulk operations Basic ASO knowledge – App store performance alignment Key Expertise Deep understanding of Google App Campaigns: creative requirements, bidding models, and in-app event optimization. Proficiency with tools like Google Ads Editor, Firebase, GA4, and app store analytics. Proven success scaling app campaigns with $10k+ budgets. Hands-on experience with automation tools, scripts, APIs, and AI platforms. Qualification Bachelor’s Degree in Computer Science or Computer Engineering, B.Tech (CSE/ IT),BCA, MCA. Graduate in any field Experience 3–5 years of proven experience managing Google App Campaigns with strong performance metrics Benefits 22 Paid Leaves 5 Days Working Good Company Culture Health Insurance Life Insurance Pension Scheme Statutory Benefits (PF & ESIC) Salary on time Yearly Picnic Annual Sports Day Monthly Events Festival Celebrations Call to Recruiter : +91 7984453687
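The campaign-health ratios this listing asks candidates to monitor (ROAS and LTV:CPI) reduce to simple arithmetic. A minimal sketch, with entirely hypothetical campaign figures and field names:

```python
# Illustrative sketch of the ROAS and LTV:CPI metrics named in the listing.
# All numbers and field names are invented for demonstration.

def roas(revenue: float, ad_spend: float) -> float:
    """Return on ad spend: revenue generated per unit of spend."""
    return revenue / ad_spend

def ltv_cpi_ratio(avg_ltv: float, cost_per_install: float) -> float:
    """LTV:CPI ratio; values above 1.0 mean installs pay for themselves."""
    return avg_ltv / cost_per_install

campaign = {"spend": 10_000.0, "revenue": 24_000.0,
            "installs": 8_000, "avg_ltv": 3.75}

cpi = campaign["spend"] / campaign["installs"]  # cost per install: 1.25
print(f"ROAS: {roas(campaign['revenue'], campaign['spend']):.2f}")  # 2.40
print(f"LTV:CPI: {ltv_cpi_ratio(campaign['avg_ltv'], cpi):.2f}")    # 3.00
```

In practice these inputs would come from MMP and Google Ads exports rather than hard-coded values.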

Posted 12 hours ago

Apply

4.0 - 6.0 years

0 - 0 Lacs

Noida

On-site

Job Title: Digital Marketing Manager (SEO, PPC & SMM Expert) Location: Noida Department: Digital Marketing & Performance Advertising Experience: 4–6 Years About the Role: The ideal candidate will possess hands-on experience across search engines (Google, Bing, Yahoo), social platforms (Meta, Instagram, YouTube, LinkedIn, X/Twitter), and display networks, managing high-budget campaigns and leading a digital team to achieve aggressive growth metrics. Key Responsibilities: 1. Search Engine Optimization (SEO): Lead the on-page, off-page, and technical SEO strategies for multiple websites and landing pages. Conduct in-depth keyword research, competitor analysis, backlink audits, and content gap identification. Collaborate with the content and web development teams to optimize site structure, page speed, and Core Web Vitals. Monitor and improve rankings, organic traffic, and domain authority using tools like Google Search Console, SEMrush, Ahrefs, Moz, Screaming Frog, etc. 2. Pay-Per-Click Advertising (PPC) / Search Engine Marketing (SEM): Strategize and manage high-performance paid ad campaigns on: Google Ads (Search, Display, Shopping, Discovery, Performance Max) Bing Ads / Microsoft Advertising Yahoo Gemini Optimize ad copies, keyword bidding strategies, audience segmentation, and negative keyword filtration to reduce CPA and boost conversions. Track and improve KPIs like CTR, Quality Score, Conversion Rate, and ROAS. 3. Social Media Marketing (SMM) & Optimization (SMO): Design and manage result-oriented ad campaigns across: Meta (Facebook & Instagram Ads Manager) YouTube Ads (TrueView, Bumper, In-Stream, Masthead) LinkedIn Ads (Sponsored Content, InMail, Lead Gen) Twitter/X Ads Pinterest Ads Drive brand awareness, engagement, traffic, and leads through detailed audience targeting, funnel-based creatives, and retargeting strategies. Manage brand pages, social calendars, influencer tie-ups, and community engagement for organic growth. 4.
Team Leadership & Project Management: Lead a team of SEO analysts, paid media specialists, content creators, and graphic designers. Develop and assign project roadmaps, monitor KPIs, and ensure timely delivery of campaigns with maximum efficiency. Train, mentor, and upskill team members to keep up with algorithm and ad platform updates. 5. Reporting & Analytics: Create in-depth weekly/monthly performance reports and dashboards using Google Analytics 4 (GA4), Google Looker Studio, Tag Manager, Facebook Analytics, etc. Track attribution, customer journeys, and funnel performance to make data-driven decisions. A/B test creatives, landing pages, and audience segments to continuously improve campaign results. 6. Client Strategy & Communication (if agency-side): Understand brand objectives and propose tailored digital strategies. Conduct regular client meetings, QBRs (Quarterly Business Reviews), and pitch improvements. Collaborate with sales and business development teams for strategic input on proposals and case studies. Key Requirements: 4–5 years of experience in SEO, PPC, and SMM, with at least 1 year in a managerial or lead role. Proven success in managing large-scale campaigns with significant ROAS and ROI improvement. Deep knowledge of platform-specific ad ecosystems: Google Ads, Meta Ads, YouTube, Bing/Microsoft Ads, LinkedIn Ads, Twitter Ads Proficiency with SEO tools like Ahrefs, SEMrush, Screaming Frog, Google Search Console, and analytics platforms like GA4 and Looker Studio. Strong leadership, team coordination, and communication skills. Ability to handle multiple projects simultaneously with a focus on KPIs and deadlines. Preferred Qualifications: Google Ads Certification, Meta Blueprint Certification, HubSpot Digital Marketing Certification. Experience with both D2C and B2B digital marketing campaigns. Knowledge of affiliate marketing and influencer collaborations is a plus.
What We Offer: Work on cutting-edge digital strategies for high-growth national and international brands. Dynamic and collaborative team culture with rapid learning opportunities. Access to premium digital tools and budgets. Competitive salary, performance bonuses, and professional growth plans. Job Type: Full-time Pay: ₹15,000.00 - ₹60,000.00 per month Schedule: Monday to Friday Work Location: In person

Posted 12 hours ago

Apply

3.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site


Job Description Global Data Insight & Analytics organization is looking for a top-notch Software Engineer with Machine Learning knowledge and experience to join our team and drive the next generation of the AI/ML (Mach1ML) platform. In this role you will work in a small, cross-functional team. The position will collaborate directly and continuously with other engineers, business partners, product managers and designers from distributed locations, and will release early and often. The team you will be working on is focused on building the Mach1ML platform – an AI/ML enablement platform to democratize Machine Learning across the Ford enterprise (like OpenAI's GPT, Facebook's FBLearner, etc.) to deliver next-gen analytics innovation. We strongly believe that data has the power to help create great products and experiences which delight our customers. We believe that actionable and persistent insights, built on a high-quality data platform, help business and engineering make more impactful decisions. Our ambitions reach well beyond existing solutions, and we are in search of innovative individuals to join this Agile team. This is an exciting, fast-paced role which requires outstanding technical and organizational skills combined with critical thinking, problem-solving and agile management tools to support team success. Responsibilities What you'll be able to do: As a Software Engineer, you will develop features for the Mach1ML platform and support customers in model deployment using Mach1ML on GCP and on-prem. You will follow Rally to manage your work. You will incorporate an understanding of product functionality and the customer perspective for model deployment. You will work on cutting-edge technologies such as GCP, Kubernetes, Docker, Seldon, Tekton, Airflow, Rally, etc. Position Responsibilities: Work closely with the Tech Anchor, Product Manager and Product Owner to deliver machine learning use cases using the Ford Agile Framework.
Work with Data Scientists and ML engineers to tackle challenging AI problems. Work specifically on the Deploy team to drive model deployment and AI/ML adoption with other internal and external systems. Help innovate by researching state-of-the-art deployment tools and share knowledge with the team. Lead by example in the use of Paired Programming for cross-training/upskilling, problem solving, and speed to delivery. Leverage the latest GCP, CI/CD, and ML technologies. Critical Thinking: Influence the strategic direction of the company by finding opportunities in large, rich data sets and crafting and implementing data-driven strategies that fuel growth, including cost savings, revenue, and profit. Modelling: Assess and evaluate the impact of missing/unusable data; design and select features; develop and implement statistical/predictive models using advanced algorithms on diverse sources of data; and test and validate models for forecasting, natural language processing, pattern recognition, machine vision, supervised and unsupervised classification, decision trees, neural networks, etc. Analytics: Leverage rigorous analytical and statistical techniques to identify trends and relationships between different components of data, draw appropriate conclusions, and translate analytical findings and recommendations into business strategies or engineering decisions, with statistical confidence. Data Engineering: Experience crafting ETL processes to source and link data in preparation for model/algorithm development. This includes domain expertise of data sets in the environment, third-party data evaluations, and data quality. Visualization: Build visualizations to connect disparate data, find patterns and tell engaging stories. This includes both scientific and geographic visualization, using applications such as Seaborn, Qlik Sense/Power BI/Tableau/Looker Studio, etc.
Qualifications Minimum Requirements we seek: Bachelor's or master's degree in computer science, engineering or a related field, or a combination of education and equivalent experience. 3+ years of experience in full stack software development. 3+ years' experience in cloud technologies & services, preferably GCP. 3+ years of experience practicing statistical methods and their accurate application, e.g. ANOVA, principal component analysis, correspondence analysis, k-means clustering, factor analysis, multivariate analysis, neural networks, causal inference, Gaussian regression, etc. 3+ years' experience with Python, SQL, BigQuery. Experience with SonarQube, CI/CD, Tekton, Terraform, GCS, GCP Looker, Google Cloud Build, Cloud Run, Vertex AI, Airflow, TensorFlow, etc. Experience in training, building and deploying ML and DL models. Experience with Hugging Face, Chainlit, Streamlit, React. Ability to understand technical, functional, non-functional, and security aspects of business requirements and deliver them end-to-end. Ability to adapt quickly with open-source products & tools to integrate with ML platforms. Building and deploying models (scikit-learn, DataRobot, TensorFlow, PyTorch, etc.). Developing and deploying in on-prem & cloud environments: Kubernetes, Tekton, OpenShift, Terraform, Vertex AI. Our Preferred Requirements: Master's degree in computer science, engineering, or a related field, or a combination of education and equivalent experience. Demonstrated successful application of analytical methods and machine learning techniques with measurable impact on product/design/business/strategy. Proficiency in programming languages such as Python with a strong emphasis on machine learning libraries, generative AI frameworks, and monitoring tools. Utilize tools and technologies such as TensorFlow, PyTorch, scikit-learn, and other machine learning libraries to build and deploy machine learning solutions on cloud platforms.
Design and implement cloud infrastructure using technologies such as Kubernetes, Terraform, and Tekton to support scalable and reliable deployment of machine learning models, generative AI models, and applications. Integrate machine learning and generative AI models into production systems on cloud platforms such as Google Cloud Platform (GCP) and ensure scalability, performance, and proactive monitoring. Implement monitoring solutions to track the performance, health, and security of systems and applications, utilizing tools such as Prometheus, Grafana, and other relevant monitoring tools. Conduct code reviews and provide constructive feedback to team members on machine learning-related projects. Knowledge and experience in agentic-workflow-based application development and DevOps. Stay up to date with the latest trends and advancements in machine learning and data science.

Posted 12 hours ago

Apply

1.0 - 2.0 years

0 - 0 Lacs

Jaipur

On-site

https://forms.gle/LmcjXLku6zPHoqdq7 Navrasa Fine Jewels Pvt. Ltd. (Operations Dept. – Job Application Form): Job Title: Junior MIS & Accounts Executive Company: Navrasa Fine Jewels Pvt. Ltd. Location: Jaipur, Rajasthan Department: Operations/Accounts Job Type: Full-Time Salary Range: ₹10,000 – ₹18,000 per month (CTC) About the Company Navrasa Fine Jewels Pvt. Ltd. is a premium luxury jewelry brand known for its handcrafted designs, impeccable quality, and attention to detail. As part of our growth, we are looking for a detail-oriented and technically skilled Junior MIS Executive to support our data management and billing processes. Position Overview We are seeking a Junior MIS Executive who is proficient in Advanced Excel/Google Sheets and familiar with Tally Prime software for handling day-to-day MIS tasks, stock data, and billing entries. The ideal candidate will be well-versed in formulas, reporting formats, and maintaining accurate and real-time data for business operations. Key Responsibilities Prepare, update, and maintain daily, weekly, and monthly MIS reports. Familiarity with Google Apps Script, Looker Studio, and Tally software for billing is preferred. Use Advanced Excel/Google Sheets functions such as VLOOKUP, HLOOKUP, INDEX-MATCH, Pivot Tables, Conditional Formatting, Data Validation, and Charts. Maintain accurate stock and inventory records using Excel and MIS tools. Process billing, sales invoices, and purchase entries using Tally Prime. Reconcile data between MIS reports and Tally for inventory and billing accuracy. Support data analysis for management decision-making. Ensure timely and accurate reporting with proper formatting and error checks. Assist in data entry, filing, and document management related to billing and inventory. Candidate Requirements Bachelor's degree in Commerce, Business, or related field (preferred). 1–2 years of experience in MIS/Data Entry/Accounts or a similar role.
Proficiency in Microsoft Excel and Google Sheets (Advanced level mandatory). Working knowledge of Tally Prime for billing and accounting entries. Strong attention to detail and commitment to data accuracy. Good organizational skills with the ability to manage multiple tasks. Freshers with strong Excel skills and basic Tally knowledge may also apply. Salary & Benefits CTC Offered: ₹10,000 – ₹18,000 per month (based on skills and experience). Opportunity to work in a growing premium luxury brand. Learning exposure across operations, inventory, and finance functions. Supportive and structured work environment. How to Apply Interested candidates can apply through the following application form: Navrasa Fine Jewels Pvt. Ltd. (Operations Dept. – Job Application Form): https://forms.gle/LmcjXLku6zPHoqdq7 Job Type: Full-time, Permanent Work Schedule: Monday to Saturday | Day Shift Location: Jaipur, Rajasthan (On-site role) Relocation: Candidates must be residing in Jaipur. Immediate Joiner Preferred! Job Types: Full-time, Permanent Pay: ₹10,000.00 - ₹18,000.00 per month Benefits: Paid sick time Provident Fund Schedule: Day shift Application Question(s): Are you an expert in Tally software for processing invoices/billing? Are you well versed with Advanced GSheets/MS Excel lookup functions & Pivot Tables? Experience: Tally: 1 year (Required) Advanced GSheets/MS Excel: 1 year (Required) Location: Jaipur city, Rajasthan (Required) Work Location: In person
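The MIS-vs-Tally reconciliation task described in this listing boils down to matching invoice totals from two exports by invoice number and flagging mismatches. A minimal sketch, with hypothetical invoice numbers and amounts:

```python
# Illustrative sketch of reconciling an MIS report against a Tally export.
# Invoice numbers and amounts are invented for demonstration.

mis_report = {"INV-001": 12500.0, "INV-002": 8400.0, "INV-003": 15750.0}
tally_export = {"INV-001": 12500.0, "INV-002": 8900.0}

def reconcile(mis: dict, tally: dict) -> list:
    """Return (invoice, mis_amount, tally_amount) rows that need review."""
    issues = []
    for inv, amount in sorted(mis.items()):
        other = tally.get(inv)
        if other is None:
            issues.append((inv, amount, None))   # missing in Tally
        elif abs(other - amount) > 0.01:
            issues.append((inv, amount, other))  # amount mismatch
    return issues

for row in reconcile(mis_report, tally_export):
    print(row)
```

The same lookup-by-key logic is what VLOOKUP or INDEX-MATCH performs in Sheets/Excel.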

Posted 12 hours ago

Apply

5.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site


You Lead the Way. We’ve Got Your Back. With the right backing, people and businesses have the power to progress in incredible ways. When you join Team Amex, you become part of a global and diverse community of colleagues with an unwavering commitment to back our customers, communities and each other. Here, you’ll learn and grow as we help you create a career journey that’s unique and meaningful to you with benefits, programs, and flexibility that support you personally and professionally. At American Express, you’ll be recognized for your contributions, leadership, and impact—every colleague has the opportunity to share in the company’s success. Together, we’ll win as a team, striving to uphold our company values and powerful backing promise to provide the world’s best customer experience every day. And we’ll do it with the utmost integrity, and in an environment where everyone is seen, heard and feels like they belong. As part of our diverse tech team, you can architect, code and ship software that makes us an essential part of our customers’ digital lives. Here, you can work alongside talented engineers in an open, supportive, inclusive environment where your voice is valued, and you make your own decisions on what tech to use to solve challenging problems. Amex offers a range of opportunities to work with the latest technologies and encourages you to back the broader engineering community through open source. And because we understand the importance of keeping your skills fresh and relevant, we give you dedicated time to invest in your professional development. Find your place in technology on #TeamAmex. How will you make an impact in this role? Build NextGen Data Strategy, Data Virtualization, Data Lakes Warehousing Transform and improve performance of existing reporting & analytics use cases with more efficient and state of the art data engineering solutions. 
Analytics Development to realize the advanced analytics vision and strategy in a scalable, iterative manner. Deliver software that provides superior user experiences, linking customer needs and business drivers together through innovative product engineering. Cultivate an environment of engineering excellence and continuous improvement, leading changes that drive efficiencies into existing engineering and delivery processes. Own accountability for all quality aspects and metrics of the product portfolio, including system performance, platform availability, operational efficiency, risk management, information security, data management and cost effectiveness. Work with key stakeholders to drive software solutions that align to strategic roadmaps, prioritized initiatives and strategic technology directions. Work with peers, staff engineers and staff architects to assimilate new technology and delivery methods into scalable software solutions. Minimum Qualifications: Bachelor's degree in Computer Science, Computer Science Engineering, or a related field required; advanced degree preferred. 5+ years of hands-on experience implementing large data-warehousing projects, with strong knowledge of the latest NextGen BI & Data Strategy and BI tools. Proven experience in Business Intelligence, reporting on large datasets, data virtualization tools, Big Data, GCP, Java, microservices. Strong systems integration architecture skills and a high degree of technical expertise, ranging across a number of technologies, with a proven track record of turning new technologies into business solutions. Proficiency in at least one programming language (Python/Java). Good understanding of data structures. GCP/cloud knowledge is an added advantage. Good knowledge and understanding of Power BI, Tableau and Looker. Outstanding influencing and collaboration skills; ability to drive consensus and tangible outcomes, demonstrated by breaking down silos and fostering cross-communication.
Experience managing in a fast-paced, complex, and dynamic global environment. Outstanding influencing and collaboration skills; ability to drive consensus and tangible outcomes, demonstrated by breaking down silos and fostering cross-communication. Preferred Qualifications: Bachelor's degree in Computer Science, Computer Science Engineering, or a related field required; advanced degree preferred. 5+ years of hands-on experience implementing large data-warehousing projects, with strong knowledge of the latest NextGen BI & Data Strategy and BI tools. Proven experience in Business Intelligence, reporting on large datasets, Oracle Business Intelligence (OBIEE), Tableau, MicroStrategy, data virtualization tools, Oracle PL/SQL, Informatica, other ETL tools like Talend, and Java. Proficiency in at least one programming language (Python/Java). Strong grasp of data structures and reasoning. GCP or other cloud knowledge is an added advantage. Good knowledge and understanding of Power BI, Tableau and Looker. Strong systems integration architecture skills and a high degree of technical expertise, ranging across several technologies, with a proven track record of turning new technologies into business solutions. Outstanding influencing and collaboration skills; ability to drive consensus and tangible outcomes, demonstrated by breaking down silos and fostering cross-communication. We back you with benefits that support your holistic well-being so you can be and deliver your best.
This means caring for you and your loved ones' physical, financial, and mental health, as well as providing the flexibility you need to thrive personally and professionally: Competitive base salaries Bonus incentives Support for financial-well-being and retirement Comprehensive medical, dental, vision, life insurance, and disability benefits (depending on location) Flexible working model with hybrid, onsite or virtual arrangements depending on role and business need Generous paid parental leave policies (depending on your location) Free access to global on-site wellness centers staffed with nurses and doctors (depending on location) Free and confidential counseling support through our Healthy Minds program Career development and training opportunities American Express is an equal opportunity employer and makes employment decisions without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, veteran status, disability status, age, or any other status protected by law. Offer of employment with American Express is conditioned upon the successful completion of a background verification check, subject to applicable laws and regulations.

Posted 12 hours ago

Apply

5.0 years

0 Lacs

Greater Bengaluru Area

On-site


About the Company - At Swish, we’re redefining food delivery by combining speed, freshness, and delight. Our innovative platform ensures your favourite snacks and beverages arrive in just 10 minutes, transforming everyday cravings into exceptional moments. Backed by top investors like Accel and industry leaders, we’re a fast-growing early-stage startup on a mission to change how people experience food. About the Role - As a Growth Analyst at Swish, you’ll play a critical role in scaling our growth engine. You’ll dig deep into data to uncover insights, run experiments, and collaborate with marketing, product, and ops teams to drive business outcomes. If you thrive on solving problems, making data dance, and turning numbers into action - this role is for you. What You’ll Do - Retention Analytics: Understand user behavior, churn drivers and engagement loops to improve retention and repeat usage. Revenue & Performance Metrics: Track CAC, LTV, ROI and funnel performance to spot inefficiencies and uncover growth levers. Campaign Insights: Collaborate on lifecycle campaigns (email, push, in-app) and measure impact on conversions and reorders A/B Testing: Design experiments across product and marketing - from UX tweaks to offer strategies and drive data-backed decisions Cross-Team Collaboration: Work closely with growth, product and ops teams to drive high-impact initiatives Dashboards & Reporting: Build intuitive dashboards and reports to track KPIs and deliver insights to leadership What You’ll Need 2–5 years of experience in growth, marketing or business analytics (preferably in B2C startups or food/e-comm) Strong command over SQL, Excel/Sheets and any BI tool (Looker, Tableau, Power BI, etc.) 
Solid understanding of growth metrics - CAC, LTV, funnels, retention and experience with experimentation A structured problem-solver who can translate data into clear, actionable insights Strong communication skills - able to work across teams and present to leadership Nice to Have Experience in food delivery, q-commerce, or high-frequency B2C products Familiarity with paid marketing data - Google Ads, Meta, UAC, etc.
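The retention analytics this role describes (grouping users into cohorts and measuring repeat usage) can be sketched with only the standard library. Order data and cohort granularity below are invented for illustration:

```python
# Minimal cohort-retention sketch: group users by their first order month
# and count who ordered again later. All data is hypothetical.

from collections import defaultdict

orders = [  # (user_id, month index since launch)
    ("u1", 0), ("u1", 1), ("u2", 0), ("u2", 2),
    ("u3", 0), ("u4", 1), ("u4", 2),
]

# Cohort assignment: each user's earliest order month.
first_month = {}
for user, month in sorted(orders, key=lambda o: o[1]):
    first_month.setdefault(user, month)

# Users who placed at least one order after their cohort month.
retained = defaultdict(set)
for user, month in orders:
    if month > first_month[user]:
        retained[first_month[user]].add(user)

for cohort in sorted(set(first_month.values())):
    size = sum(1 for m in first_month.values() if m == cohort)
    rate = len(retained[cohort]) / size
    print(f"cohort {cohort}: {size} users, {rate:.0%} retained")
```

In a real pipeline the same grouping would typically be a SQL `MIN(order_month) ... GROUP BY user_id` feeding a BI dashboard.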

Posted 12 hours ago

Apply

2.0 - 3.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site


Company Description Avery Dennison Corporation (NYSE: AVY) is a global materials science and digital identification solutions company. We are Making Possible™ products and solutions that help advance the industries we serve, providing branding and information solutions that optimize labor and supply chain efficiency, reduce waste, advance sustainability, circularity and transparency, and better connect brands and consumers. We design and develop labeling and functional materials, radio frequency identification (RFID) inlays and tags, software applications that connect the physical and digital, and offerings that enhance branded packaging and carry or display information that improves the customer experience. Serving industries worldwide — including home and personal care, apparel, general retail, e-commerce, logistics, food and grocery, pharmaceuticals and automotive — we employ approximately 35,000 employees in more than 50 countries. Our reported sales in 2024 were $8.8 billion. Learn more at www.averydennison.com. Job Description Job Summary: We are looking for a detail-oriented and tech-savvy Associate Manager FP&A to join our multinational finance team. This role will focus on financial planning, forecasting, reporting, and strategic decision support across multiple regions. The ideal candidate will have advanced expertise in Google Sheets, databases, and dashboarding tools to enhance financial reporting, automation, and data-driven decision-making. Key Responsibilities: Financial Planning & Analysis (FP&A): Lead global budgeting, forecasting, and long-term financial planning processes. Develop and maintain dynamic financial models to support business decision-making. Perform variance analysis and provide insights on revenue, costs, and profitability. Data & Reporting Automation: Utilize Google Sheets, SQL databases, and BI tools (Looker, Tableau, Power BI) to create interactive financial dashboards and reports. 
Build automated data pipelines to streamline financial reporting and analysis. Enhance data visualization to communicate financial trends and insights effectively. Business Partnering & Strategy: Collaborate with regional finance teams, operations, and commercial teams to drive financial performance. Provide financial insights and scenario modeling to support strategic business initiatives. Optimize cost structures and working capital management. Process Improvement & Systems Optimization: Identify opportunities to automate and improve financial processes using scripts, APIs, and database integrations. Ensure data integrity and standardization across global financial systems. Support the implementation of FP&A tools and enhancements to existing finance infrastructure. Qualifications Chartered Accountant. 2-3 years of finance or related experience. Additional Information All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, sexual orientation, gender identity, disability, protected veteran status or other protected status. EEOE/M/F/Vet/Disabled. All your information will be kept confidential according to EEO guidelines.
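The variance analysis central to this FP&A role (budget vs. actual per line item, with a review threshold) is straightforward to automate. A hedged sketch with hypothetical line items, figures, and a 5% threshold:

```python
# Illustrative budget-vs-actual variance report. All line items, amounts,
# and the 5% review threshold are invented for demonstration.

budget = {"revenue": 120_000.0, "cogs": 70_000.0, "opex": 25_000.0}
actual = {"revenue": 131_000.0, "cogs": 76_300.0, "opex": 24_100.0}

def variance_report(budget: dict, actual: dict, threshold: float = 0.05):
    """Yield (item, absolute variance, pct variance, needs_review)."""
    for item, planned in budget.items():
        diff = actual[item] - planned
        pct = diff / planned
        yield item, diff, pct, abs(pct) > threshold

for item, diff, pct, flagged in variance_report(budget, actual):
    mark = " <-- review" if flagged else ""
    print(f"{item:8s} {diff:+10,.0f} ({pct:+.1%}){mark}")
```

The same logic ports directly to a Google Sheets formula or an automated script feeding a Looker Studio dashboard.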

Posted 12 hours ago

Apply

1.0 - 3.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site


Razorpay was founded by Shashank Kumar and Harshil Mathur in 2014. Razorpay is building a new-age digital banking hub (Neobank) for businesses in India, with a mission to enable frictionless banking and payments experiences for businesses of all shapes and sizes. What started as a B2B payments company now processes billions of dollars of payments for lakhs of businesses across India. We are a full-stack financial services organisation, committed to helping Indian businesses with comprehensive and innovative payment and business banking solutions, built over robust technology, to address the entire length and breadth of the payment and banking journey for any business. Over the past year, we've disbursed loans worth millions of dollars to thousands of businesses. In parallel, Razorpay is reimagining how businesses manage money by simplifying business banking (via Razorpay X) and enabling capital availability for businesses (via Razorpay Capital). The Role The Analytics Specialist will work with the central analytics team at Razorpay. This will give you an opportunity to work in a fast-paced environment aimed at creating very high impact, and to work with a diverse team of smart and hardworking professionals from various backgrounds. Some of the responsibilities include working with large, complex data sets, developing strong business and product understanding, and being closely involved in the product life cycle.
Roles And Responsibilities:
 You will work with large, complex data sets to solve open-ended, high-impact business problems using data mining, experimentation, statistical analysis and related techniques, and machine learning as needed.
 You will develop a strong understanding of the business and product, conduct analysis to derive insights, develop hypotheses and validate them with sound, rigorous methodologies, or formulate the problems for modeling with ML.
 You will apply excellent problem-solving skills and independently scope, deconstruct and formulate solutions from first principles that bring an outside-in, state-of-the-art view.
 You will be closely involved with the product life cycle, working on ideation, reviewing Product Requirement Documents, defining success criteria, instrumenting product features, assessing impact, and identifying and recommending improvements to further enhance product features.
 You will expedite root cause analyses and insight generation for recurring use cases through automation and self-serve platforms.
 You will develop compelling stories with business insights, focusing on the strategic goals of the organization.
 You will work with Business, Product and Data Engineering teams for continuous improvement of data accuracy through feedback and scoping on instrumentation quality and completeness.

Mandatory Qualifications:
 Bachelor's/Master's degree in Engineering, Economics, Finance, Mathematics, Statistics, Business Administration or a related quantitative field.
 1-3 years of high-quality hands-on experience in analytics and data science.
 Hands-on experience in SQL and Python.
 Ability to define the business and product metrics to be evaluated, work with engineering on data instrumentation, and create and automate self-serve dashboards for relevant stakeholders using tools such as Tableau, QlikView, Looker, etc.
Ability to structure and analyze data using techniques like EDA, cohort analysis and funnel analysis, transform findings into understandable, actionable recommendations, and communicate them effectively across the organization. Hands-on experience working with large-scale structured, semi-structured and unstructured data, and with various approaches to preprocess/cleanse data and reduce dimensionality. Work experience in consumer-tech organisations would be a plus. A clear understanding of the qualitative and quantitative aspects of the product or strategic initiative, leveraged to identify and act upon existing gaps and opportunities. Working knowledge of A/B testing, significance testing, supervised and unsupervised ML, web analytics and statistical learning. Razorpay believes in and follows an equal employment opportunity policy that doesn't discriminate on gender, religion, sexual orientation, colour, nationality, age, etc. We welcome interest and applications from all groups and communities across the globe. Follow us on LinkedIn & Twitter.
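The A/B testing and significance testing this role calls for can be sketched with a stdlib-only two-sided two-proportion z-test. The conversion counts and the 5% threshold below are illustrative assumptions, not figures from the posting.

```python
import math

def two_proportion_ztest(conv_a, n_a, conv_b, n_b):
    """Two-sided two-proportion z-test, e.g. checkout conversion A vs B."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)          # pooled conversion rate
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Made-up experiment: 2.0% vs 2.6% conversion on 10k users per arm
z, p = two_proportion_ztest(conv_a=200, n_a=10_000, conv_b=260, n_b=10_000)
print(f"z={z:.2f}, p={p:.4f}")  # variant B's lift is significant at the 5% level
```

In practice one would also check sample-size requirements and guard against peeking before the experiment's planned end.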

Posted 12 hours ago

7.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Designation: Solution Architect. Office Location: Gurgaon. Position Description: As a Solution Architect, you will be responsible for leading the development and delivery of the platforms. This includes overseeing the entire product lifecycle from solution design through execution and launch, building the right team, and collaborating closely with business and product teams. Primary Responsibilities: Design end-to-end solutions that meet business requirements and align with the enterprise architecture. Define the architecture blueprint, including integration, data flow, application, and infrastructure components. Evaluate and select appropriate technology stacks, tools, and frameworks. Ensure proposed solutions are scalable, maintainable, and secure. Collaborate with business and technical stakeholders to gather requirements and clarify objectives. Act as a bridge between business problems and technology solutions. Guide development teams during the execution phase to ensure solutions are implemented according to design. Identify and mitigate architectural risks and issues. Ensure compliance with architecture principles, standards, policies, and best practices. Document architectures, designs, and implementation decisions clearly and thoroughly. Identify opportunities for innovation and efficiency within existing and upcoming solutions. Conduct regular performance and code reviews, and provide feedback to development team members to support their professional development. Lead proof-of-concept initiatives to evaluate new technologies. Functional Responsibilities: Facilitate daily stand-up meetings, sprint planning, sprint reviews, and retrospective meetings. Work closely with the product owner to prioritize the product backlog and ensure that user stories are well-defined and ready for development. Identify and address issues or conflicts that may impact project delivery or team morale. Experience with Agile project management tools such as Jira and Trello.
Required Skills: Bachelor's degree in Computer Science, Engineering, or a related field. 7+ years of experience in software engineering, with at least 3 years in a solution architecture or technical leadership role. Proficiency with the AWS or GCP cloud platform. Strong implementation knowledge of the JS tech stack (NodeJS, ReactJS). Experience with database engines MySQL and PostgreSQL, with proven knowledge of database migrations and high-throughput, low-latency use cases. Experience with key-value stores like Redis, MongoDB and similar. Preferred knowledge of distributed technologies (Kafka, Spark, Trino or similar) with proven experience in event-driven data pipelines. Proven experience setting up big data pipelines to handle high-volume transactions and transformations. Experience with BI tools such as Looker, PowerBI, Metabase or similar. Experience with data warehouses like BigQuery, Redshift, or similar. Familiarity with CI/CD pipelines, containerization (Docker/Kubernetes), and IaC (Terraform/CloudFormation). Good to Have: Certifications such as AWS Certified Solutions Architect, Azure Solutions Architect Expert, TOGAF, etc. Experience setting up analytical pipelines using BI tools (Looker, PowerBI, Metabase or similar) and low-level Python tools like Pandas, NumPy, PyArrow. Experience with data transformation tools like dbt, SQLMesh or similar. Experience with data orchestration tools like Apache Airflow, Kestra or similar. Work Environment Details: About Affle: Affle is a global technology company with a proprietary consumer intelligence platform that delivers consumer engagement, acquisitions, and transactions through relevant mobile advertising. The platform aims to enhance returns on marketing investment through contextual mobile ads and also by reducing digital ad fraud.
While Affle's Consumer platform is used by online & offline companies for measurable mobile advertising, its Enterprise platform helps offline companies go online through platform-based app development, enablement of O2O commerce and its customer data platform. Affle India successfully completed its IPO in India on 08 Aug 2019 and now trades on the stock exchanges (BSE: 542752 & NSE: AFFLE). Affle Holdings is the Singapore-based promoter for Affle India, and its investors include Microsoft and Bennett Coleman & Company (BCCL), amongst others. For more details: www.affle.com About BU: Ultra - Access deals, coupons, and walled-garden-based user acquisition on a single platform to offer bottom-funnel optimization across multiple inventory sources. For more details, please visit: https://www.ultraplatform.io/
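Among the required skills for this role are event-driven data pipelines built on technologies like Kafka. As a hedged illustration of the underlying producer/consumer pattern, here is a stdlib-only, in-process sketch that uses a queue in place of a real broker; the event shapes and handler names are invented.

```python
import queue
import threading

def run_pipeline(events, handlers):
    """Tiny in-process sketch of an event-driven pipeline: a producer puts
    events on a queue and a consumer thread dispatches them by type.
    In production the queue would be a broker such as Kafka."""
    q = queue.Queue()
    results = []

    def consumer():
        while True:
            event = q.get()
            if event is None:          # sentinel: shut down the consumer
                break
            handler = handlers.get(event["type"])
            if handler:                # unknown event types are skipped
                results.append(handler(event))
            q.task_done()

    t = threading.Thread(target=consumer)
    t.start()
    for e in events:                   # "producer" side
        q.put(e)
    q.put(None)
    t.join()
    return results

handlers = {"order": lambda e: ("enriched", e["id"])}
out = run_pipeline([{"type": "order", "id": 1}, {"type": "noise"}], handlers)
print(out)  # [('enriched', 1)]
```

A real broker adds persistence, partitioning, and consumer groups on top of this same decoupled shape.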

Posted 12 hours ago

7.0 years

0 Lacs

Delhi, India

On-site

Role Expectations: Data Collection and Cleaning: Collect, organize, and clean large datasets from various sources (internal databases, external APIs, spreadsheets, etc.). Ensure data accuracy, completeness, and consistency by cleaning and transforming raw data into usable formats. Data Analysis: Perform exploratory data analysis (EDA) to identify trends, patterns, and anomalies. Conduct statistical analysis to support decision-making and uncover insights. Use analytical methods to identify opportunities for process improvements, cost reductions, and efficiency enhancements. Reporting and Visualization: Create and maintain clear, actionable, and accurate reports and dashboards for both technical and non-technical stakeholders. Design data visualizations (charts, graphs, and tables) that communicate findings effectively to decision-makers. Experience with PowerBI, Tableau, and Python libraries for data visualization such as matplotlib, seaborn, plotly, pyplot, and pandas. Experience in generating descriptive, predictive, and prescriptive insights with Gen AI using MS Copilot in PowerBI. Experience in prompt engineering and RAG architectures. Prepare reports for upper management and other departments, presenting key findings and recommendations. Collaboration: Work closely with cross-functional teams (marketing, finance, operations, etc.) to understand their data needs and provide actionable insights. Collaborate with IT and database administrators to ensure data is accessible and well-structured. Provide support and guidance to other teams regarding data-related questions or issues. Data Integrity and Security: Ensure compliance with data privacy and security policies and practices. Maintain data integrity and assist with implementing best practices for data storage and access. Continuous Improvement: Stay current with emerging data analysis techniques, tools, and industry trends.
Recommend improvements to data collection, processing, and analysis procedures to enhance operational efficiency. Qualifications: Education: Bachelor's degree in Data Science, Statistics, Computer Science, Mathematics, or a related field. A Master's degree or relevant certifications (e.g., in data analysis or business intelligence) is a plus. Experience: Proven experience as a Data Analyst or in a similar analytical role (typically 7+ years). Experience with data visualization tools (e.g., Tableau, Power BI, Looker). Strong knowledge of SQL and experience with relational databases. Familiarity with data manipulation and analysis tools (e.g., Python, R, Excel, SPSS). Experience with PowerBI, Tableau, and Python visualization libraries such as matplotlib, seaborn, plotly, pyplot, and pandas. Experience with big data technologies (e.g., Hadoop, Spark) is a plus. Technical Skills: Proficiency in SQL and data query languages. Knowledge of statistical analysis and methodologies. Experience with data visualization and reporting tools. Knowledge of data cleaning and transformation techniques. Familiarity with machine learning and AI concepts is an advantage (for more advanced roles). Soft Skills: Strong analytical and problem-solving abilities. Excellent attention to detail and ability to identify trends in complex data sets. Good communication skills to present data insights clearly to both technical and non-technical audiences. Ability to work independently and as part of a team. Strong time management and organizational skills, with the ability to prioritize tasks effectively.
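Several of the responsibilities in this listing (data cleaning, EDA, summary statistics) can be sketched in a few lines of stdlib Python. The field name and sample rows below are invented for illustration.

```python
import statistics

def clean_and_summarize(raw_rows):
    """Coerce raw CSV-style rows to floats, drop unusable records, and
    return basic EDA summary stats. The 'amount' field is illustrative."""
    values = []
    dropped = 0
    for row in raw_rows:
        cell = row.get("amount", "").strip()
        try:
            values.append(float(cell))
        except ValueError:
            dropped += 1          # blank or malformed value: exclude it
    return {
        "n": len(values),
        "dropped": dropped,
        "mean": statistics.mean(values),
        "median": statistics.median(values),
        "stdev": statistics.pstdev(values),
    }

raw = [{"amount": "10"}, {"amount": " 20 "}, {"amount": "n/a"}, {"amount": "30"}]
summary = clean_and_summarize(raw)
print(summary)
```

In day-to-day work the same cleaning and profiling would typically be done with pandas, but the logic (coerce, drop, summarize) is identical.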

Posted 12 hours ago

3.0 years

0 Lacs

Kochi, Kerala, India

Remote

About Parking Base: Parking Base is an innovative, cloud-based platform designed to streamline parking business operations. From real-time inventory management to reservation systems, payment processing, and enforcement, Parking Base offers comprehensive solutions. Our platform also includes customer and back-office management tools, as well as seamless integration capabilities with other systems. Trusted by hundreds of facilities and over 100,000 accounts, Parking Base helps businesses save time and money while enhancing customer satisfaction. Visit our website https://www.parkingbase.com/ to discover more about our offerings. About the Role: We are looking for an experienced Google Looker Developer to join our Software team. The ideal candidate will have a strong background in Looker development, data modeling, and BigQuery integration. You will play a key role in rapidly building scalable dashboards and configuring data connections for multiple clients. Roles & Responsibilities: Design, develop, and maintain Looker dashboards, Looks, and Explores based on business requirements. Configure and optimize Looker-BigQuery connections, ensuring efficient data access and performance. Implement LookML models, views, and explores to transform raw data into usable business metrics. Collaborate with stakeholders to gather reporting requirements and translate them into technical solutions. Troubleshoot and resolve data issues, visualization errors, and performance bottlenecks in dashboards. Ensure data governance, security, and access controls are implemented within Looker. Optimize SQL queries used within LookML to ensure dashboard performance. Automate report generation and scheduling using Looker features for client deliverables. Stay updated with the latest features in Looker and BigQuery and suggest improvements. Required Skills: Minimum 3 years of hands-on experience with Google Looker. Proficiency in LookML and understanding of Looker architecture.
Experience working with Google BigQuery and connecting it with Looker. Strong SQL skills and understanding of data warehousing concepts. Ability to build interactive, performance-optimized dashboards. Good grasp of ETL processes, data pipelines, and BI best practices. Experience working in agile development teams and managing client expectations. Nice to Have: Exposure to data transformation tools like dbt. Familiarity with Google Cloud Platform (GCP) services apart from BigQuery. Knowledge of data security standards and compliance. Experience handling multi-tenant Looker environments. Benefits to Get Excited About: Competitive salary package. Comprehensive health and wellness benefits. Opportunities for professional development and growth. Dynamic and collaborative work environment. 📍 Location: Kochi, India (Remote). If you meet the qualifications and are excited about the opportunity to join our growing team, please share your resume at careers@parkingbase.com
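The LookML modeling work this role centers on typically looks like the following sketch of a single view file. The project, table, and field names are invented; a real model would be tailored to the client's BigQuery schema.

```lookml
view: orders {
  sql_table_name: `my-project.analytics.orders` ;;

  dimension: order_id {
    primary_key: yes
    type: number
    sql: ${TABLE}.order_id ;;
  }

  dimension_group: created {
    type: time
    timeframes: [date, week, month]
    sql: ${TABLE}.created_at ;;
  }

  measure: total_revenue {
    type: sum
    sql: ${TABLE}.amount ;;
    value_format_name: usd
  }

  measure: order_count {
    type: count
  }
}
```

An explore in the model file would then expose this view, and dashboards would query its dimensions and measures rather than raw SQL.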

Posted 12 hours ago

5.0 - 8.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Title: QA Manual Testing. Experience: 5-8 Years. Location: Pune & Gurgaon (Hybrid). Key Responsibilities: Understand business requirements and data flows to create comprehensive test plans and test cases for ETL jobs. Perform data validation and reconciliation between source systems, staging, and target data stores (DWH, data lakes, etc.). Develop and execute automated and manual tests to ensure data accuracy and quality. Work with SQL queries to validate data transformations and detect anomalies. Identify, document, and track defects and inconsistencies in data processing. Collaborate with data engineering and BI teams to improve ETL processes and data pipelines. Maintain QA documentation and contribute to continuous process improvements. Must Have Skills: Strong SQL skills: ability to write complex queries for data validation and transformation testing. Hands-on experience in ETL testing: validating data pipelines, transformations, and data loads. Knowledge of data warehousing concepts: dimensions, facts, slowly changing dimensions (SCD), etc. Experience in test case design, execution, and defect tracking. Experience with QA tools like JIRA, TestRail, or equivalent. Ability to work independently and collaboratively in an Agile/Scrum environment. Good to Have Skills: Experience with ETL tools like Informatica, Talend, DataStage, or Azure/AWS/GCP native ETL services (e.g., Dataflow, Glue). Knowledge of automation frameworks using Python/Selenium/pytest or similar tools for data testing. Familiarity with cloud data platforms: Snowflake, BigQuery, Redshift, etc. Basic understanding of CI/CD pipelines and QA integration. Exposure to data quality tools such as Great Expectations, Deequ, or DQ frameworks. Understanding of reporting/BI tools such as Power BI, Tableau, or Looker. Educational Qualifications: Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
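The SQL-based validation and reconciliation described in this role can be prototyped against an in-memory SQLite database. This is a hedged sketch: real checks would run against the actual warehouse, and the table and column names here are invented.

```python
import sqlite3

def reconcile_counts(conn, source_table, target_table, key):
    """Row-count and key-level reconciliation between a source and target
    table, the kind of check an ETL tester automates."""
    cur = conn.cursor()
    counts = {}
    for t in (source_table, target_table):
        counts[t] = cur.execute(f"SELECT COUNT(*) FROM {t}").fetchone()[0]
    # keys present in source but missing from target
    missing = cur.execute(
        f"SELECT s.{key} FROM {source_table} s "
        f"LEFT JOIN {target_table} t ON s.{key} = t.{key} "
        f"WHERE t.{key} IS NULL"
    ).fetchall()
    return counts, [row[0] for row in missing]

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE src(id INTEGER); CREATE TABLE tgt(id INTEGER);
    INSERT INTO src VALUES (1),(2),(3);
    INSERT INTO tgt VALUES (1),(2);
""")
counts, missing = reconcile_counts(conn, "src", "tgt", "id")
print(counts, missing)  # {'src': 3, 'tgt': 2} [3]
```

The table names are interpolated here only because this is a trusted test harness; against a production warehouse these would be parameterized or whitelisted.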

Posted 12 hours ago

7.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Job Title: Manager/Sr Manager – E-commerce Performance Marketing (Marketplaces) Experience: 7+ Years Key Responsibilities: Marketplace Strategy & Growth Leadership · Define and execute the performance marketing strategy for leading marketplaces including Amazon, Flipkart, etc., aligned with overall business objectives. · Lead end-to-end management of advertising budgets across marketplaces, optimizing for ROAS, new customer acquisition, and contribution margins. · Drive innovation in campaign structures, bidding strategies, and product targeting to maximize efficiency and growth. Cross-Functional Stakeholder Collaboration · Work closely with Category Managers, Brand Teams, and Business Heads to align marketing efforts with product priorities, seasonal calendars, and sales targets. · Partner with the Merchandising and Supply Chain teams to ensure campaign planning is supported with adequate inventory and fulfillment readiness. · Collaborate with Finance and Revenue Management to track budget utilization, profitability, and return on ad spend (ROAS). · Drive performance marketing inputs into new product launches, ensuring GTM success through integrated digital and marketplace strategies. Agency and Platform Relationship Management · Manage and mentor performance marketing agencies: setting goals, reviewing execution plans, evaluating performance, and identifying growth opportunities. · Build strong relationships with marketplace account managers (e.g., Amazon Ads, Flipkart Ads) to unlock beta programs, insights, and co-marketing opportunities. · Evaluate and implement third-party tools and technologies for improved campaign visibility, automation, and reporting. Data, Insights, and Reporting · Lead the creation of comprehensive performance dashboards and KPI reports for internal and leadership review. · Monitor category-level performance, competitor benchmarks, customer behavior trends, and marketplace algorithm changes.
· Translate data into actionable insights to improve campaign performance and inform broader e-commerce strategy. Key Requirements: · 7+ years of experience in e-commerce performance marketing, with at least 5 years managing large-scale campaigns across Amazon, Flipkart, and other marketplaces. · Proven track record of owning marketing P&L and driving double-digit growth through paid campaigns. · Strong leadership and team management skills, with experience working across multi-functional teams. · High proficiency in marketplace analytics tools (Amazon Brand Analytics, Helium 10, Perpetua, etc.) and reporting platforms (Excel, Looker Studio, etc.). · Deep understanding of marketplace algorithms, ad platforms, the consumer journey, and conversion optimization. · Strong business acumen, communication skills, and stakeholder management capabilities. Preferred: · Experience managing D2C and marketplace hybrid models. · MBA or equivalent qualification is a plus.

Posted 12 hours ago

5.0 - 7.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Are you passionate about digital marketing and want to make a real impact? At Petromin Corporation, we’re looking for a talented Digital Media Consultant to join our team in Chennai. This is an exciting opportunity to shape and grow our digital presence in India. You’ll lead campaigns across multiple channels, dive into data to uncover insights, and work with a fantastic team that’s always ready to push the boundaries. Here’s what you’ll do: Plan, launch, and fine-tune campaigns on Google, Meta (Facebook & Instagram), Twitter/X, and LinkedIn Develop smart keyword strategies to boost reach and conversions Set up conversion tracking, analyze performance, and find ways to improve every day Collaborate closely with our CRM, content, and design teams to ensure everything clicks seamlessly Optimize our website’s SEO and user experience, making sure visitors find what they’re looking for Use tools like Google Ads, Meta Ads, SEMrush, Ahrefs, Looker Studio, and LeadSquared CRM to get the job done Share regular performance updates, insights, and ideas for growth What we’re looking for: 5-7 years of hands-on experience in digital marketing and CRM A Bachelor’s degree in Marketing, Business, or something related Certifications in Google Ads and Meta Blueprint are a big plus Someone who’s detail-oriented, data-driven, and always curious about what’s next Excellent communication skills in English (and if you know Tamil, even better!) A self-starter who’s excited to work in a fast-paced, collaborative environment Why join us? We’re a close-knit team that values curiosity and collaboration You’ll have the freedom to experiment and the support to make things happen We love people who take ownership and aren’t afraid to share their ideas

Posted 13 hours ago

4.0 - 8.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Job Description: Data Analyst, Bizapps Team, Bangalore location. Who are we? Whatfix is a data-driven digital adoption platform (DAP) that enables organizations and users to maximize the benefits of software. Whatfix acts as an interactive overlay on top of any application to guide users with real-time guidance, self-help support, and user feedback. With product analytics and AI, Whatfix enables scalable success with technology, maximizing productivity and leveraging data-driven insights for better decision-making. The company has seven offices globally in the US, India, UK, Germany, Singapore, and Australia, and works with Fortune 500 companies around the world. Whatfix has raised $140 million to date and is backed by marquee investors including Softbank, Sequoia, Dragoneer, and Cisco Investments. “Hustle Mode ON” is the motto we live by. Whatfix has been named among the top 20 B2B tech companies alongside Adobe, PayPal, and Cisco. With YoY revenue growth of over 65%, we have also been recognized among the top 20 fastest-growing SaaS companies worldwide in the SaaS 1000 list. Recognized by Forrester and Everest Group as a 'Leader' in the digital adoption space, and listed by LinkedIn as one of the Top 5 startups in India in 2020. Listed in the Deloitte Technology Fast 500™ among the fastest-growing companies in North America for 2022 and 2021, and recognized as a Great Place to Work 2022-2023. Whatfix has been named a Silver Winner in Stevie's Employer of the Year 2023. Our customer centricity is also evident from a customer rating of 4.67 on G2 Crowd & 4.7 on Gartner Peer Insights. Whatfix is disrupting the way application support and learning content is consumed, by providing contextual and interactive walkthroughs inside enterprise applications while a task is being performed. We provide enterprises with a software platform that allows them to create interactive guides or Flows that sit as an overlay inside any web application.
Flows are contextual: they appear based on where you are in the application (location) and who you are (role). Optimal performance and adoption of any web application are attained when there is easy access to contextual information inside the application while a task is being performed. What would you get to do? The Bizapps team at Whatfix is responsible for enabling the company's operational excellence by building and managing key internal business applications that support departments like Sales, Marketing, Customer Success, Finance, and Operations. The team is focused on providing data-driven insights, streamlining workflows, and enhancing productivity through scalable technology solutions. We are building this team from the ground up and looking for a Data Analyst who will play a critical role in analyzing, reporting, and driving improvements across internal business processes. As a Data Analyst in the Bizapps team, you will be responsible for analyzing internal business data, creating dashboards, and providing actionable insights that influence key decision-making processes. You will collaborate with cross-functional teams, including Product, Sales, Marketing, and Operations, to develop data-driven solutions and optimize business performance. This role offers the opportunity to work in a fast-paced environment with high impact and visibility. Key Responsibilities: Data Analysis & Reporting: Analyze large and complex datasets to identify patterns, trends, and opportunities for business growth and operational improvements. Dashboard Development: Design, build, and maintain interactive dashboards and visualizations to provide actionable insights for stakeholders. Performance Monitoring: Track key performance indicators (KPIs) for various business processes, including sales pipeline, marketing campaigns, revenue, and customer success metrics.
Data Quality & Governance: Ensure accuracy, completeness, and consistency of data across internal systems, identifying and resolving data discrepancies. Collaboration: Work closely with the Engineering, SalesOps, MarketingOps, and Finance teams to gather requirements, define metrics, and implement data models for key business applications. Process Optimization: Provide recommendations to streamline workflows and improve data integration across tools like CRM, ERP, marketing automation platforms, and other business applications. Predictive Insights: Support decision-making through predictive analysis using statistical techniques and machine learning models (if applicable). Who are you? Education: Bachelor's degree in Data Science, Statistics, Computer Science, Business Analytics, or a related field. A Master's degree is a plus. Experience: 4-8 years of relevant experience in data analysis, preferably in a business applications or SaaS environment. Technical Proficiency: Strong knowledge of SQL and ability to query large databases. Experience with BI tools such as Tableau, Power BI, Looker, or Domo. Familiarity with CRM systems like Salesforce, HubSpot, or other internal business platforms. Knowledge of Excel (advanced level) and scripting languages like Python/R is a plus. Data Modeling: Experience in designing data models, building ETL pipelines, or working with data warehouses (e.g., Snowflake, Redshift). Communication Skills: Ability to translate complex data insights into clear, concise, and actionable recommendations for both technical and non-technical stakeholders. Problem-Solving: Analytical mindset with strong problem-solving abilities to address business challenges using data-driven approaches. Preferred Skills: Experience working with or managing internal business applications (e.g., CRMs, ERPs, or financial systems). Knowledge of data governance, data lineage, and best practices for managing enterprise data.
Familiarity with AI/ML models and advanced data analytics techniques. Exposure to cloud data platforms (e.g., AWS, Azure, GCP). Why Join the Bizapps Team at Whatfix? Opportunity to be part of a ground-up team that directly impacts internal business operations. Work on cutting-edge data solutions in a dynamic, collaborative environment. Exposure to cross-functional collaboration with senior stakeholders. Competitive compensation and opportunities for career growth within a rapidly scaling company. Note: We strive to live and breathe our Cultural Principles and encourage employees to demonstrate some of these core values: Customer First; Empathy; Transparency; Fail Fast and Scale Fast; No Hierarchies for Communication; Deep Dive and Innovate; Trust, Do It as You Own It. We are an equal opportunity employer and value diverse people because of, and not in spite of, the differences. We do not discriminate on the basis of race, religion, color, national origin, ethnicity, gender, sexual orientation, age, marital status, veteran status, or disability status.
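The sales-pipeline KPI tracking mentioned in this role's responsibilities can be sketched with plain Python; the stage names and deal records below are invented for illustration.

```python
from collections import Counter

def pipeline_kpis(deals, stages):
    """Stage-to-stage conversion KPIs for a sales pipeline dashboard.
    `deals` maps deal id -> furthest stage reached (stage names invented)."""
    reached = Counter()
    for stage in deals.values():
        # a deal at stage i has passed through every earlier stage
        for s in stages[: stages.index(stage) + 1]:
            reached[s] += 1
    kpis = {}
    for prev, nxt in zip(stages, stages[1:]):
        kpis[f"{prev}->{nxt}"] = reached[nxt] / reached[prev]
    return kpis

stages = ["lead", "demo", "proposal", "won"]
deals = {1: "won", 2: "demo", 3: "proposal", 4: "lead", 5: "demo"}
kpis = pipeline_kpis(deals, stages)
print(kpis)
```

In practice these numbers would come from a CRM export and feed a BI dashboard rather than a print statement.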

Posted 13 hours ago

10.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Job Summary: We are seeking an experienced Data Architect with expertise in Snowflake, dbt, Apache Airflow, and AWS to design, implement, and optimize scalable data solutions. The ideal candidate will play a critical role in defining data architecture, governance, and best practices while collaborating with cross-functional teams to drive data-driven decision-making. Key Responsibilities: Data Architecture & Strategy: Design and implement scalable, high-performance cloud-based data architectures on AWS. Define data modeling standards for structured and semi-structured data in Snowflake. Establish data governance, security, and compliance best practices. Data Warehousing & ETL/ELT Pipelines: Develop, maintain, and optimize Snowflake-based data warehouses. Implement dbt (Data Build Tool) for data transformation and modeling. Design and schedule data pipelines using Apache Airflow for orchestration. Cloud & Infrastructure Management: Architect and optimize data pipelines using AWS services like S3, Glue, Lambda, and Redshift. Ensure cost-effective, highly available, and scalable cloud data solutions. Collaboration & Leadership: Work closely with data engineers, analysts, and business stakeholders to align data solutions with business goals. Provide technical guidance and mentoring to the data engineering team. Performance Optimization & Monitoring: Optimize query performance and data processing within Snowflake. Implement logging, monitoring, and alerting for pipeline reliability. Required Skills & Qualifications: 10+ years of experience in data architecture, engineering, or related roles. Strong expertise in Snowflake, including data modeling, performance tuning, and security best practices. Hands-on experience with dbt for data transformations and modeling. Proficiency in Apache Airflow for workflow orchestration. Strong knowledge of AWS services (S3, Glue, Lambda, Redshift, IAM, EC2, etc.). Experience with SQL, Python, or Spark for data processing.
Familiarity with CI/CD pipelines and Infrastructure-as-Code (Terraform/CloudFormation) is a plus. Strong understanding of data governance, security, and compliance (GDPR, HIPAA, etc.). Preferred Qualifications: Certifications: AWS Certified Data Analytics – Specialty, Snowflake SnowPro Certification, or dbt Certification. Experience with streaming technologies (Kafka, Kinesis) is a plus. Knowledge of modern data stack tools (Looker, Power BI, etc.). Experience in OTT streaming would be an added advantage.
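One core idea behind the dbt-plus-Airflow stack this role names is running transformations in dependency order. The sketch below uses the stdlib's graphlib to topologically sort invented model names; it is a conceptual illustration, not Airflow's or dbt's actual API.

```python
from graphlib import TopologicalSorter

# Hypothetical model dependencies: each dbt-style model lists the models
# it selects from. The orchestrator must build every dependency before
# the model that reads it.
models = {
    "stg_orders": set(),
    "stg_customers": set(),
    "fct_orders": {"stg_orders", "stg_customers"},
    "rpt_revenue": {"fct_orders"},
}

run_order = list(TopologicalSorter(models).static_order())
print(run_order)  # staging models first, the report model last
```

Airflow and dbt both compute this kind of DAG ordering internally; independent nodes (the two staging models here) can also run in parallel.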

Posted 13 hours ago


Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies