
48 LookML Jobs

JobPe aggregates listings for easy access, but applications are submitted directly on the original job portal.

3.0 - 8.0 years

15 - 30 Lacs

mumbai, bengaluru, thiruvananthapuram

Hybrid

Job Title: Senior Business Intelligence (BI) Engineer

Role & Responsibilities:
- Design, develop, and optimize BI dashboards and reports using Looker, Power BI, and Tableau.
- Create and maintain LookML models, explores, dimensions, and measures for efficient data visualization.
- Work closely with data analysts, business stakeholders, and engineering teams to translate business needs into actionable insights.
- Implement row-level security (RLS), column-level security (CLS), and user access management within BI platforms.
- Optimize BI performance by tuning SQL queries, caching strategies, and indexing techniques.
- Manage and automate data pipelines, ETL processes, and integrations across multiple data sources such as BigQuery, SQL Server, and Snowflake.
- Leverage the Looker API for embedding visualizations and automating reporting workflows.
- Ensure data governance, quality, and compliance while maintaining documentation for BI configurations.
- Work with version control tools such as Git to manage LookML changes efficiently.
- Train and support business users on self-service analytics and BI best practices.
- Troubleshoot and resolve issues related to BI dashboards, data models, and report accuracy.

Skills & Qualifications:
- BI Tools: Expert in Looker (LookML), Power BI, Tableau, Sigma Computing.
- SQL & Databases: Strong proficiency in SQL, PL/SQL, and T-SQL, working with BigQuery, SQL Server, MySQL, Oracle, and Snowflake.
- ETL & Data Engineering: Hands-on experience with SSIS, Azure Data Factory, and IBM DataStage.
- Cloud Platforms: Experience working with Google Cloud Platform (GCP) and Microsoft Azure.
- Programming & Scripting: Proficiency in Python and shell scripting for automation and data processing.
- Data Modeling & Governance: Strong understanding of dimensional modeling, data lineage, and governance best practices.
- API & Automation: Experience integrating BI solutions with the Looker API and other automation tools.
- Project Management & Agile: Ability to gather business requirements, track project progress, and communicate updates.
- Soft Skills: Strong problem-solving, stakeholder management, and communication skills.
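The LookML modeling work this posting describes (models, explores, dimensions, and measures) generally takes a shape like the minimal sketch below. All table and field names (`analytics.orders`, `amount`, `created_at`) are hypothetical illustrations, not taken from the employer's schema.

```lookml
# orders.view.lkml -- a minimal view over a hypothetical analytics.orders table
view: orders {
  sql_table_name: analytics.orders ;;

  dimension: order_id {
    primary_key: yes
    type: number
    sql: ${TABLE}.order_id ;;
  }

  dimension: region {
    type: string
    sql: ${TABLE}.region ;;
  }

  # One timestamp column exposed as date, week, month, and year dimensions
  dimension_group: created {
    type: time
    timeframes: [date, week, month, year]
    sql: ${TABLE}.created_at ;;
  }

  measure: order_count {
    type: count
  }

  measure: total_revenue {
    type: sum
    sql: ${TABLE}.amount ;;
    value_format_name: usd
  }
}
```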

Posted 1 week ago

Apply

2.0 - 9.0 years

0 Lacs

chennai, tamil nadu

On-site

You will be responsible for developing and maintaining Looker Studio projects, building data models, creating visualizations, and ensuring the overall success of the Looker implementation. The ideal candidate should have a strong background in data analysis, SQL, and LookML, along with relevant experience in a similar role.

Your responsibilities will include developing and maintaining Looker Studio projects, such as data models, explores, and visualizations. You will collaborate with cross-functional teams to understand data needs and transform them into effective Looker features. Writing complex LookML code to model and manipulate data in Looker, creating and updating documentation for Looker projects, troubleshooting issues, and optimizing performance will also be part of your duties. Staying informed about Looker releases and new features and providing training and support to end-users are also key aspects of this role.

To qualify for this position, you should have at least 2 years of experience in a Looker Studio Developer or similar role, possess strong knowledge of SQL and data modeling, and have hands-on experience with LookML and Looker Studio development. Proficiency in data visualization and dashboarding tools, as well as strong analytical and problem-solving skills, will be essential. Excellent communication and collaboration skills, the ability to work independently and manage multiple projects concurrently, and familiarity with Agile development methodologies would be advantageous. If you meet the requirements and are looking for a challenging opportunity as a Looker Studio Pro Developer, we invite you to apply.

Posted 1 week ago

Apply

3.0 - 5.0 years

0 Lacs

bengaluru, karnataka, india

On-site

Overview of 66degrees

66degrees is a leading consulting and professional services company specializing in developing AI-focused, data-led solutions that leverage the latest advancements in cloud technology. With our unmatched engineering capabilities and vast industry experience, we help the world's leading brands transform their business challenges into opportunities and shape the future of work. At 66degrees, we believe in embracing the challenge and winning together. These values not only guide us in achieving our goals as a company but also for our people. We are dedicated to creating a significant impact for our employees by fostering a culture that sparks innovation and supports professional and personal growth along the way.

Overview of Role

We are looking for a highly motivated and experienced Data Analytics Engineer to join our team. As a Data Analytics Engineer, you will be responsible for building and implementing robust data analytics solutions using leading Business Intelligence platforms such as Looker and/or Power BI. You will collaborate with cross-functional teams to gather requirements, design scalable architectures, and deliver high-quality solutions that meet business needs.

Responsibilities
- Work with clients to enable them on leading business intelligence platforms (Looker, MicroStrategy, Power BI), teaching them how to construct an analytics ecosystem in Looker from the ground up.
- Advise clients on how to develop their analytics centers of excellence, defining and designing processes to promote a scalable, governed analytics ecosystem.
- Utilize business intelligence platforms to design and develop interactive and visually appealing dashboards and reports for end-users.
- Write clean, efficient, and scalable code (LookML, DAX, MDX, or similar).
- Conduct performance tuning and optimization of data analytics solutions to ensure efficient processing and query performance.
- Stay up to date with the latest trends and best practices in cloud data analytics, big data technologies, and data visualization tools.
- Collaborate with other teams to ensure seamless integration of data analytics solutions with existing systems and processes.
- Provide technical guidance and mentorship to junior team members, sharing knowledge and promoting best practices.

Qualifications
- 3+ years of experience as a Data Analytics Engineer or a similar role, with a focus on Looker.
- Comprehension of security capabilities in business intelligence tools and of supporting multiple personas on the platform.
- Strong problem-solving skills and the ability to translate business requirements into technical solutions.
- Excellent communication and collaboration skills with the ability to work effectively in a cross-functional team environment.
- Business Intelligence platform certifications (including the deprecated LookML Developer certification) are a plus.
- Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.

66degrees is an Equal Opportunity employer. All qualified applicants will receive consideration for employment without regard to actual or perceived race, color, religion, sex, gender, gender identity, national origin, age, weight, height, marital status, sexual orientation, veteran status, disability status or other legally protected class.
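Building "an analytics ecosystem in Looker from the ground up" typically starts with a model file that declares a database connection, includes the view files, and exposes explores with joins. A minimal sketch follows; the connection name, view names, and join keys are illustrative assumptions.

```lookml
# ecommerce.model.lkml -- hypothetical model file wiring views into an explore
connection: "analytics_warehouse"
include: "/views/*.view.lkml"

explore: orders {
  label: "Orders & Customers"

  join: customers {
    type: left_outer
    sql_on: ${orders.customer_id} = ${customers.customer_id} ;;
    relationship: many_to_one
  }
}
```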

Posted 1 week ago

Apply

3.0 - 7.0 years

0 Lacs

chennai, tamil nadu

On-site

At Logic, we specialize in resolving retail challenges by adopting a consumer-centric approach to address business issues through retail systems integration. Our unique methodology involves more than just installing and updating software; we equip our clients with the knowledge and assistance required to deliver exceptional shopping experiences that enhance convenience, efficiency, and profitability. By joining our global organization at the forefront of retail, technology, and innovation, you can leverage your skills and enthusiasm in a dynamic environment. Collaborate with a diverse team of skilled colleagues and supportive leaders who are dedicated to your success.

What you'll be responsible for:
- Developing and maintaining Looker data models (LookML) and reports based on business needs.
- Possessing a strong understanding of ETL and data warehousing and being proficient in hands-on execution.
- Improving existing dashboards for enhanced performance, user-friendliness, and visual appeal.
- Crafting and optimizing Looker dashboards and visualizations to facilitate data analysis and derive business insights.
- Leading the design, construction, and management of new Looker dashboards.
- Establishing, sustaining, and enforcing coding standards and style guides for LookML development.
- Communicating effectively with both technical and non-technical audiences.
- Providing training and assistance to end-users on utilizing visualization tools and interpreting data insights.

What we expect from you:
- A degree in Computer Engineering or a related field, or equivalent experience.
- Proficiency in LookML, ETL, and data warehousing.

Why choose to work with us:
- We prioritize your physical, mental, and financial well-being by offering competitive benefits, a flexible work environment, and vacation policy.
- We are committed to our employees' growth and development through career coaching, mentorship, and a focus on strengths, progress, and career advancement.
- We provide ample opportunities to enhance your skills across various client projects.
- We value your input and interests, offering opportunities to contribute meaningfully, from volunteering to leading internal initiatives.

Posted 1 week ago

Apply

5.0 - 10.0 years

0 Lacs

karnataka

On-site

You have 5-10 years of experience and will be based in Hyderabad. Your role involves designing, developing, and maintaining advanced Looker dashboards on GCP to ensure they are user-friendly and offer actionable insights. You will create custom visualizations using JavaScript and collaborate with stakeholders to gather requirements and translate them into technical specifications. This includes integrating data from various sources with accuracy and consistency.

Your responsibilities also include developing and optimizing LookML models for efficient data querying and visualization, and implementing data governance and data management best practices within Looker. You will monitor and optimize the performance of Looker dashboards, troubleshoot issues related to performance and data accuracy, and provide training and support to end-users.

Additionally, you will communicate complex technical concepts to non-technical stakeholders, stay updated on the latest trends in data visualization, and identify opportunities to enhance existing dashboards and develop new ones. You will manage code changes using version control, validate data using Looker's SQL Runner, and review SQL queries and data structures to maintain data accuracy and integrity within Looker.

Your role also involves creating compelling dashboards and visualizations that facilitate data-driven decision-making, and collaborating with data engineering teams to ensure smooth integration between databases and Looker. You will contribute to the development of standard operating procedures and documentation while building strong data validation processes to ensure data accuracy and integrity within Looker.

Posted 2 weeks ago

Apply

3.0 - 7.0 years

0 Lacs

pune, maharashtra

On-site

You will be responsible for developing and maintaining reporting tools, particularly Looker, as the Reporting Engineer. Knowledge of Looker is advantageous for this role. You will serve as the technical expert, engaging in day-to-day development alongside support engineers. Collaborating with various stakeholders such as data analysts, data scientists, product managers, and business stakeholders is essential to design the overall instance architecture efficiently, minimizing duplicate work, optimizing speed, and ensuring a clear, streamlined end-user experience.

Your role will involve ensuring the delivery of reliable software and data pipelines using data engineering best practices, including secure automation, version control, continuous integration/delivery, and proper testing. A strong commitment to teamwork, as well as excellent business and interpersonal skills, will be key in this position. As an integral part of the growing data engineering and analytics team, you will be responsible for shaping the technological and architectural vision.

The ideal candidate for this position should have a minimum of 3 years of practical experience in data reporting and engineering for business intelligence platforms, with a focus on Looker. A bachelor's degree in computer science, data science, engineering, physical sciences, or a related field is required. Proficiency in SQL and experience with distributed source control systems like Git in an Agile-Scrum environment are essential. Familiarity with LookML and Python is advantageous.

Furthermore, experience in BI administration tasks such as cloud setup, instance setup and upgrading, backup creation, and disaster recovery is desirable. A strong understanding of dimensional modeling and data warehousing methodologies is expected. You should excel in identifying root-cause issues related to data discrepancies and be able to optimize SQL queries to minimize the processing time required for generating insights. Staying updated with the latest data trends and simplifying data insights should be a priority.

If you have a passion for data and the valuable insights that large datasets can offer, this role is an excellent opportunity for you. Experience in retail analytics would be a plus.

Posted 2 weeks ago

Apply

5.0 - 9.0 years

0 Lacs

karnataka

On-site

As a skilled Full Stack Developer at our company, you will be an integral part of the Product Engineering team. Your main responsibilities will include designing and developing robust applications, working closely with teams across different locations to deliver high-quality solutions within specified timelines. It is essential to keep yourself updated with the latest technologies and design principles to excel in this role.

Your day-to-day tasks will involve designing and developing technical solutions based on specific requirements, building and maintaining enterprise-grade SaaS software using Agile methodologies, contributing to performance tuning and optimization efforts, creating and executing unit tests for product components, participating in peer code reviews, and ensuring high quality, scalability, and timely project completion. You will primarily be working with technologies such as Golang/Core Java, J2EE, Struts, Spring, client-side scripting, Hibernate, and various databases to build scalable core-Java applications, web applications, and web services.

To be successful in this role, you should have a Bachelor's degree in Engineering, Computer Science, or equivalent experience, a solid understanding of data structures, algorithms, and their applications, hands-on experience with Looker APIs, dashboards, and LookML, strong problem-solving skills, and analytical reasoning. You must also have experience in building microservices with Golang/Spring Boot (Spring Cloud, Spring Data), developing and consuming REST APIs, profiling applications, and working with front-end frameworks like Angular or Vue, and be proficient in the Software Development Life Cycle (SDLC).

Additionally, familiarity with basic SQL queries and experience with Java Spring Boot, Kafka, SQL, Linux, Apache, and Redis is required. Experience with AWS cloud technologies (Go, Python, MongoDB, Postgres, ClickHouse) will be considered a plus. Excellent written and verbal communication skills are also essential for this role.

Posted 2 weeks ago

Apply

3.0 - 7.0 years

0 Lacs

pune, maharashtra

On-site

The role of Reporting Engineer - MSTR, based in Pune on a contract basis, involves the development and maintenance of reporting tools, primarily focusing on MicroStrategy. Familiarity with Looker is considered advantageous. As a technical expert, you will engage in day-to-day development activities alongside support engineers. Your responsibilities will include assisting in designing the overall instance architecture to enhance efficiency, reduce redundant work, and ensure a seamless end-user experience. Collaboration with various stakeholders such as data analysts, data scientists, product managers, and business representatives is essential.

Ensuring the delivery of reliable software and data pipelines utilizing best practices in data engineering, including secure automation, version control, continuous integration/delivery, and thorough testing, is a key aspect of the role. Strong teamwork skills, as well as effective business and interpersonal abilities, are crucial requirements. As an integral part of the expanding data engineering and analytics team, you will play a pivotal role in shaping our technological and architectural direction.

The ideal candidate for this position should possess a minimum of 3 years of practical experience in data reporting and engineering within business intelligence platforms, particularly MicroStrategy. A bachelor's degree in computer science, data science, engineering, physical sciences, or a related field is expected. Proficiency in SQL and distributed source control systems like Git in an Agile-Scrum environment is required. Experience with LookML and Python is a bonus.

Familiarity with BI administration tasks such as cloud setup, instance configuration and updates, backup management, and disaster recovery is preferred. A solid understanding of dimensional modeling and data warehousing methodologies is essential. The ability to identify root-cause issues related to data discrepancies and optimize SQL queries, directly or through BI-tool generation, to enhance processing efficiency for insights is necessary. Keeping abreast of the latest data trends and seeking ways to simplify data insights is encouraged.

A genuine passion for data and the valuable insights that extensive datasets can offer is highly valued. Prior experience in retail analytics is considered advantageous.

Posted 2 weeks ago

Apply

10.0 - 12.0 years

0 Lacs

bengaluru, karnataka, india

On-site

Headquartered in Dublin, Ohio, Cardinal Health, Inc. (NYSE: CAH) is a global, integrated healthcare services and products company connecting patients, providers, payers, pharmacists and manufacturers for integrated care coordination and better patient management. Backed by nearly 100 years of experience, with more than 50,000 employees in nearly 60 countries, Cardinal Health ranks among the top 20 on the Fortune 500.

Department Overview
Advanced Analytics and Automation builds automation, analytics and artificial intelligence solutions that drive success for Cardinal Health by creating material savings, efficiencies and revenue growth opportunities. The team drives business innovation by leveraging emerging technologies and turning them into differentiating business capabilities. This role will be responsible for data modeling, data engineering, and stack development.

Key Responsibilities:
- Develop and maintain data engineering solutions and pipelines.
- Identify and implement automation opportunities and be able to conduct data discovery.
- Collaborate with other team members to ensure data quality and consistency.
- Provide technical guidance and support to other team members.
- Participate and contribute in technical platform strategy as tools, products, and business needs evolve.
- Define and execute database and data movement standards, design reviews, and database implementation.
- Analyze, re-architect and re-platform legacy database objects to GCP BigQuery.
- Define how our data analytics capabilities will apply to business needs and result in dependable business solutions.
- Ensure technical specifications are aligned with both business needs and technical design standards.
- Generate ideas and suggestions for process and technical improvements for platforms and processes supported by the team.
- Ensure platforms and tools meet or exceed data security standards, including internal and external audits performed.
- Develop best practices for solution and tool frameworks, leveraging standard naming conventions, scripting, and coding practices to ensure consistency of data solutions.

Required Skills:
- Overall experience of 10+ years and at least 2+ years of experience as a technical architect/solution lead.
- Strong programming skills in Python, with a deep understanding of multi-process architecture and the threading limitations of Python.
- Proficiency with SQL and relational databases.
- Experience building and consuming APIs to work with other services using REST.
- API gateway integration: understand how to integrate and manage APIs through API gateways (Apigee preferred).
- RESTful API principles: understanding of REST architecture, HTTP methods, status codes, and response formats (JSON, XML).
- Knowledge of tools like Swagger or OpenAPI for creating and maintaining API documentation.
- Authentication and security experience such as OAuth2, JWT, and handling API keys from various LOB (line of business) systems.
- Understanding of product lifecycle development and a clear understanding of automation platform technologies.
- Ability to integrate multiple data sources into a single system.
- Experience with data platforms including GCP BigQuery.
- Expert knowledge and experience building database objects such as SQL, stored procedures, and views.
- Hands-on experience building a semantic layer on consumable data using AtScale and/or LookML.
- Ability to technically lead and mentor development team members.
- Expert-level knowledge of database partitioning, indexes, hints, etc.
- Experience with business-critical applications.
- Experience on large-scale implementation programs preferred.
- Excellent written and oral communication skills.
- Agile development skills and experience.
- Experience with build and maintenance DevSecOps practices (CI/CD pipelines, Airflow, GitHub, etc.).
- Proficiency with SQL and relational databases such as PostgreSQL.

Candidates who are back-to-work, people with disabilities, without a college degree, and Veterans are encouraged to apply. Cardinal Health supports an inclusive workplace that values diversity of thought, experience and background. We celebrate the power of our differences to create better solutions for our customers by ensuring employees can be their authentic selves each day. Cardinal Health is an Equal Opportunity/Affirmative Action employer. All qualified applicants will receive consideration for employment without regard to race, religion, color, national origin, ancestry, age, physical or mental disability, sex, sexual orientation, gender identity/expression, pregnancy, veteran status, marital status, creed, status with regard to public assistance, genetic status or any other status protected by federal, state or local law.

Posted 2 weeks ago

Apply

2.0 - 6.0 years

0 Lacs

noida, uttar pradesh

On-site

As a Looker Data Engineer/Architect at myKaarma, you will be instrumental in shaping and scaling our modern data lake architecture built around BigQuery and Looker. Your role will involve designing and implementing Looker Views, Explores, and Dashboards, collaborating closely with data stakeholders to provide accurate and business-relevant insights. You will play a crucial part in transitioning our existing data architecture into LookML, driving modeling and visualization best practices within the organization, and identifying areas for improvement in our data lake models. Furthermore, this position offers you the opportunity to integrate AI/ML into our data lake, enabling you to deliver intelligent insights and recommendations to both internal and external customers.

Your responsibilities will include designing and developing LookML models, views, and explores based on our legacy data warehouse, creating and maintaining high-quality dashboards and visualizations in Looker, and collaborating with cross-functional teams to translate requirements into scalable data models. You will be expected to guide other team members on building and maintaining Looker dashboards, ensuring data accuracy and performance across Looker and BigQuery resources, and proactively enhancing the Looker platform's structure and usability.

To excel in this role, you should possess at least 5 years of experience in data engineering and a minimum of 2 years of hands-on experience with Looker, including LookML modeling and dashboard development. Strong expertise in Google BigQuery, project management in Google Cloud, implementing data governance in Looker, and managing end-to-end Looker projects independently are essential qualifications. Proficiency in SQL, familiarity with source control systems like git, excellent communication skills, and the ability to thrive in a fast-paced, collaborative environment are desired attributes. While familiarity with batch processing, stream processing, real-time analytics, and MySQL queries is advantageous, individuals with diverse experiences and backgrounds are encouraged to apply even if they do not meet all listed qualifications.

At myKaarma, we value inclusivity and diversity, fostering a workplace where all employees are empowered to contribute meaningfully and feel valued. Our commitment to competitive compensation and comprehensive benefits underscores our dedication to supporting both personal and professional well-being, including a flexible work environment, health and wellness benefits, generous time off, and in-office perks. If you require reasonable accommodations during the application or interview process due to a disability, please let us know. Join us at myKaarma, where your unique perspective can drive innovation and success in delivering exceptional solutions to our clients.
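When transitioning an existing warehouse model into LookML, one common pattern is to generate a base view from the legacy table and layer curated fields on top of it with `extends`, rather than duplicating the definition. The sketch below is purely illustrative; the view and column names are assumptions, not myKaarma's actual schema.

```lookml
# Base view generated from a hypothetical legacy warehouse table
view: legacy_sales {
  sql_table_name: warehouse.sales_fact ;;

  dimension: sale_id {
    primary_key: yes
    type: number
    sql: ${TABLE}.sale_id ;;
  }

  measure: gross_amount {
    type: sum
    sql: ${TABLE}.amount ;;
  }
}

# Curated view that inherits everything above and adds business-friendly fields
view: sales {
  extends: [legacy_sales]

  measure: average_sale {
    type: average
    sql: ${TABLE}.amount ;;
  }
}
```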

Posted 2 weeks ago

Apply

2.0 - 6.0 years

0 Lacs

hyderabad, telangana

On-site

As a Looker BI Developer at iO Associates in Hyderabad, you will play a key role in transforming raw data into powerful insights for ecommerce businesses. You will be part of a fast-growing analytics firm that focuses on helping clients make data-driven decisions. Your primary responsibility will be to design scalable Looker models and dashboards, develop LookML structures for accurate data representation, and create custom reports and visualizations. Additionally, you will optimize multi-source data for analysis, ensure data security and compliance, identify and resolve performance issues, and maintain technical documentation.

To excel in this role, you should have a degree in Computer Science, Information Systems, or a related field. You must possess expertise in Looker, including data modeling, LookML, and dashboard creation. Strong SQL skills with experience in query optimization are essential, along with knowledge of data governance and security best practices. Experience with Tableau and Power BI will be an added advantage.

If you are ready to make an impact and contribute to building innovative analytics solutions, we invite you to connect with us by sending your CV. Join us in creating something great together! (Note: This job is a 6-month contract with a hybrid work model of 3 days onsite in Hyderabad.)

Posted 2 weeks ago

Apply

5.0 - 10.0 years

15 - 27 Lacs

hyderabad, chennai, bengaluru

Hybrid

Job Summary
We are seeking a highly skilled and experienced BI Developer with a strong background in Looker. The ideal candidate will have 4.5 to 10 years of total IT experience in the Business Intelligence space, with a minimum of 3 years of hands-on experience with Looker and a few years with Power BI or other reporting tools such as Qlik or Tableau.

Key Responsibilities
- Design, develop, and deploy BI solutions using Power BI.
- Collaborate with business stakeholders to gather requirements and translate them into effective dashboards and reports.
- Develop and optimize LookML models, explores and dashboards in Looker.
- Create complex DAX queries, data models and interactive dashboards in Power BI.
- Integrate data from multiple sources, ensuring data quality and consistency.
- Provide expertise in data visualization best practices and storytelling techniques.
- Perform data analysis and deliver actionable insights to business teams.
- Stay updated with the latest BI trends, tools and best practices.
- Be flexible to work as an individual contributor.

Required Skills & Qualifications
- Total IT Experience: 4.5 years.
- Looker: 3 years of hands-on experience (LookML, dashboarding, data modelling).
- Power BI: 3 years of hands-on experience (DAX, Power Query, data modelling).
- Strong SQL skills and experience working with relational databases.
- Proficiency in data integration, ETL processes and data warehousing concepts.
- Excellent analytical, problem-solving and communication skills.

Good to Have Qualifications
- Experience with cloud data platforms such as GCP/Azure.
- Knowledge of other BI tools (e.g., Tableau, Qlik Sense/QlikView) is an advantage.
- Experience in technical consulting is considered an added advantage.
- BI certifications in Looker (e.g., Looker Business Analyst, LookML Developer) or Power BI (e.g., PL-300).

Posted 3 weeks ago

Apply

6.0 - 11.0 years

7 - 16 Lacs

bengaluru

Work from Office

Job Title: BI Solution Developer
Function: Looker, SQL, Snowflake, Data Analysis, Data Warehousing, DevOps, Agile
Location: Bangalore
Employment Type: FTE

Key Responsibilities (Looker Development & Administration)
- Must have strong SQL skills.
- Design and develop LookML models, explores, dashboards, and workflows in the Looker platform.
- Implement persistent derived tables (PDTs), caching policies, and version control, and perform code quality assessments.
- Utilize SQL Runner for data validation and ensure accuracy of reports and dashboards.
- Maintain and administer Looker as the primary BI tool and promote its effective use across the organization.
- Investigate and optimize query performance for an enhanced user experience.
- Partner with stakeholders to design and adopt data integrity processes, self-service analytics frameworks, and strong data governance practices.
- Build strong, KPI-focused relationships with product, engineering, and business teams.
- Guide and mentor peers on BI tools, data analytics processes, and project best practices.
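Persistent derived tables (PDTs) and caching policies, named in the responsibilities above, are usually expressed in LookML with a datagroup and a `datagroup_trigger`. This is a hedged sketch with hypothetical table, column, and trigger names, not the employer's actual configuration.

```lookml
# In the model file: a caching policy that rebuilds when the ETL log advances,
# with a 24-hour ceiling on cache age
datagroup: nightly_etl {
  sql_trigger: SELECT MAX(finished_at) FROM etl.load_log ;;
  max_cache_age: "24 hours"
}

# A persistent derived table rebuilt on the same schedule
view: customer_order_facts {
  derived_table: {
    sql:
      SELECT customer_id,
             COUNT(*)    AS lifetime_orders,
             SUM(amount) AS lifetime_revenue
      FROM analytics.orders
      GROUP BY customer_id ;;
    datagroup_trigger: nightly_etl
  }

  dimension: customer_id {
    primary_key: yes
    type: number
    sql: ${TABLE}.customer_id ;;
  }

  dimension: lifetime_orders {
    type: number
    sql: ${TABLE}.lifetime_orders ;;
  }

  measure: total_lifetime_revenue {
    type: sum
    sql: ${TABLE}.lifetime_revenue ;;
  }
}
```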

Posted 3 weeks ago

Apply

9.0 - 13.0 years

30 - 45 Lacs

bengaluru

Remote

Lead Data Engineer - What You Will Do:
As a PR3 Lead Data Engineer, you will be instrumental in driving our data strategy, ensuring data quality, and leading the technical execution of a small, impactful team. Your responsibilities will include:

Team Leadership:
- Establish the strategic vision for the evolution of our data products and our technology solutions, then provide technical leadership and guidance for a small team of Data Engineers in executing the roadmap.
- Champion and enforce best practices for data quality, governance, and architecture within your team's work.
- Embody a product mindset over the team's data.
- Oversee the team's use of Agile methodologies (e.g., Scrum, Kanban), ensuring smooth and predictable delivery, and overtly focusing on continuous improvement.

Data Expertise & Domain Knowledge:
- Actively seek out, propose, and implement cutting-edge approaches to data transfer, transformation, analytics, and data warehousing to drive innovation.
- Design and implement scalable, robust, and high-quality ETL processes to support growing business demand for information, delivering data as a reliable service that directly influences decision making.
- Develop a profound understanding and "feel" for the business meaning, lineage, and context of each data field within our domain.

Communication & Stakeholder Partnership:
- Collaborate with other engineering teams and business partners, proactively managing dependencies and holding them accountable for their contributions to ensure successful project delivery.
- Actively engage with data consumers to achieve a deep understanding of their specific data usage, pain points, and current gaps, then plan initiatives to implement improvements collaboratively.
- Clearly articulate project goals, technical strategies, progress, challenges, and business value to both technical and non-technical audiences.
- Produce clear, concise, and comprehensive documentation.

Your Qualifications:
At Vista, we value the experience and potential that individual team members add to our culture. Please don't hesitate to apply even if you don't meet the exact qualifications; we look forward to learning more about you!
- Bachelor's or Master's degree in computer science, data engineering, or a related field.
- 10+ years of professional experience, with at least 6 years of hands-on Data Engineering, specifically in e-commerce or direct-to-consumer, and 4 years of team leadership.
- Demonstrated experience in leading a team of data engineers, providing technical guidance, and coordinating project execution.
- Stakeholder management experience and excellent communication skills.
- Strong knowledge of SQL and data warehousing concepts is a must.
- Strong knowledge of data modeling concepts and hands-on experience designing complex multi-dimension data models.
- Strong hands-on experience designing and managing scalable ETL pipelines in cloud environments with large-volume datasets (both structured and unstructured data).
- Proficiency with cloud services in AWS (preferred), including S3, EMR, RDS, Step Functions, Fargate, Glue, etc.
- Critical hands-on experience with cloud-based data platforms (Snowflake strongly preferred).
- Data visualization experience with reporting and data tools (preferably Looker with LookML skills).
- Coding mastery in at least one modern programming language: Python (strongly preferred), Java, Golang, PySpark, etc.
- Strong knowledge of production standards such as versioning, CI/CD, data quality, documentation, automation, etc.
- Problem-solving and multi-tasking ability in a fast-paced, globally distributed environment.

Nice To Have:
- Experience with API development on enterprise platforms, with GraphQL APIs being a clear plus.
- Hands-on experience designing DBT data pipelines.
- Knowledge of finance, accounting, supply chain, logistics, operations, or procurement data is a plus.
- Experience managing work in Jira and writing documentation in Confluence.
- Proficiency in AWS account management, including IAM, infrastructure, and monitoring for health, security and cost optimization.
- Experience with Gen AI/ML tools for enhancing data pipelines or automating analysis.

Why You'll Love Working Here
There is a lot to love about working at Vista. We are an award-winning Remote-First company. We're an inclusive community. We're growing (which means you can too). And to help orient us all in the same direction, we have our Vista Behaviors, which exemplify the behavioral attributes that make us a culturally strong and high-performing team.

Our Team: Enterprise Business Solutions
Vista's Enterprise Business Solutions (EBS) domain is working to make our company one of the most data-driven organizations to support Finance, Supply Chain, and HR functions. The cross-functional team includes product owners, analysts, technologists, data engineers and more, all focused on providing Vista with cutting-edge tools and data we can use to deliver jaw-dropping customer value. EBS team members are empowered to learn new skills, communicate openly, and be active problem-solvers.

Join our EBS Domain as a Lead Data Engineer! This lead-level role will be responsible for the work of a small team of data engineers, focusing not only on implementations but also on operations and support. The Lead Data Engineer will implement best practices, data standards, and reporting tools, and will oversee and manage the work of other data engineers while also being an individual contributor. This role has a lot of opportunity to impact general ETL development and implementation of new solutions. We will look to the Lead Data Engineer to modernize data technology solutions in EBS, including the opportunity to work on modern warehousing, finance, and HR datasets and integration technologies. This role requires an in-depth understanding of cloud data integration tools and cloud data warehousing, with a strong and pronounced ability to lead and execute initiatives to tangible results.

Posted 4 weeks ago

Apply

8.0 - 13.0 years

20 - 35 Lacs

hyderabad

Remote

Job Title: Looker Architect

Note: We are looking for someone who can run discovery workshops to understand current Domo models and define the scope for converting to Looker.

Job Description:
We are seeking a Looker Architect with proven expertise in running discovery workshops, analyzing existing Domo BI models, and defining the scope for migration to Looker. The ideal candidate will be responsible for understanding the current BI landscape, creating optimized Looker data models, and leading the migration roadmap.

Responsibilities:
- Conduct workshops with stakeholders to analyze existing Domo models.
- Define scope, strategy, and roadmap for the Domo-to-Looker migration.
- Design, develop, and optimize Looker models (LookML, Explores, Dashboards).
- Collaborate with data engineering teams to ensure robust integration with the cloud DWH.
- Ensure best practices in BI architecture, governance, and performance.
- Provide guidance to developers and business users on Looker adoption.

Key Skills:
- Strong expertise in Looker (LookML, Explores, Dashboards).
- Experience in BI migration projects (Domo to Looker highly preferred).
- Strong SQL and data modeling expertise.
- Cloud Data Warehouse experience (Snowflake / BigQuery / Redshift).
- Workshop facilitation and stakeholder management.

Posted 4 weeks ago

Apply

5.0 - 9.0 years

0 Lacs

karnataka

On-site

You are a skilled Looker Developer with 5-7 years of experience, proficient in LookML, BigQuery, and ReactJS. Your role involves developing and optimizing Looker dashboards, models, and explores using LookML, building custom Looker experiences and embedded analytics with ReactJS, designing and maintaining ETL/ELT pipelines on Google Cloud Platform (GCP), writing complex SQL queries for data transformation and reporting, collaborating with stakeholders to convert business needs into data solutions, and visualizing insights using Looker. You must have strong SQL and data modeling skills, experience with ReactJS for custom data apps or Looker extensions, familiarity with GCP tools, and excellent analytical and communication skills.

If you are looking for a challenging opportunity to work on end-to-end BI solutions and deliver scalable and insightful analytics, HCLTech is hiring for the position of Looker Engineer with React.JS experience. The job location is Pan India (Noida/BLR/Pune/HYD/CHN) with a notice period of 30 days or less.

Interested candidates can apply by sending their CV to javeriya_kewal@hcltech.com with the following details:
- Name
- Contact
- Mail id
- Current company
- Total Experience
- Relevant Experience
- Current CTC
- Expected CTC
- Current Location
- Preferred Location
- Notice period
- Offers in hand, if any

Posted 1 month ago

Apply

5.0 - 9.0 years

0 Lacs

karnataka

On-site

You are a skilled Full Stack Developer who will be joining our Product Engineering team. In this role, you will be responsible for designing and developing robust applications, collaborating with multi-location teams, and delivering high-quality solutions within agreed timelines. It is essential to stay current with new technologies and design principles to succeed in this position.

Your responsibilities will include designing and developing technical solutions based on requirements, building and maintaining enterprise-grade SaaS software using Agile methodologies, contributing to performance tuning and optimization efforts, executing comprehensive unit tests for product components, participating in peer code reviews, and championing high quality, scalability, and timely project completion. You will be utilizing technologies such as Golang/Core Java, J2EE, Struts, Spring, client-side scripting, Hibernate, and various databases to build scalable core-Java applications, web applications, and web services.

To qualify for this role, you should have a Bachelor's degree in Engineering, Computer Science, or equivalent experience. Additionally, you must possess a solid understanding of data structures, algorithms, and their applications, hands-on experience with Looker APIs, dashboards, and LookML, strong problem-solving skills, and analytical reasoning. Experience in building microservices with Golang/Spring Boot, developing and consuming REST APIs, profiling applications, using at least one front-end framework (e.g., Angular or Vue), and familiarity with basic SQL queries is required. Excellent written and verbal communication and presentation skills, a good understanding of the Software Development Life Cycle (SDLC), and proven software development experience with Java Spring Boot, Kafka, SQL, Linux, Apache, and Redis are essential. Experience with AWS cloud technologies (Go, Python, MongoDB, Postgres, ClickHouse) would be a plus.

Posted 1 month ago

Apply

3.0 - 7.0 years

0 Lacs

karnataka

On-site

Are you prepared to make a significant impact on finance data transformation at J.P. Morgan? Join our dynamic team in Bengaluru, where your skills will drive meaningful solutions and influence the future of finance analytics. We provide exceptional opportunities for career advancement, a collaborative atmosphere, and exposure to cutting-edge technology.

As an Analytics Solutions Associate at JP Morgan Chase in the Chase UK team, you will have a crucial role in our finance data transformation initiative. Your responsibilities will involve collaborating with finance stakeholders to create a finance data platform that supports critical business control functions and decision-making processes. Using modern data tools, you will deliver strategic solutions and insights, contributing to our global footprint.

You will:
- Collaborate with finance stakeholders to comprehend business requirements and transform them into data needs and BI solutions.
- Design, develop, and manage interactive dashboards and reports utilizing Google Looker and LookML.
- Utilize SQL for data querying and analysis to ensure accuracy and integrity.
- Work alongside data engineers and modelers to create necessary data models.
- Present intricate data in visually appealing formats to facilitate communication.
- Brainstorm and refine solutions in collaboration with product managers and end users.
- Maintain compliance with BI solutions governance protocols.

Qualifications, Skills, and Requirements:
- Minimum of 3 years of experience using business intelligence platforms.
- Proficiency in Looker and LookML.
- Bachelor's degree or equivalent in data analytics, computer science, accounting, or finance.
- Experience in large-scale digital transformation projects, preferably in Finance/FinTech.
- Solid understanding of SQL and data modeling.
- Expertise in data analytics and visualization, with a strong grasp of UX/UI design.
- Exceptional written and verbal communication skills.
- High self-motivation and a proactive approach.

Preferred Qualifications, Skills, and Capabilities:
- Business acumen in retail banking products like current accounts, credit cards, insurance, and mortgages.
- Enthusiasm for innovation and problem-solving, emphasizing user-centric design.
- Ability to work effectively in cross-functional agile teams.

#ICBCareer
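The dashboards-and-reports work described in this role can also be versioned as code using LookML dashboard files. The snippet below is a minimal, hypothetical example; the model, explore, and field names are assumptions and not tied to this employer's data.

```lookml
# finance_overview.dashboard.lookml -- a dashboard defined as code
- dashboard: finance_overview
  title: Finance Overview
  layout: newspaper
  elements:
  - name: revenue_by_month
    title: Revenue by Month
    model: finance
    explore: transactions
    type: looker_column
    fields: [transactions.posted_month, transactions.total_amount]
    sorts: [transactions.posted_month]
    row: 0
    col: 0
    width: 12
    height: 6
```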

Posted 1 month ago

Apply

5.0 - 9.0 years

0 Lacs

karnataka

On-site

As a Looker Engineer with React.JS experience at HCLTech, your key responsibilities will include developing and optimizing Looker dashboards, models, and explores using LookML, building custom Looker experiences and embedded analytics using ReactJS, designing and maintaining ETL/ELT pipelines on Google Cloud Platform (GCP), particularly BigQuery, writing complex SQL queries for data transformation and reporting, collaborating with stakeholders to translate business requirements into data solutions, and visualizing insights using Looker.

To succeed in this role, you must have proficiency in Looker (LookML) and BigQuery, strong SQL and data modeling skills, experience with ReactJS for custom data applications or Looker extensions, familiarity with GCP tools, and excellent analytical and communication skills.

If you have the required skills and experience and are interested in joining our team, please send your CV to javeriya_kewal@hcltech.com. Kindly include the following details in your email:
- Name
- Contact
- Mail id
- Current company
- Total Experience
- Relevant Experience
- Current CTC
- Expected CTC
- Current Location
- Preferred Location
- Notice period
- Any current offer in hand

We look forward to hearing from you and potentially welcoming you to our team at HCLTech.

Posted 1 month ago

Apply

3.0 - 7.0 years

0 Lacs

karnataka

On-site

Dear Candidate,

HCL Tech is currently seeking a Looker Developer for PAN India. We are looking for individuals who can join immediately or within 10 to 15 days and who are a strong match with the job description provided below.

As a Looker Developer, your main responsibility will be to design, develop, and maintain Looker dashboards and reports to fulfill business intelligence requirements. In this role, you will closely collaborate with data engineers, analysts, and business stakeholders to ensure data accuracy and deliver valuable insights.

Key Responsibilities:
- Dashboard Development: Create and manage Looker dashboards and reports according to business needs.
- Data Modeling: Develop and enhance LookML models for efficient data retrieval and visualization.
- Collaboration: Engage with business stakeholders to comprehend their data requirements and translate them into technical solutions.
- Performance Optimization: Optimize Looker queries and performance for streamlined data processing.
- Documentation: Document data models, ETL processes, and Looker configurations.
- Troubleshooting: Identify and resolve data quality issues and report accuracy discrepancies.

Required Skills:
- Proficiency in Looker: Demonstrated experience in developing Looker dashboards and reports.
- LookML Expertise: Advanced skills in LookML for data modeling and visualization.
- SQL: Strong SQL skills for data querying and manipulation.
- ETL Tools: Familiarity with ETL tools and processes.
- Communication: Excellent communication skills to interact effectively with stakeholders and team members.
- Agile Methodologies: Knowledge of agile development practices.

Preferred Qualifications:
- Total Experience: Minimum 6 years of overall experience.
- Experience: 3-4 years of specific experience in Looker development.
- Education: Bachelor's degree in Computer Science, Information Technology, or a related field.
- Certifications: Looker certifications are advantageous.
- Additional Tools: Experience with other BI tools such as Tableau or Power BI is a plus.

Kindly apply if you meet the job requirements and are enthusiastic about contributing to our team. Thank you for your interest.

Posted 1 month ago

Apply

6.0 - 8.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Dear Candidate,

HCL Tech is hiring a Looker Developer for PAN India. Kindly share your application if your profile is a good match with the JD. Please apply only if you can join immediately or within 10 to 15 days.

Job Title: Looker Developer

Job Description:
As a Looker Developer, you will be responsible for designing, developing, and maintaining Looker dashboards and reports to support business intelligence needs. You will work closely with data engineers, analysts, and business stakeholders to ensure data accuracy and deliver actionable insights.

Key Responsibilities:
- Dashboard Development: Create and maintain Looker dashboards and reports to meet business requirements.
- Data Modeling: Develop and optimize LookML models to ensure efficient data retrieval and visualization.
- Collaboration: Work with business stakeholders to understand their data needs and translate them into technical solutions.
- Performance Optimization: Optimize Looker queries and performance to ensure efficient data processing.
- Documentation: Document data models, ETL processes, and Looker configurations.
- Troubleshooting: Identify and resolve issues related to data quality and report accuracy.

Required Skills:
- Proficiency in Looker: Strong experience in developing Looker dashboards and reports.
- LookML Expertise: Advanced skills in LookML for data modeling and visualization.
- SQL: Advanced SQL skills for querying and manipulating data.
- ETL Tools: Experience with ETL tools and processes.
- Communication: Excellent communication skills to interact with stakeholders and team members.
- Agile Methodologies: Familiarity with agile development practices.

Preferred Qualifications:
- Overall experience: 6+ years.
- Experience: 3-4 years of experience in Looker development.
- Education: Bachelor's degree in Computer Science, Information Technology, or a related field.
- Certifications: Looker certifications are a plus.
- Additional Tools: Experience with other BI tools like Tableau or Power BI is beneficial.

Posted 1 month ago

Apply

3.0 - 7.0 years

0 Lacs

pune, maharashtra

On-site

You should have strong expertise in Looker and LookML along with advanced SQL skills, including experience in query optimization. Proficiency in Data Warehouse (DWH) concepts and BigQuery is also required, and excellent communication and team leadership abilities will be essential for effective collaboration within the team. You should hold a Bachelor's degree in Engineering and have strong communication skills to excel in this role.

Posted 1 month ago

Apply

3.0 - 7.0 years

0 Lacs

maharashtra

On-site

You will be joining our team as a Looker Enterprise Dashboarding Specialist. Your main responsibility will be to design, develop, and optimize Looker dashboards to extract actionable insights from complex datasets. To excel in this role, you should have a solid understanding of LookML, data modeling, SQL, and data visualization best practices. You will collaborate with data analysts, engineers, and business stakeholders to create impactful reports and dashboards.

Your key responsibilities will include designing, developing, and maintaining Looker dashboards and reports to support business decision-making. You will also be tasked with building and optimizing LookML models, explores, and views to ensure efficient data querying. Collaborating with data engineering teams to enhance data pipelines and model performance will be essential. Working closely with business stakeholders to comprehend reporting needs and convert them into scalable Looker solutions is also a crucial part of your role. Implementing best practices for data visualization to ensure clear and effective storytelling will be a key aspect. Furthermore, optimizing dashboard performance, developing and maintaining data governance standards for Looker usage, and conducting training sessions for internal teams to enhance self-service analytics adoption will fall under your responsibilities. Staying abreast of Looker updates, new features, and industry best practices is also expected.

To qualify for this position, you should have 3-5 years of experience in data visualization, business intelligence, or analytics. Strong expertise in Looker, LookML, and SQL is a must. Experience in data modeling, familiarity with BigQuery or other cloud data warehouses, an understanding of data governance, security, and role-based access control in Looker, the ability to optimize dashboards for performance and usability, strong problem-solving and analytical skills with attention to detail, and excellent communication and stakeholder management skills are necessary.

Preferred qualifications include experience working with ETL pipelines and data transformation processes, familiarity with Python or other scripting languages for data automation, exposure to Google Cloud Platform (GCP) and data engineering concepts, and certifications in Looker, Google Cloud, or related BI tools.
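Data governance and role-based access control in Looker, mentioned in the qualifications above, are typically enforced with user attributes via `access_filter` and `access_grant`. A sketch under assumed attribute, explore, and field names follows; none of them come from this posting.

```lookml
# In the model file: gate a sensitive explore behind an access grant
access_grant: can_view_finance {
  user_attribute: department
  allowed_values: ["finance", "executive"]
}

explore: invoices {
  required_access_grants: [can_view_finance]

  # Row-level security: each user only sees rows matching their own region attribute
  access_filter: {
    field: invoices.region
    user_attribute: region
  }
}
```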

Posted 1 month ago

Apply

7.0 - 11.0 years

0 Lacs

karnataka

On-site

You will be responsible for designing, developing, and optimizing interactive dashboards using Looker and LookML. This includes building LookML models, explores, and derived tables to meet business intelligence needs. You will create efficient data models and queries using BigQuery and collaborate with data engineers, analysts, and business teams to translate requirements into actionable insights. Implementing security and governance policies within Looker to ensure data integrity and controlled access will also be part of your role. Additionally, you will leverage GCP services to build scalable and reliable data solutions and optimize dashboard performance using best practices in aggregation and visualization.

Maintaining, auditing, and enhancing existing Looker dashboards, reports, and LookML assets, as well as documenting dashboards, data sources, and processes for scalability and ease of maintenance, are critical tasks. You will also support legacy implementations and facilitate smooth transitions, build new dashboards and visualizations based on evolving business requirements, and work closely with data engineering teams to define and validate data pipelines for timely and accurate data delivery.

To qualify for this role, you should have at least 6 years of experience in data visualization and BI, particularly using Looker and LookML. Strong SQL skills with experience optimizing queries for BigQuery are required, along with proficiency in Google Cloud Platform (GCP) and related data services. An in-depth understanding of data modeling, ETL processes, and database structures is essential, as well as familiarity with data governance, security, and role-based access in Looker. Experience with BI lifecycle management, strong communication and collaboration skills, good storytelling and user-centric design abilities, and exposure to the media industry (OTT, DTH, Web) handling large datasets are also necessary. Knowledge of other BI tools like Tableau, Power BI, or Data Studio is a plus, and experience with Python or other scripting languages for automation and data transformation is desirable. Exposure to machine learning or predictive analytics is considered an advantage.
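One concrete form of the "best practices in aggregation" mentioned above is Looker's aggregate awareness, where a pre-built rollup answers coarse-grained queries instead of scanning the raw BigQuery table. The explore, field, and datagroup names below are hypothetical, and the sketch assumes a `nightly_etl` datagroup is already defined in the model.

```lookml
explore: events {
  # Looker transparently routes queries to this rollup when they only need these fields
  aggregate_table: daily_event_rollup {
    query: {
      dimensions: [events.event_date, events.event_type]
      measures: [events.event_count]
    }
    materialization: {
      datagroup_trigger: nightly_etl
    }
  }
}
```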

Posted 1 month ago

Apply

2.0 - 6.0 years

0 Lacs

chennai, tamil nadu

On-site

As a Looker Platform Developer, you will play a crucial role in our team by transforming data into valuable insights. Your primary responsibility will be to develop and enhance LookML models, dashboards, and workflows on the Looker platform. Collaborating closely with data engineers and analysts, you will ensure that our data is visualized accurately and effectively.

Your key responsibilities will include creating LookML models to address business reporting and analytics requirements, constructing Looker dashboards and workflows to provide actionable insights, and implementing best practices for performance optimization, such as caching and data modeling. It will also be essential for you to manage code changes using version control, validate data through Looker's SQL Runner, and engage in collaborative efforts with various teams to comprehend their needs.

To qualify for this role, you should possess a Bachelor's degree in Computer Science or a related field, or have equivalent experience. Previous experience as a Looker Platform Developer or in a similar position is highly desirable. Additionally, you must demonstrate strong SQL and data modeling skills, familiarity with version control systems like Git, and excellent problem-solving and communication abilities. Your proficiency in workflows, LookML, SQL Runner, data validation, data modeling, version control, and communication will be essential in excelling as a Looker Platform Developer within our team.
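LookML modeling and workflows of the kind this posting describes often involve pushing user-selected filters down into SQL with templated filters. This is a minimal, hedged sketch; the view, table, and column names are illustrative assumptions.

```lookml
view: filtered_sessions {
  # A filter-only field that users set in the Explore UI
  filter: session_region {
    type: string
  }

  # The Liquid condition block injects the user's filter value into the derived SQL
  derived_table: {
    sql:
      SELECT session_id, region, duration_seconds
      FROM analytics.sessions
      WHERE {% condition filtered_sessions.session_region %} region {% endcondition %} ;;
  }

  dimension: session_id {
    primary_key: yes
    type: string
    sql: ${TABLE}.session_id ;;
  }

  measure: average_duration {
    type: average
    sql: ${TABLE}.duration_seconds ;;
  }
}
```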

Posted 1 month ago

Apply