
9 Starburst Jobs

Set up a Job Alert
JobPe aggregates listings for easy access, but applications are submitted directly on the original job portal.

7.0 - 11.0 years

0 Lacs

Pune, Maharashtra

On-site

About the job: At Citi, we're not just building technology, we're building the future of banking. Encompassing a broad range of specialties, roles, and cultures, our teams are creating innovations used across the globe. Citi is constantly growing and progressing through our technology, with a laser focus on evolving the ways of doing things. As one of the world's most global banks, we're changing how the world does business.

Shape your career with Citi. We're currently looking for a high-caliber professional to join our team as AVP - Data Engineer, based in Pune, India. Being part of our team means we'll provide you with the resources to meet your unique needs, empower you to make healthy decisions, and help you manage your financial well-being and plan for your future. For instance:
- We provide programs and services for your physical and mental well-being, including access to telehealth options, health advocates, confidential counseling, and more. Coverage varies by country.
- We empower our employees to manage their financial well-being and help them plan for the future.
- We provide access to an array of learning and development resources to help broaden and deepen your skills and knowledge as your career progresses.

Responsibilities:
- Data pipeline development, design, and automation.
- Design and implement efficient database structures to ensure optimal performance and support analytics.
- Design, implement, and optimize secure data pipelines to ingest, process, and store large volumes of structured and unstructured data from diverse sources, including vulnerability scans, security tools, and assessments (see the sketch below).
- Work closely with stakeholders to provide clean, structured datasets that enable advanced analytics and insights into cybersecurity risks, trends, and remediation activities.

Technical competencies:
- 7+ years of hands-on experience with Scala and Spark.
- 10+ years of experience designing and developing data pipelines for data ingestion or transformation using Spark with Scala.
- Good experience with Big Data technologies (HDFS, Hive, Apache Spark, Spark SQL, Spark Streaming, Spark job optimization, and Kafka).
- Good exposure to various file formats (JSON, AVRO, Parquet).
- Knowledge of agile (Scrum) development methodology is a plus.
- Strong development/automation skills.
- The right attitude to participate and contribute through all phases of the development lifecycle.
- Secondary skill set: NoSQL, Starburst, Python.
- Optional: Java Spring, Kubernetes, Docker.

Competencies (soft skills):
- Strong communication skills.
- Responsible for reporting to both business and technology senior management.
- Work with stakeholders and keep them updated on developments, estimation, delivery, and issues.

If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity, review Accessibility at Citi. View Citi's EEO Policy Statement and the Know Your Rights poster.
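The pipeline requirement above (Spark ingestion of JSON/AVRO/Parquet from security tooling into analytics-ready datasets) generally reduces to a read-transform-write job. The role asks for Scala, but the shape is the same in PySpark; this minimal sketch assumes hypothetical S3 paths and field names (scan_timestamp, severity, finding_id).

```python
# Minimal PySpark sketch: ingest raw JSON security-scan records, normalize a few
# fields, and persist them as Parquet for downstream analytics.
# Paths, column names, and keys are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("vuln-scan-ingest").getOrCreate()

raw = spark.read.json("s3://example-bucket/raw/vuln_scans/")  # hypothetical source

cleaned = (
    raw
    .withColumn("scan_date", F.to_date("scan_timestamp"))  # assumed field
    .withColumn("severity", F.upper(F.col("severity")))
    .dropDuplicates(["finding_id"])                          # assumed key
    .filter(F.col("severity").isNotNull())
)

(cleaned
 .write.mode("overwrite")
 .partitionBy("scan_date")
 .parquet("s3://example-bucket/curated/vuln_scans/"))        # hypothetical sink
```

Partitioning by scan_date keeps downstream Hive/Spark SQL queries over a date range cheap, which matters for the trend and remediation reporting the listing mentions.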

Posted 1 day ago

Apply

4.0 - 8.0 years

0 Lacs

Pune, Maharashtra

On-site

Join us as a Cloud Data Engineer (AWS) at Barclays, where you will be responsible for supporting the successful delivery of location strategy projects to plan, budget, agreed quality, and governance standards. You'll spearhead the evolution of our digital landscape, driving innovation and excellence, and harness cutting-edge technology to revolutionize our digital offerings, ensuring unparalleled customer experiences.

To be successful as a Cloud Data Engineer (AWS), you should have experience with AWS cloud services such as S3, Glue, Athena, Lake Formation, and CloudFormation. You should also have strong SQL knowledge, proficiency in PySpark, a very good understanding of writing and debugging code, and quick learning abilities. Strong analytical and problem-solving skills are essential, along with excellent written and verbal communication skills. Other highly valued skills include good knowledge of Python, an understanding of SCM tools like Git, previous working experience in the banking or financial services domain, and experience with Databricks, Snowflake, Starburst, and Iceberg.

You may be assessed on key critical skills relevant for success in the role, such as risk and controls, change and transformation, business acumen, strategic thinking, and digital and technology, as well as job-specific technical skills. This role is based out of Pune.

Purpose of the role: To build and maintain the systems that collect, store, process, and analyze data, such as data pipelines, data warehouses, and data lakes, ensuring that all data is accurate, accessible, and secure.

Accountabilities:
- Build and maintain data architecture pipelines that enable the transfer and processing of durable, complete, and consistent data.
- Design and implement data warehouses and data lakes that manage the appropriate data volumes and velocity and adhere to the required security measures.
- Develop processing and analysis algorithms fit for the intended data complexity and volumes.
- Collaborate with data scientists to build and deploy machine learning models.

Analyst expectations:
- Perform prescribed activities in a timely manner and to a high standard, consistently driving continuous improvement.
- Lead and supervise a team, guiding and supporting professional development, allocating work requirements, and coordinating team resources.
- Demonstrate a clear set of leadership behaviors to create an environment for colleagues to thrive and deliver to a consistently excellent standard.
- Develop technical expertise in the work area, acting as an advisor where appropriate.
- Have an impact on the work of related teams within the area and partner with other functions and business areas.
- Take responsibility for the end results of a team's operational processing and activities and escalate breaches of policies/procedures appropriately.
- Advise and influence decision-making within your area of expertise and manage risk and strengthen controls in relation to the work you own or contribute to.

All colleagues are expected to demonstrate the Barclays Values of Respect, Integrity, Service, Excellence, and Stewardship - our moral compass, helping us do what we believe is right. They will also be expected to demonstrate the Barclays Mindset - to Empower, Challenge, and Drive - the operating manual for how we behave.
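The S3/Glue/Athena stack named in this listing is usually exercised by submitting SQL against the Glue catalog and collecting results from S3. A minimal boto3 sketch, with a hypothetical database, table, region, and results bucket (Athena's API is asynchronous, hence the polling loop):

```python
# Minimal sketch: run an Athena query against a Glue-catalogued table and wait
# for it to finish. Database, table, region, and bucket names are hypothetical.
import time
import boto3

athena = boto3.client("athena", region_name="eu-west-1")  # region assumed

resp = athena.start_query_execution(
    QueryString="SELECT trade_date, COUNT(*) AS n FROM trades GROUP BY trade_date",
    QueryExecutionContext={"Database": "example_lake"},
    ResultConfiguration={"OutputLocation": "s3://example-athena-results/"},
)
query_id = resp["QueryExecutionId"]

# Poll until the query reaches a terminal state.
while True:
    state = athena.get_query_execution(QueryExecutionId=query_id)["QueryExecution"]["Status"]["State"]
    if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
        break
    time.sleep(2)

if state == "SUCCEEDED":
    rows = athena.get_query_results(QueryExecutionId=query_id)["ResultSet"]["Rows"]
    print(f"Fetched {len(rows) - 1} data rows")  # first row is the header
else:
    print(f"Query ended in state {state}")
```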

Posted 1 day ago

Apply

1.0 - 7.0 years

3 - 8 Lacs

Coimbatore, Tamil Nadu, India

On-site

Job description

Key responsibilities:
- Maintaining and documenting Technology/Application Support and Service Level Agreements.
- Manage user access and onboarding for platforms, including Active Directory (AD) and cloud-based tools.
- Develop, implement, and manage IT solutions to improve visibility, automate workflows, and optimize IT operations.
- Work closely with onshore/offshore/cross-functional teams, providing ongoing support for the Annalect technology and business teams.
- Ongoing support of the Annalect technology and business teams using existing or to-be-built tools/applications.
- Strong understanding of ad platforms such as Google Ads, Meta, TikTok, Amazon DSP, DV360, The Trade Desk, etc.
- Must have good QA skills to compare key advertising metrics (clicks, impressions, cost, etc.) between the platform and the destination data (see the sketch below).
- Documenting, implementing, and managing statistics to ensure that the AOS team is operating at a highly efficient and effective level.
- Monitoring and handling incident response for the infrastructure, platforms, and core engineering services.
- Troubleshooting infrastructure, network, and application issues; help identify and troubleshoot problems within the environment.
- Must be able to work both independently and as a productive member of a team.
- Leads team projects and activities using project management methodology.
- It is expected that this position will require 40% process/procedure and 60% technology skills.
- Must be available 24x7 for occasional technology/application-related issues.
- Support robots encountering issues by taking on tickets and performing root cause analysis.
- Maintain and update documentation whenever changes are made to the robots.
- Knowledgeable and able to support RPA infrastructure and architecture.

Required skills:
- 5-7 years of relevant and progressive experience as a Technology Operations Analyst or in a similar role.
- Self-motivated and action-driven, with the ability to take initiative, execute, and follow through.
- Ability to clearly articulate technical and functional information to various audiences, both verbally and in writing.
- High degree of organizational skill and the ability to reprioritize based on business needs.
- Excellent written and verbal communication skills.
- Strong understanding of ad platform ecosystems, including campaign management, Ad Manager and Business Manager, tracking methodologies, data ingestion, and reporting workflows.
- Knowledge of ad operations, audience targeting, and attribution models.
- Proficient in Excel, with demonstrated ability to organize and consolidate multiple data sources for analysis.
- Critical thinking and problem-solving skills for technical and software-related issues.
- Experience with a single sign-on platform, including user and application setup/support.
- Good understanding of methodologies such as DevOps, CI/CD (Continuous Integration/Continuous Delivery), Agile/Kanban, and AWS.
- Good working knowledge of Microsoft tools (Office, SharePoint), CRM tools (JIRA, HubSpot), and reporting tools (Power BI, Tableau, etc.).
- Proficiency in SQL, Google BigQuery, and Starburst for querying and analyzing large datasets.
- Strong understanding of APIs and troubleshooting.
- Exposure to Generative AI models (e.g., OpenAI, GPT-based models).
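The QA requirement above (comparing clicks, impressions, and cost between an ad platform export and the destination data) is essentially a reconciliation join. A minimal pandas sketch, assuming hypothetical file names, a campaign_id join key, and a 1% tolerance:

```python
# Minimal reconciliation sketch: compare key metrics between a platform export
# and the warehouse copy, flagging campaigns whose numbers drift beyond a
# tolerance. File names, column names, and the 1% threshold are assumptions.
import pandas as pd

platform = pd.read_csv("platform_export.csv")     # hypothetical extract
warehouse = pd.read_csv("warehouse_extract.csv")  # hypothetical extract

metrics = ["clicks", "impressions", "cost"]
merged = platform.merge(warehouse, on="campaign_id", suffixes=("_src", "_dst"))

for m in metrics:
    # Relative difference, guarding against division by zero.
    merged[f"{m}_diff_pct"] = (
        (merged[f"{m}_dst"] - merged[f"{m}_src"]).abs()
        / merged[f"{m}_src"].replace(0, float("nan"))
    )

tolerance = 0.01  # flag anything off by more than 1%
flagged = merged[(merged[[f"{m}_diff_pct" for m in metrics]] > tolerance).any(axis=1)]
print(flagged[["campaign_id"] + [f"{m}_diff_pct" for m in metrics]])
```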

Posted 6 days ago

Apply

10.0 - 14.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

The Applications Development Technology Lead Analyst role is a senior position in which you will be responsible for implementing new or updated application systems and programs in collaboration with the Technology team. Your main objective will be to lead applications systems analysis and programming activities.

Responsibilities:
- Partner with various management teams to ensure the integration of functions to achieve goals.
- Identify necessary system enhancements for new products and process improvements.
- Resolve high-impact problems/projects by evaluating complex business processes.
- Provide expertise in applications programming and ensure application design aligns with the architecture blueprint.
- Develop standards for coding, testing, debugging, and implementation.
- Gain comprehensive knowledge of how business areas integrate, and analyze issues to develop innovative solutions.
- Advise mid-level developers and analysts, assess risks in business decisions, and be a team player who can adapt to changing priorities.

Required skills:
- Strong knowledge of Spark using Java/Scala and the Hadoop ecosystem, with hands-on experience in Spark Streaming (see the sketch below).
- Proficiency in Java programming, with experience in the Spring Boot framework.
- Familiarity with database technologies such as Oracle and the Starburst and Impala query engines.
- Knowledge of bank reconciliation tools such as Smartstream TLM Recs Premium / Exceptor / Quickrec is an added advantage.

Qualifications:
- 10+ years of relevant experience in an applications development or systems analysis role.
- Extensive experience in system analysis and programming of software applications.
- Experience managing and implementing successful projects.
- Subject Matter Expert (SME) in at least one area of applications development.
- Ability to adjust priorities quickly.
- Demonstrated leadership and project management skills.
- Clear and concise communication skills.
- Experience building/implementing reporting platforms.
- Bachelor's degree/University degree or equivalent experience (Master's degree preferred).

This job description is a summary of the work performed, and other job-related duties may be assigned as needed.
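The Spark Streaming requirement in this listing typically means consuming a Kafka topic and maintaining incremental aggregates. The role calls for Java/Scala, but a minimal PySpark Structured Streaming sketch shows the same shape; the broker address, topic name, and checkpoint path are assumptions.

```python
# Minimal Structured Streaming sketch: consume a Kafka topic, count events per
# minute, and write the running aggregates to the console. Broker, topic, and
# checkpoint location are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("recon-stream-demo").getOrCreate()

events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")  # hypothetical broker
    .option("subscribe", "recon-events")               # hypothetical topic
    .load()
    .selectExpr("CAST(value AS STRING) AS payload", "timestamp")
)

counts = (
    events
    .withWatermark("timestamp", "5 minutes")
    .groupBy(F.window("timestamp", "1 minute"))
    .count()
)

query = (
    counts.writeStream
    .outputMode("update")
    .format("console")
    .option("checkpointLocation", "/tmp/recon-stream-chk")  # hypothetical path
    .start()
)
query.awaitTermination()
```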

Posted 2 weeks ago

Apply

8.0 - 12.0 years

0 Lacs

Maharashtra

On-site

Whether you're at the start of your career or looking to discover your next adventure, your story begins here. At Citi, you'll have the opportunity to expand your skills and make a difference at one of the world's most global banks. We're fully committed to supporting your growth and development from the start with extensive on-the-job training and exposure to senior leaders, as well as more traditional learning. You'll also have the chance to give back and make a positive impact where we live and work through volunteerism.

The Product Developer is a strategic professional who stays abreast of developments within their field and contributes to directional strategy by considering their application in their job and the business. Recognized as a technical authority for an area within the business, this role requires basic commercial awareness. Developed communication and diplomacy skills are necessary to guide, influence, and convince others, particularly colleagues in other areas and occasional external customers. The impact of the work is significant on the area through complex deliverables, providing advice and counsel related to the technology or operations of the business. The work impacts an entire area, which eventually affects the overall performance and effectiveness of the sub-function/job family.

In this role, you're expected to:
- Develop reporting and analytical solutions using various technologies such as Python, relational and non-relational databases, Business Intelligence tools, and code orchestration.
- Identify solutions ranging across data analytics, reporting, CRM, reference data, workflows, and trade processing.
- Design compelling dashboards and reports using business intelligence tools like QlikView, Tableau, Pixel-perfect, etc.
- Perform data investigations with a high degree of accuracy under tight timelines.
- Develop plans, prioritize, coordinate design and delivery of products or features to product release, and serve as a product ambassador within the user community.
- Mentor junior colleagues on technical topics relating to data analytics and software development, and conduct code reviews.
- Follow market, industry, and client trends in your own field and adapt them for application to Citi's products and solutions platforms.
- Work in close coordination with Technology, Business Managers, and other stakeholders to fulfill delivery objectives.
- Partner with senior team members, leaders, and a widely distributed global user community to define and implement solutions.
- Appropriately assess risk when making business decisions, demonstrating particular consideration for the firm's reputation and safeguarding Citigroup, its clients, and assets, by driving compliance with applicable laws, rules, and regulations, adhering to Policy, applying sound ethical judgment regarding personal behavior, conduct, and business practices, and escalating, managing, and reporting control issues with transparency.

As a successful candidate, you'd ideally have the following skills and exposure:
- 8-12 years of experience using tools for statistical modeling of large data sets, and proficient knowledge of data modeling and databases such as Microsoft SQL Server, Oracle, and Impala.
- Advanced knowledge of analytical and business intelligence tools, including Tableau Desktop, Tableau Prep, TabPy, and Access.
- Familiarity with product development methodologies.
- Proficient knowledge of programming languages and frameworks such as Python, Visual Basic, and/or R, Apache Airflow, Streamlit and/or Flask, and Starburst (see the sketch below).
- Well versed with code versioning tools like GitHub, Bitbucket, etc.
- Ability to create business analyses, troubleshoot data quality issues, and conduct exploratory and descriptive analysis of business datasets.
- Ability to structure and break down problems, develop solutions, and drive results.
- Project management skills, with experience leading large technological initiatives.

Education:
- Bachelor's/University degree; Master's degree preferred.

Take the next step in your career, apply for this role at Citi today.
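Apache Airflow, listed above alongside Python and the BI stack, is typically used to chain extract and report-refresh steps into a scheduled DAG. A minimal sketch with hypothetical task bodies, DAG id, and schedule, assuming the Airflow 2.x API:

```python
# Minimal Airflow sketch: a daily DAG with two dependent Python tasks standing
# in for an extract step and a reporting/refresh step. Task logic, DAG id, and
# schedule are assumptions.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_trades(**context):
    # Placeholder: pull the day's data from a source system.
    print("extracting trades for", context["ds"])


def refresh_dashboard(**context):
    # Placeholder: rebuild the dataset behind a BI dashboard.
    print("refreshing dashboard for", context["ds"])


with DAG(
    dag_id="daily_reporting_demo",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",   # Airflow 2.4+ argument; earlier 2.x uses schedule_interval
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract_trades", python_callable=extract_trades)
    refresh = PythonOperator(task_id="refresh_dashboard", python_callable=refresh_dashboard)

    extract >> refresh
```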

Posted 3 weeks ago

Apply

3.0 - 7.0 years

0 Lacs

Pune, Maharashtra

On-site

Join us as an AWS Developer at Barclays, where you will play a crucial role in supporting the successful delivery of Location Strategy projects. Your responsibilities will include ensuring that projects are completed within the planned budget, meet quality standards, and adhere to governance protocols. You will be at the forefront of transforming our digital landscape, driving innovation, and striving for excellence to enhance our digital offerings and provide customers with exceptional experiences.

As an AWS Developer, your key experience should encompass:
- Proficiency in AWS cloud services such as S3, Glue, Athena, Lake Formation, and CloudFormation.
- Advanced skills in Python for data engineering and automation.
- Familiarity with ETL frameworks, data transformation, and data quality tools.

Additionally, highly valued skills may include:
- AWS Data Engineer certification.
- Previous experience in the banking or financial services sector.
- Knowledge of IAM and permissions management in AWS.
- Experience with Databricks, Snowflake, Starburst, and Iceberg.

Your performance will be evaluated based on essential skills crucial for success in this role, including risk and controls management, change and transformation capabilities, strategic thinking, and proficiency in digital and technology. You will also be assessed on job-specific technical skills relevant to the position.

This position is located in Pune and aims to:
- Develop and maintain systems for collecting, storing, processing, and analyzing data to ensure accuracy, accessibility, and security.
- Build and manage data architecture pipelines for transferring and processing data effectively.
- Design data warehouses and data lakes that handle the required data volumes, velocity, and security requirements.
- Develop algorithms for processing and analyzing data of varying complexity and volumes.
- Collaborate with data scientists to construct and deploy machine learning models.

As an Assistant Vice President, you are expected to provide guidance, influence decision-making processes, contribute to policy development, and ensure operational efficiency. You will lead a team in executing complex tasks, set objectives, coach employees, evaluate performance, and determine reward outcomes. If you have leadership responsibilities, you must exhibit leadership behaviors such as listening, inspiring, aligning, and developing others. For individual contributors, the role involves leading collaborative assignments, guiding team members, identifying the need for specialized input, proposing new directions for projects, and consulting on complex issues. You will also be responsible for risk management, policy development, and ensuring compliance with governance standards.

All colleagues are required to embody the Barclays Values of Respect, Integrity, Service, Excellence, and Stewardship, as well as demonstrate the Barclays Mindset of Empower, Challenge, and Drive, guiding our behavior and decisions.

Posted 3 weeks ago

Apply

10.0 - 15.0 years

30 - 40 Lacs

Bengaluru

Hybrid

We are looking for a Cloud Data Engineer with strong hands-on experience in data pipelines, cloud-native services (AWS), and modern data platforms like Snowflake or Databricks. Alternatively, we're open to Data Visualization Analysts with strong BI experience and exposure to data engineering or pipelines. You will collaborate with technology and business leads to build scalable data solutions, including data lakes, data marts, and virtualization layers using tools like Starburst. This is an exciting opportunity to work with modern cloud tech in a dynamic, enterprise-scale financial services environment.

Key responsibilities:
- Design and develop data pipelines for structured/unstructured data in AWS.
- Build semantic layers and virtualization layers using Starburst or similar tools (see the sketch below).
- Create intuitive dashboards and reports using Power BI/Tableau.
- Collaborate on ETL designs and support testing (SIT/UAT).
- Optimize Spark jobs and ETL performance.
- Implement data quality checks and validation frameworks.
- Translate business requirements into scalable technical solutions.
- Participate in design reviews and documentation.

Skills & qualifications:

Must-have:
- 10+ years in Data Engineering or related roles.
- Hands-on experience with AWS Glue, Redshift, Athena, EMR, Lambda, S3, and Kinesis.
- Proficient in HiveQL, Spark, Python, and Scala.
- Experience with modern data platforms (Snowflake/Databricks).
- 3+ years with ETL tools (Informatica, SSIS) and recent experience with cloud-based ETL.
- Strong understanding of Data Warehousing, Data Lakes, and Data Mesh.

Preferred:
- Exposure to data virtualization tools like Starburst or Denodo.
- Experience in the financial services or banking domain.
- AWS Certification (Data specialty) is a plus.
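The virtualization layer this listing describes is usually reached through Starburst's Trino interface, where a single SQL statement can join tables from different catalogs. A minimal sketch using the open-source trino Python client; the coordinator host, catalogs, schemas, and table names are hypothetical:

```python
# Minimal sketch: federated query through a Starburst/Trino coordinator joining
# a Hive-catalog table with a PostgreSQL-catalog table. Host, user, catalogs,
# and table names are hypothetical.
import trino

conn = trino.dbapi.connect(
    host="starburst.example.com",  # hypothetical coordinator
    port=8080,
    user="analyst",
    catalog="hive",   # default catalog for unqualified names
    schema="sales",
)

cur = conn.cursor()
cur.execute(
    """
    SELECT c.region, SUM(o.amount) AS total_amount
    FROM hive.sales.orders o
    JOIN postgresql.crm.customers c ON o.customer_id = c.id
    GROUP BY c.region
    ORDER BY total_amount DESC
    """
)
for region, total in cur.fetchall():
    print(region, total)
```

Because the join runs through one coordinator, a semantic layer can expose queries like this as views without first copying the source data into the lake.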

Posted 3 weeks ago

Apply

8.0 - 13.0 years

3 - 18 Lacs

Pune, Maharashtra, India

On-site

Your specific responsibilities will include:
- Design and implementation of last-mile data products using the most up-to-date technologies and software/data/DevOps engineering practices.
- Enable data science and analytics teams to drive data modeling and feature engineering activities aligned with business questions and utilizing datasets in an optimal way.
- Develop deep domain expertise and business acumen to ensure that all specificities and pitfalls of data sources are accounted for.
- Build data products based on automated data models, aligned with use case requirements, and advise data scientists, analysts, and visualization developers on how to use these data models.
- Develop analytical data products for reusability, governance, and compliance by design.
- Align with organization strategy and implement a semantic layer for analytics data products.
- Support data stewards and other engineers in maintaining data catalogs, data quality measures, and governance frameworks.

Education:
- B.Tech/B.S., M.Tech/M.S., or PhD in Engineering, Computer Science, Pharmaceuticals, Healthcare, Data Science, Business, or a related field.

Required experience:
- 8+ years of relevant work experience in the pharmaceutical/life sciences industry, with demonstrated hands-on experience analyzing, modeling, and extracting insights from commercial/marketing analytics datasets (specifically, real-world datasets).
- High proficiency in SQL, Python, and AWS.
- Experience creating/adopting data models to meet requirements from Marketing, Data Science, and Visualization stakeholders.
- Experience with feature engineering.
- Experience with cloud-based (AWS/GCP/Azure) data management platforms and typical storage/compute services (Databricks, Snowflake, Redshift, etc.).
- Experience with modern data stack tools such as Matillion, Starburst, ThoughtSpot, and low-code tools (e.g., Dataiku).
- Excellent interpersonal and communication skills, with the ability to quickly establish productive working relationships with a variety of stakeholders.
- Experience in analytics use cases of pharmaceutical products and vaccines.
- Experience in market analytics and related use cases.

Preferred experience:
- Experience in analytics use cases focused on informing marketing strategies and commercial execution of pharmaceutical products and vaccines.
- Experience with Agile ways of working, leading or working as part of scrum teams.
- Certifications in AWS and/or modern data technologies.
- Knowledge of the commercial/marketing analytics data landscape and key data sources/vendors.
- Experience building data models for data science and visualization/reporting products, in collaboration with data scientists, report developers, and business stakeholders.
- Experience with data visualization technologies (e.g., Power BI).

Posted 1 month ago

Apply

4 - 9 years

10 - 12 Lacs

Hyderabad

Remote

Role: BI Analytics Specialist
Location: Remote (working EST hours; night shift in India)
Work time: 6:00 PM to 3:30 AM IST
Duration: 6 months, with possible extension

Key responsibilities:

ThoughtSpot architecture and implementation:
- Design and implement ThoughtSpot solutions that meet business requirements.
- Configure and optimize ThoughtSpot environments for performance and scalability.
- Develop and maintain data models within ThoughtSpot.

Business intelligence and analytics:
- Gather and analyze business requirements to create BI solutions.
- Design and develop interactive dashboards, reports, and data visualizations.
- Conduct data analysis to support business decision-making processes.

Data integration and management:
- Integrate ThoughtSpot with various data sources, including Databricks, Starburst, and AWS.
- Ensure data quality, integrity, and security within the BI solutions.
- Collaborate with data engineering teams to establish ETL processes.

AWS:
- Leverage AWS services (e.g., S3, Redshift, Glue, Lambda) for data storage, processing, and analytics.
- Design and implement scalable data architectures on AWS.
- Ensure the security and compliance of data solutions on AWS.

Collaboration and communication:
- Work closely with business stakeholders, data engineers, and data scientists to understand requirements and deliver solutions.
- Provide training and support to end users on ThoughtSpot and BI tools.
- Stay updated on the latest industry trends and best practices in BI, ThoughtSpot, Starburst, Databricks, and AWS.

Qualifications:
- Bachelor's degree in Computer Science, Information Technology, Data Science, or a related field; Master's degree preferred.
- 5+ years of experience in business intelligence and data analytics.
- Proven experience with ThoughtSpot, including architecture, implementation, and administration.
- Strong proficiency in Starburst and Databricks for data processing and analytics.
- Extensive experience with AWS services related to data analytics (S3, Redshift, Glue, Lambda, etc.).
- Excellent SQL skills and experience with ETL tools and processes.
- Strong analytical, problem-solving, and communication skills.
- Ability to work independently and as part of a team in a fast-paced environment.
- Knowledge of data governance and security best practices.

What you are expected to do:
- ThoughtSpot environment maintenance: clean-up, coordination with the ThoughtSpot and Starburst teams, and tracking system health.
- Creation of runbooks, SOPs, and migration documents.
- Maintaining connections and other admin objects (tables, views, etc.).
- Assisting in environment migrations and peer review.
- AD Groups SyncUP Python script enhancements: script split-up, nested AD group SyncUP, and other improvements.

Preferred skills:
- Experience with other BI tools (e.g., Tableau, Power BI).
- Knowledge of Python for data analysis.

Posted 3 months ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

