6.0 years
0 Lacs
India
On-site
What You'll Do
Avalara, Inc. (www.Avalara.com) is the leading provider of cloud-based software delivering a broad array of compliance solutions for sales tax and other transactional taxes. We are building cloud-based tax compliance solutions to handle every transaction in the world. Every transaction you make, physical or digital, has a unique and nuanced tax calculation that accompanies it; we handle those, and we want to handle all of them. Avalara is building the global cloud compliance platform, and the Build and Deployment Tooling Team helps enable the development of this platform. Our engineering teams are diverse in their engineering practices, culture, and background. We create the systems that allow them to produce quality products at an increasing pace. As a member of the team, you will take part in architecting the tooling that lowers the barriers to development. You will report to the Manager, Site Reliability Engineering.

This might be a good fit for you if:
• Helping people do their best resonates with you.
• You love platform engineering.
• You want to build cool things with cool people.
• You love automating everything.
• You love building high-impact tools and software that everyone depends on.

What Your Responsibilities Will Be
Some areas of work are:
• Create tools that smooth the journey from idea to running in production.
• Learn and promote best practices related to the build, test, and deployment of software.

What You'll Need To Be Successful
Qualifications:
• Software Engineering: an understanding of software engineering fundamentals and experience developing software within a team of engineers, including experience practicing testing.
• Build Automation: experience getting artifacts in many languages packaged and tested, automatically, so that they can be trusted to go into production.
• Release Automation: experience getting artifacts running in production. Automatically.
• Observability: experience developing service level indicators and goals, instrumenting software, and building meaningful alerts.
• Troubleshooting: experience tracking down the technical causes of failures in distributed software.
• Containers/Container Orchestration Systems: an understanding of how to manage container-based systems, especially on Kubernetes.
• Artificial Intelligence: a grounding in the infrastructure for, and the use of, agentic systems.
• Infrastructure-as-Code: experience deploying and maintaining infrastructure with tools such as Terraform and Pulumi.
• Technical Writing: we will need to build documentation and diagrams for other engineering teams; write technical documents that people love and adore.
• Customer Satisfaction: experience ensuring that code meets all functionality and acceptance criteria for customer satisfaction (our customers are other engineering teams and Avalara customers).
• Go: our tooling is developed in Go.
• Distributed Computing: experience architecting distributed services across regions and clouds.
• GitLab: experience working with, managing, and deploying it.
• Artifactory: experience working with, managing, and deploying it.
• Open Source: build side projects or contribute to other open-source projects.

Experience:
• Minimum 6 years of experience in a SaaS environment.
• Bachelor's degree in computer science or equivalent.
• Willingness to participate in an on-call rotation.
• Experience with a data warehouse like Snowflake, Redshift, or Spark.

How We'll Take Care Of You
Total Rewards: in addition to a great compensation package, paid time off, and paid parental leave, many Avalara employees are eligible for bonuses.
Health & Wellness: benefits vary by location but generally include private medical, life, and disability insurance.
Inclusive culture and diversity: Avalara strongly supports diversity, equity, and inclusion, and is committed to integrating them into our business practices and our organizational culture.
We also have a total of 8 employee-run resource groups, each with senior leadership and exec sponsorship.

What You Need To Know About Avalara
We're defining the relationship between tax and tech. We've already built an industry-leading cloud compliance platform, processing over 54 billion customer API calls and over 6.6 million tax returns a year. Our growth is real - we're a billion dollar business - and we're not slowing down until we've achieved our mission - to be part of every transaction in the world. We're bright, innovative, and disruptive, like the orange we love to wear. It captures our quirky spirit and optimistic mindset. It shows off the culture we've designed, one that empowers our people to win. We've been different from day one. Join us, and your career will be too.

We're An Equal Opportunity Employer
Supporting diversity and inclusion is a cornerstone of our company - we don't want people to fit into our culture, but to enrich it. All qualified candidates will receive consideration for employment without regard to race, color, creed, religion, age, gender, national origin, disability, sexual orientation, US veteran status, or any other factor protected by law. If you require any reasonable adjustments during the recruitment process, please let us know.
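As a loose illustration of the observability qualification above (service level indicators and meaningful alerts), here is a minimal error-budget burn-rate check. All names, numbers, and thresholds are invented for illustration, not Avalara's actual tooling:

```python
# Illustrative SLI/alert calculation: error-budget burn rate for a
# hypothetical 99.9% availability SLO.
SLO_TARGET = 0.999            # 99.9% of requests must succeed
ERROR_BUDGET = 1 - SLO_TARGET # 0.1% of requests may fail

def burn_rate(errors: int, requests: int) -> float:
    """How fast the error budget is being consumed (1.0 = exactly on budget)."""
    if requests == 0:
        return 0.0
    return (errors / requests) / ERROR_BUDGET

# A common multi-window pattern: page only if the short window burns budget
# far faster than sustainable (14.4x ~= 2% of a 30-day budget in one hour).
rate = burn_rate(errors=72, requests=10_000)  # 0.72% errors vs 0.1% budget ~= 7.2x
should_page = rate > 14.4
print(rate, should_page)
```

Elevated but below the paging threshold, so this window would log rather than alert; the multi-window thresholds themselves are assumptions borrowed from common SRE practice.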
Posted 1 week ago
5.0 years
0 Lacs
Ahmedabad, Gujarat, India
On-site
Greetings from Synergy Resource Solutions, a leading recruitment consultancy. Our client is an ISO 27001:2013 and ISO 9001 certified company and a pioneering web design and development company from India. The company has also been voted one of the Top 10 mobile app development companies in India. It is a leading IT consulting and web solution provider for custom software, websites, games, custom web applications, enterprise mobility, mobile apps, and cloud-based application design & development. The company is ranked among the fastest-growing web design and development companies in India, with 3900+ successfully delivered projects across the United States, UK, UAE, Canada, and other countries. A client retention rate of over 95% demonstrates their level of service and client satisfaction.

Position: Senior Data Engineer
Experience: 5+ years of relevant experience
Education Qualification: Bachelor's or Master's degree in Computer Science, Information Technology, or a related field
Job Location: Ahmedabad
Shift: 11 AM – 8.30 PM

Key Responsibilities:
Our client is seeking an experienced and motivated Senior Data Engineer to join their AI & Automation team. The ideal candidate will have 5–8 years of experience in data engineering, with a proven track record of designing and implementing scalable data solutions. A strong background in database technologies, data modeling, and data pipeline orchestration is essential. Additionally, hands-on experience with generative AI technologies and their applications in data workflows will set you apart. In this role, you will lead data engineering efforts to enhance automation, drive efficiency, and deliver data-driven insights across the organization.

Job Description:
• Design, build, and maintain scalable, high-performance data pipelines and ETL/ELT processes across diverse database platforms.
• Architect and optimize data storage solutions to ensure reliability, security, and scalability.
• Leverage generative AI tools and models to enhance data engineering workflows, drive automation, and improve insight generation.
• Collaborate with cross-functional teams (data scientists, analysts, and engineers) to understand and deliver on data requirements.
• Develop and enforce data quality standards, governance policies, and monitoring systems to ensure data integrity.
• Create and maintain comprehensive documentation for data systems, workflows, and models.
• Implement data modeling best practices and optimize data retrieval processes for better performance.
• Stay up-to-date with emerging technologies and bring innovative solutions to the team.

Qualifications:
• Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
• 5–8 years of experience in data engineering, designing and managing large-scale data systems.
• Strong expertise in database technologies. The mandatory skills are:
  • SQL
  • NoSQL (MongoDB, Cassandra, or CosmosDB)
  • One of the following: Snowflake, Redshift, BigQuery, or Microsoft Fabric (Azure)
• Hands-on experience implementing and working with generative AI tools and models in production workflows.
• Proficiency in Python and SQL, with experience in data processing frameworks (e.g., Pandas, PySpark).
• Experience with ETL tools (e.g., Apache Airflow, MS Fabric, Informatica, Talend) and data pipeline orchestration platforms.
• Strong understanding of data architecture, data modeling, and data governance principles.
• Experience with cloud platforms (preferably Azure) and associated data services.

Skills:
• Advanced knowledge of database management systems and ETL/ELT processes.
• Expertise in data modeling, data quality, and data governance.
• Proficiency in Python programming, version control systems (Git), and data pipeline orchestration tools.
• Familiarity with AI/ML technologies and their application in data engineering.
• Strong problem-solving and analytical skills, with the ability to troubleshoot complex data issues.
• Excellent communication skills, with the ability to explain technical concepts to non-technical stakeholders.
• Ability to work independently, lead projects, and mentor junior team members.
• Commitment to staying current with emerging technologies, trends, and best practices in the data engineering domain.

If your profile matches the requirements and you are interested in this job, please share your updated resume along with details of your present salary, expected salary, and notice period.
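The transform step of the ETL/ELT work described above can be sketched with a minimal Pandas example; the column names and cleaning rules here are invented for illustration, not taken from the client's actual pipelines:

```python
# Minimal illustrative ETL transform in Pandas: clean raw records and
# derive a revenue column. All column names and rules are hypothetical.
import pandas as pd

def transform_orders(raw: pd.DataFrame) -> pd.DataFrame:
    """Drop unusable rows, normalize types, derive revenue."""
    df = raw.copy()
    df = df.dropna(subset=["order_id"])               # rows without a key are unusable
    df["order_date"] = pd.to_datetime(df["order_date"])
    df["revenue"] = df["quantity"] * df["unit_price"]
    return df

raw = pd.DataFrame({
    "order_id":   [1, 2, None],
    "order_date": ["2024-01-05", "2024-01-06", "2024-01-07"],
    "quantity":   [2, 5, 1],
    "unit_price": [10.0, 4.0, 99.0],
})
clean = transform_orders(raw)
print(len(clean), clean["revenue"].sum())
```

In a real pipeline an orchestrator such as Airflow would schedule this function and the extract/load steps around it; the same logic scales out in PySpark with a near-identical DataFrame API.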
Posted 1 week ago
3.0 - 7.0 years
0 Lacs
Chennai, Tamil Nadu
On-site
The Senior Data Engineer position at Annalect within the Technology team involves building products on cloud-based data infrastructure while collaborating with a team that shares a passion for technology, design, development, and data integration. Your main responsibilities will include designing, building, testing, and deploying data transfers across various cloud environments such as Azure, GCP, AWS, and Snowflake. You will also be tasked with developing data pipelines and monitoring, maintaining, and optimizing them. Writing at-scale data transformations using SQL and Python will be a crucial part of your role. Additionally, you will be expected to conduct code reviews and provide mentorship to junior developers.

To excel in this position, you should possess a keen curiosity for understanding the business requirements driving the engineering needs. An enthusiasm for exploring new technologies and bringing innovative ideas to the team is highly valued. A minimum of 3 years of experience in SQL, Python, and Linux is required, along with familiarity with Snowflake, AWS, GCP, and Azure cloud environments. Intellectual curiosity, self-motivation, and a genuine passion for technology are essential attributes for success in this role.

For this role, a degree in Computer Science, Engineering, or equivalent practical experience is preferred. Experience with big data, infrastructure setup, and working with relational databases like Postgres, MySQL, and MSSQL is advantageous. Familiarity with data processing tools such as Hadoop, Hive, Spark, and Redshift is beneficial, as a significant amount of time will be dedicated to building and optimizing data transformations. The ability to independently manage projects from concept to implementation and maintenance is a key requirement.
Working at Annalect comes with various perks, including a vibrant and collaborative work environment with engaging social and learning activities, a generous vacation policy, extended time off during the holiday season, and the advantage of being part of a global company while maintaining a startup-like flexibility and pace. The role offers the opportunity to work with a modern stack and environment, enabling continuous learning and experimentation with cutting-edge technologies to drive innovation.
Posted 1 week ago
3.0 - 5.0 years
0 Lacs
Greater Kolkata Area
Remote
Job Title: Senior Data Analyst, Marketing & Product
Location: Remote
Type: Full-Time

About The Role
We are looking for a talented and motivated Senior Data Analyst to join our Marketing and Product Analytics team. In this role, you will work closely with cross-functional teams to analyze data, build models, and generate insights that drive key decisions across our marketing and product functions. This is a fantastic opportunity to apply your analytical skills in a fast-paced, data-driven environment with a strong culture of experimentation and impact.

Responsibilities
Extract, clean, and transform data to ensure quality and usability across analytics initiatives.
Build, validate, and deploy machine learning and statistical models to solve business problems.
Use your SQL, Python, and R expertise to interrogate our data and solve key business problems.
Create clear, concise dashboards and reports to communicate findings to stakeholders.
Conduct deep-dive analyses to uncover insights on user behavior, marketing performance, and product engagement.
Partner with product managers, marketers, and engineers to design experiments and evaluate their outcomes.
Turn business questions or problems into insight briefs.
Perform deep dives into campaign metrics to identify drivers of performance and recommend actionable strategies.
Stay updated on industry trends, best practices, and emerging technologies in marketing analytics.
Proactively bring new ideas to the table relating to business questions, analytics approaches, datasets, and ways of working.
Shape our marketing approach with actionable insights.

Required Skills
Proficient in Python and SQL (MySQL or Redshift experience preferred).
Experience with data visualization tools (e.g., Tableau, Power BI, or Plotly).
Solid understanding of machine learning, A/B testing, customer segmentation, and statistical analysis.
Strong problem-solving skills and a structured, business-minded approach to working with complex datasets.
Excellent communication skills, with the ability to present findings clearly to non-technical audiences.
Highly motivated and collaborative, able to excel in a fast-paced remote environment.
Proven work experience in business analytics and marketing insights.

Education And Background
B.E./B.Tech. from a Tier I college (IITs/NITs/BITS) in Computer Science, Data Science, Engineering, or a related quantitative field.
3-5 years of experience in an analytics/data science role.
Prior experience in marketing/customer analytics required.
Prior experience working in a consumer-tech, credit/lending, or fintech environment is a plus.
Experience in a business or marketing analytics product setting, leveraging data science to drive business outcomes.

Why Join Us
Be part of a high-impact analytics team shaping marketing and product strategy.
Work on real-world problems with direct influence on business outcomes.
Enjoy a remote-first work environment with flexibility and autonomy.
Get access to mentorship, continuous learning, and growth opportunities.

(ref:hirist.tech)
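The A/B-testing skill above can be sketched with a standard two-proportion z-test using only the Python standard library; the conversion counts below are made up for illustration:

```python
# Two-proportion z-test for an A/B experiment, standard library only.
# The conversion numbers below are invented, not real campaign data.
from math import sqrt, erf

def two_proportion_z(conv_a: int, n_a: int, conv_b: int, n_b: int):
    """Return (z statistic, two-sided p-value) for H0: p_a == p_b."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)          # pooled conversion rate
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF, via erf.
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Variant B converts 5.2% vs control's 4.0% on 5,000 users each.
z, p = two_proportion_z(conv_a=200, n_a=5000, conv_b=260, n_b=5000)
print(f"z={z:.2f}, p={p:.4f}")
```

With these numbers the lift is statistically significant at the usual 5% level; in practice you would pre-register the sample size and stopping rule before reading the result.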
Posted 1 week ago
4.0 - 8.0 years
0 Lacs
Pune, Maharashtra
On-site
As a Data Platform developer at Barclays, you will play a crucial role in shaping the digital landscape and enhancing customer experiences. Leveraging cutting-edge technology, you will work alongside a team of engineers, business analysts, and stakeholders to deliver high-quality solutions that meet business requirements. Your responsibilities will include tackling complex technical challenges, building efficient data pipelines, and staying updated on the latest technologies to continuously enhance your skills.

To excel in this role, you should have hands-on coding experience in Python, along with a strong understanding and practical experience in AWS development. Experience with tools such as Lambda, Glue, Step Functions, IAM roles, and various AWS services will be essential. Additionally, your expertise in building data pipelines using Apache Spark and AWS services will be highly valued. Strong analytical skills, troubleshooting abilities, and a proactive approach to learning new technologies are key attributes for success in this role.

Furthermore, experience in designing and developing enterprise-level software solutions, knowledge of different file formats like JSON, Iceberg, and Avro, and familiarity with streaming services such as Kafka, MSK, Kinesis, and Glue Streaming will be advantageous. Effective communication and collaboration skills are essential to interact with cross-functional teams and document best practices.

Your role will involve developing and delivering high-quality software solutions, collaborating with various stakeholders to define requirements, promoting a culture of code quality, and staying updated on industry trends. Adherence to secure coding practices, implementation of effective unit testing, and continuous improvement are integral parts of your responsibilities.
As a Data Platform developer, you will be expected to lead and supervise a team, guide professional development, and ensure the delivery of work to a consistently high standard. Your impact will extend to related teams within the organization, and you will be responsible for managing risks, strengthening controls, and contributing to the achievement of organizational objectives.

Ultimately, you will be part of a team that upholds Barclays' values of Respect, Integrity, Service, Excellence, and Stewardship, while embodying the Barclays Mindset of Empower, Challenge, and Drive in your daily interactions and work ethic.
Posted 1 week ago
3.0 - 7.0 years
0 Lacs
Pune, Maharashtra
On-site
Join us as a Cloud Data Engineer at Barclays, where you'll spearhead the evolution of the digital landscape, driving innovation and excellence. You'll harness cutting-edge technology to revolutionize digital offerings, ensuring unparalleled customer experiences. You may be assessed on key critical skills relevant for success in the role, such as risk and control, change and transformations, business acumen, strategic thinking, and digital technology, as well as job-specific skill sets.

To be successful as a Cloud Data Engineer, you should have:
- Experience with AWS Cloud technology for data processing and a good understanding of AWS architecture.
- Experience with compute services such as EC2, Lambda, Auto Scaling, and VPC.
- Experience with storage and container services such as ECS, S3, DynamoDB, and RDS.
- Experience with management & governance services such as KMS, IAM, CloudFormation, CloudWatch, and CloudTrail.
- Experience with analytics services such as Glue, Athena, Crawler, Lake Formation, and Redshift.
- Experience with solution delivery for data processing components in larger end-to-end projects.

Desirable skill sets / good to have:
- AWS Certified professional.
- Experience in data processing on Databricks and Unity Catalog.
- Ability to drive projects technically, with right-first-time deliveries within schedule and budget.
- Ability to collaborate across teams to deliver complex systems and components and manage stakeholders' expectations well.
- Understanding of different project methodologies, project lifecycles, major phases, dependencies and milestones within a project, and the required documentation needs.
- Experience with planning, estimating, organizing, and working on multiple projects.

This role will be based out of Pune.

Purpose of the role:
To build and maintain systems that collect, store, process, and analyze data, such as data pipelines, data warehouses, and data lakes, to ensure that all data is accurate, accessible, and secure.
Accountabilities:
- Build and maintain data architecture pipelines that enable the transfer and processing of durable, complete, and consistent data.
- Design and implement data warehouses and data lakes that manage appropriate data volumes and velocity and adhere to required security measures.
- Develop processing and analysis algorithms fit for the intended data complexity and volumes.
- Collaborate with data scientists to build and deploy machine learning models.

Analyst Expectations:
- Will have an impact on the work of related teams within the area.
- Partner with other functions and business areas.
- Take responsibility for the end results of a team's operational processing and activities.
- Escalate breaches of policies/procedures appropriately.
- Take responsibility for embedding new policies/procedures adopted due to risk mitigation.
- Advise and influence decision making within own area of expertise.
- Take ownership for managing risk and strengthening controls in relation to the work you own or contribute to.
- Deliver your work and areas of responsibility in line with relevant rules, regulations, and codes of conduct.
- Maintain and continually build an understanding of how your own sub-function integrates with the function, alongside knowledge of the organization's products, services, and processes within the function.
- Demonstrate understanding of how areas coordinate and contribute to the achievement of the objectives of the organization's sub-function.
- Resolve problems by identifying and selecting solutions through the application of acquired technical experience, guided by precedents.
- Guide and persuade team members and communicate complex/sensitive information.
- Act as a contact point for stakeholders outside of the immediate function, while building a network of contacts outside the team and external to the organization.
All colleagues will be expected to demonstrate the Barclays Values of Respect, Integrity, Service, Excellence, and Stewardship, our moral compass, helping us do what we believe is right. They will also be expected to demonstrate the Barclays Mindset to Empower, Challenge, and Drive, the operating manual for how we behave.
Posted 1 week ago
130.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Job Description
Manager, Data Visualization

The Opportunity
Based in Hyderabad, join a global healthcare biopharma company and be part of a 130-year legacy of success backed by ethical integrity, forward momentum, and an inspiring mission to achieve new milestones in global healthcare. Be part of an organisation driven by digital technology and data-backed approaches that support a diversified portfolio of prescription medicines, vaccines, and animal health products. Drive innovation and execution excellence. Be part of a team with a passion for using data, analytics, and insights to drive decision-making, and which creates custom software, allowing us to tackle some of the world's greatest health threats.

Our Technology centers focus on creating a space where teams can come together to deliver business solutions that save and improve lives. An integral part of the company IT operating model, Tech centers are globally distributed locations where each IT division has employees to enable our digital transformation journey and drive business outcomes. These locations, in addition to the other sites, are essential to supporting our business and strategy. A focused group of leaders in each tech center helps ensure we can manage and improve each location, from investing in the growth, success, and well-being of our people, to making sure colleagues from each IT division feel a sense of belonging, to managing critical emergencies. Together, we leverage the strength of our team to collaborate globally, optimize connections, and share best practices across the Tech Centers.

Role Overview
A unique opportunity to be part of an Insight & Analytics Data hub for a leading biopharmaceutical company and define a culture that creates a compelling customer experience. Bring your entrepreneurial curiosity and learning spirit into a career of purpose, personal growth, and leadership.
We are seeking those who have a passion for using data, analytics, and insights to drive decision-making that will allow us to tackle some of the world's greatest health threats. As a Manager in Data Visualization, you will be focused on designing and developing compelling data visualization solutions that enable actionable insights and facilitate intuitive information consumption for internal business stakeholders. The ideal candidate will demonstrate competency in building user-centric visuals and dashboards that empower stakeholders with data-driven insights and decision-making capability. Our Quantitative Sciences team uses big data to analyze the safety and efficacy claims of our potential medical breakthroughs. We review the quality and reliability of clinical studies using deep scientific knowledge, statistical analysis, and high-quality data to support decision-making in clinical trials.

What Will You Do In This Role
• Design and develop user-centric data visualization solutions utilizing complex data sources.
• Identify and define key business metrics and KPIs in partnership with business stakeholders.
• Define and develop scalable data models in alignment with, and with support from, data engineering and IT teams.
• Lead UI/UX workshops to develop user stories and wireframes, and develop intuitive visualizations.
• Collaborate with data engineering, data science, and IT teams to deliver business-friendly dashboard and reporting solutions.
• Apply best practices in data visualization design and continuously improve the intuitive user experience for business stakeholders.
• Provide thought leadership and data visualization best practices to the broader Data & Analytics organization.
• Identify opportunities to apply data visualization technologies to streamline and enhance manual/legacy reporting deliveries.
• Provide training and coaching to internal stakeholders to enable a self-service operating model.
• Co-create information governance and apply data privacy best practices to solutions.
• Continuously innovate on visualization best practices and technologies by reviewing external resources and the marketplace.

What Should You Have
• 5 years' relevant experience in data visualization, infographics, and interactive visual storytelling.
• Working experience and knowledge of Power BI, Qlik, Spotfire, Tableau, and other data visualization technologies.
• Working experience and knowledge of ETL processes and data modeling techniques & platforms (Alteryx, Informatica, Dataiku, etc.).
• Experience working with database technologies (Redshift, Oracle, Snowflake, etc.) and data processing languages (SQL, Python, R, etc.).
• Experience in leveraging and managing third-party vendors and contractors.
• Self-motivation, proactivity, and the ability to work independently with minimal direction.
• Excellent interpersonal and communication skills.
• Excellent organizational skills, with the ability to navigate a complex matrix environment and organize/prioritize work efficiently and effectively.
• Demonstrated ability to collaborate and lead with diverse groups of work colleagues and positively manage ambiguity.
• Experience in the Pharma and/or Biotech industry is a plus.

Our technology teams operate as business partners, proposing ideas and innovative solutions that enable new organizational capabilities. We collaborate internationally to deliver services and solutions that help everyone be more productive and enable innovation.

Who We Are
We are known as Merck & Co., Inc., Rahway, New Jersey, USA in the United States and Canada, and MSD everywhere else. For more than a century, we have been inventing for life, bringing forward medicines and vaccines for many of the world's most challenging diseases. Today, our company continues to be at the forefront of research to deliver innovative health solutions and advance the prevention and treatment of diseases that threaten people and animals around the world.
What We Look For
Imagine getting up in the morning for a job as important as helping to save and improve lives around the world. Here, you have that opportunity. You can put your empathy, creativity, digital mastery, or scientific genius to work in collaboration with a diverse group of colleagues who pursue and bring hope to countless people who are battling some of the most challenging diseases of our time. Our team is constantly evolving, so if you are among the intellectually curious, join us and start making your impact today.

#HYDIT2025

Search Firm Representatives, Please Read Carefully
Merck & Co., Inc., Rahway, NJ, USA, also known as Merck Sharp & Dohme LLC, Rahway, NJ, USA, does not accept unsolicited assistance from search firms for employment opportunities. All CVs/resumes submitted by search firms to any employee at our company without a valid written search agreement in place for this position will be deemed the sole property of our company. No fee will be paid in the event a candidate is hired by our company as a result of an agency referral where no pre-existing agreement is in place. Where agency agreements are in place, introductions are position specific. Please, no phone calls or emails.
Employee Status: Regular
Relocation:
VISA Sponsorship:
Travel Requirements:
Flexible Work Arrangements: Hybrid
Shift:
Valid Driving License:
Hazardous Material(s):

Required Skills
Business Intelligence (BI), Clinical Decision Support (CDS), Clinical Testing, Communication, Create User Stories, Data Visualization, Digital Transformation, Healthcare Innovation, Information Technology Operations, IT Operation, Management Process, Marketing, Motivation Management, Requirements Management, Self Motivation, Statistical Analysis, Statistics, Thought Leadership, User Experience (UX) Design

Preferred Skills

Job Posting End Date
07/31/2025
A job posting is effective until 11:59:59 PM on the day BEFORE the listed job posting end date. Please ensure you apply to a job posting no later than the day BEFORE the job posting end date.

Requisition ID: R359276
Posted 1 week ago
8.0 - 12.0 years
0 Lacs
Chennai, Tamil Nadu
On-site
You should have experience working with BigID or Collibra, along with knowledge of data classification and data products. It is important to have an understanding of data loss and personal information security. Exposure to platforms such as Snowflake, S3, Redshift, SharePoint, and Box is required. You should also have knowledge of connecting to various source systems. A deep understanding and practical knowledge of IDEs like Eclipse, PyCharm, or any workflow designer is essential. Experience with one or more of the following languages is preferred: Java, JavaScript, Groovy, Python. Hands-on experience with CI/CD processes and tooling such as GitHub is necessary. Working experience in DevOps teams based on Kubernetes tools is also expected. Proficiency in database concepts and a basic understanding of data classification, lineage, and storage would be advantageous. Excellent written and spoken English, interpersonal skills, and a collaborative approach to delivery are essential.

Desirable Skills And Experience:
- A total of 8 to 12 years of overall IT experience
- Technical degree to support your experience
- Deep technical expertise
- Demonstrated understanding of the required technology and problem-solving skills
- Analytical, focused, and capable of working independently with minimal supervision
- Good collaborator management and a team player
- Exposure to platforms like Talend Data Catalog, BigID, or Snowflake is beneficial
- Basic knowledge of AWS is a plus
- Knowledge and experience with integration technologies such as Mulesoft and SnapLogic
- Proficiency in Jira, including the ability to quickly generate JQL queries and save them for reference
- Proficient in creating documentation in Confluence
- Experience with Agile practices, preferably having been part of an Agile team for several years
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
pune, maharashtra
On-site
As a member of ZS, you have the opportunity to be part of a team where your passion can truly make a difference in people's lives. ZS is a management consulting and technology firm dedicated to enhancing the quality of life and transforming the way we live. Our greatest asset is our people, and by joining us, you will collaborate with a talented group of individuals who are committed to developing innovative solutions that have a positive impact on patients, caregivers, and consumers globally. At ZS, we prioritize a client-first approach, working closely with our clients to create tailored solutions and technology products that drive value and yield measurable results in key areas of their business. At ZS, we recognize that our people are our most valuable asset. We celebrate the diverse elements that make up their identities, personal experiences, and belief systems, as these aspects shape who they are and what makes them unique. We believe that your individual interests, identities, and eagerness to learn contribute to your success within our organization. Learn more about our efforts in diversity, equity, and inclusion, as well as the supportive networks available at ZS to help our employees create community spaces, access necessary resources for growth, and amplify the messages they are passionate about. 
In this role, you will have the opportunity to:
- Lead end-to-end projects leveraging cloud technologies to address complex business challenges
- Provide technological expertise to optimize value for clients and project teams
- Implement a robust delivery methodology to ensure projects are completed on time, within budget, and to the clients' satisfaction
- Design technology solutions that are scalable, resilient, and cost-effective
- Mentor and guide project team members to foster continuous learning and professional development
- Demonstrate expertise, effective communication, and strong interpersonal skills in interactions with internal teams and clients
- Collaborate with ZS experts to drive innovation and mitigate project risks
- Engage with global team members to ensure seamless project delivery
- Bring structure to ambiguous tasks in developing business cases with clients
- Support ZS Leadership in business case development, innovation, thought leadership, and team initiatives

We are looking for candidates who meet the following criteria:
- Currently enrolled in the junior year of a Bachelor's program or the first year of a Master's program in Business Analytics, Computer Science, MIS, MBA, or a related field
- A minimum of 5 years of consulting experience in leading large-scale technology implementations
- Strong communication skills to effectively convey technical concepts to diverse audiences
- Demonstrated supervisory, coaching, and hands-on project management abilities
- Extensive experience with major cloud platforms such as AWS, Azure, and GCP
- Proficiency in enterprise data management, advanced analytics, process automation, and application development
- Familiarity with industry-standard products and platforms like Snowflake, Databricks, Redshift, Salesforce, Power BI, Cloud
- Experience in delivering projects utilizing agile methodologies

Additionally, desirable skills include:
- Ability to manage a virtual global team for the timely
execution of multiple projects
- Proficiency in analyzing and troubleshooting interactions between databases, operating systems, and applications
- Willingness to travel to global offices as necessary to collaborate with clients and internal project teams

ZS offers a comprehensive total rewards package that encompasses health and well-being, financial planning, annual leave, personal growth, and professional development. Our commitment to skills development, multiple career advancement options, internal mobility paths, and collaborative culture empowers you to excel both as an individual and as a global team member. We foster a flexible and connected work environment at ZS, enabling a blend of remote work and on-site presence at clients' offices or ZS locations for most of the week. The essence of ZS culture and innovation thrives in planned and spontaneous face-to-face interactions. Travel is an essential component of the role for client-facing ZS employees, as the needs of your project and client take precedence. While some projects may be local, all client-facing ZS staff should be prepared to travel as required. Travel opportunities provide a chance to strengthen client relationships, gain diverse experiences, and enhance professional growth through exposure to different environments and cultures. If you are interested in joining us at ZS, we invite you to apply even if you do not meet all the specified requirements. We are committed to building a diverse and inclusive company where individuals from all backgrounds can contribute their unique perspectives to create life-changing impact and drive better outcomes for all. ZS is an equal opportunity employer that strives to offer equal employment and advancement opportunities without regard to any protected class under applicable law. To complete your application, candidates must be able to obtain work authorization for their intended country of employment.
An online application, along with a comprehensive set of transcripts (official or unofficial), is mandatory for consideration. For further information, visit www.zs.com.
Posted 1 week ago
2.0 - 6.0 years
0 Lacs
maharashtra
On-site
Landor is seeking a talented 3D Motion Designer to join the Global Design Studio in India. As a part of Landor, a world-leading brand specialist, you will play a key role in connecting business strategy to brand identity. Your work will involve bringing brands to life through innovative design solutions and creating brand-led experiences for both talent and customers. To excel in this role, you should have essential software knowledge in tools such as After Effects for motion design and compositing, Blender / Cinema 4D for 3D modelling, and Illustrator, XD (or Figma), and Photoshop for 2D design and image manipulation. Proficiency in Premiere Pro for video editing is also required. Additionally, it would be beneficial to have knowledge of Redshift / Octane or equivalent render engines, creative coding tools like Processing / P5.js / Cinder for generative motion, and Lottie / CSS animation for web and UI motion. Familiarity with the Trapcode suite for procedural effects in After Effects and with Adobe Animate for keyframe animation is also desirable. As an equal opportunity employer, WPP values diversity and considers all applicants for positions without discrimination. The company is dedicated to creating a culture of inclusivity where every individual feels respected, valued, and has equal opportunities for career advancement. If you are passionate about motion design and eager to contribute to the transformative work of building brands, Landor welcomes your application to be a part of a dynamic team that strives to make a positive impact in the industry.
Posted 1 week ago
3.0 - 7.0 years
0 Lacs
chennai, tamil nadu
On-site
The role requires you to understand the business functionalities and technical/database landscapes of the applications under i360. You will collaborate with Database Engineers and Data Analysts to comprehend the requirements and testing needs. Building and maintaining a test automation framework will be a crucial part of your responsibilities, including creating robust and maintainable test cases covering various ETL processes, database systems, and data analyses. Implementing quality engineering strategies, best practices, and guidelines to ensure scalability, reusability, and maintainability is an essential aspect of the role. As part of the position, you will be expected to identify, replicate, and report defects, as well as verify defect fixes. Data accuracy, completeness, and consistency will be validated using ETL tools and SQL queries. Being proactive, adaptable to changes, and possessing strong communication skills (both verbal and written) are key attributes for success in this role. Expertise in DB-related testing and ETL testing, along with strong Python programming skills and proficiency in SQL and data tools like pandas and Great Expectations, is necessary. Knowledge of SQL and experience working with databases such as Redshift, Elasticsearch, OpenSearch, Postgres, and Snowflake is required. Additionally, familiarity with analyzing population data and demographics, version control using GitLab, pipeline integration, and working under pressure with strong attention to detail are essential qualities for this position. The role also calls for self-motivation, good verbal and written communication skills, mentorship and knowledge sharing, experience with Jira, knowledge of Agile methodology, and hands-on DevOps experience such as GitLab CI/CD pipelines. If you possess strong analytical, problem-solving, and troubleshooting skills and stay updated on current market trends, this position might be suitable for you.
This is a Contractual/Temporary job with a Day shift schedule and an in-person work location. To apply for this position, please send your resumes to gopi@nithminds.com.
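The SQL-based validation of accuracy, completeness, and consistency described in the listing above can be sketched with Python's built-in sqlite3 standing in for a warehouse such as Redshift or Snowflake. The table, columns, and thresholds here are invented for illustration:

```python
import sqlite3

# In-memory stand-in for a warehouse table of population records.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE population (id INTEGER, region TEXT, age INTEGER);
    INSERT INTO population VALUES
        (1, 'north', 34), (2, 'south', 51), (3, NULL, 28), (4, 'north', -2);
""")

# Completeness: rows with a missing region.
missing = conn.execute(
    "SELECT COUNT(*) FROM population WHERE region IS NULL"
).fetchone()[0]

# Accuracy: ages outside a plausible range.
bad_age = conn.execute(
    "SELECT COUNT(*) FROM population WHERE age < 0 OR age > 120"
).fetchone()[0]

# Consistency: duplicate primary keys.
dupes = conn.execute(
    "SELECT COUNT(*) FROM "
    "(SELECT id FROM population GROUP BY id HAVING COUNT(*) > 1)"
).fetchone()[0]

print(missing, bad_age, dupes)  # → 1 1 0
```

In practice such counts would feed a test framework or a Great Expectations suite rather than a print statement.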
Posted 1 week ago
2.0 - 6.0 years
0 Lacs
delhi
On-site
As an FX Artist at Black Diamond Media & Production Pvt Ltd in Rohini, Delhi, you will play a crucial role in creating captivating visual effects for high-quality animated content. Your primary responsibilities will involve crafting realistic and stylized effects such as smoke, fire, magic, destruction, liquids, and more to enhance the overall visual appeal of our projects. Your key responsibilities will include creating top-notch visual effects using industry-standard tools, collaborating effectively with animators, lighting, and compositing teams, simulating particles, dynamics, and fluid effects for animation scenes, ensuring that the visual effects align with the project's style and vision, optimizing FX for performance and the rendering pipeline, as well as troubleshooting and resolving technical issues related to FX production. To excel in this role, you should have at least 2 years of experience as an FX Artist in the animation/VFX/gaming industry. Proficiency in software such as Houdini, Maya, Blender, After Effects, or EmberGen is essential. A strong understanding of physics-based simulations, knowledge of render engines like Arnold, Redshift, or Octane, creativity, attention to detail, excellent teamwork, and communication skills are also required. It would be advantageous if you have experience with the Unreal Engine/Niagara FX system, knowledge of Python or MEL scripting for tool development, and familiarity with cartoon-style FX or stylized animation. If you are ready to contribute to redefining the Indian animation industry with groundbreaking visuals and creative storytelling, we invite you to apply by sending your updated portfolio/showreel and CV to hr@blackdiamonds.co.in with the subject line "Application for FX Artist [Your Name]". Join us at Black Diamond Media and be a part of something extraordinary!
Posted 1 week ago
1.0 - 31.0 years
2 - 3 Lacs
Dadar West, Mumbai/Bombay
On-site
Proven experience in 3D for advertising, film, or product visualization. Proficiency in Maya, Blender, Cinema 4D, or 3ds Max. Experience with rendering engines like V-Ray, Redshift, or Arnold. Strong understanding of modeling, UV mapping, lighting, and shading. Familiarity with Substance Painter, Photoshop, and After Effects. Excellent eye for detail, form, and visual aesthetics. Ability to manage time across multiple projects and meet tight deadlines. Portfolio demonstrating high-end CGI work, especially for commercial or product-based ads.
Posted 1 week ago
4.0 - 8.0 years
0 Lacs
hyderabad, telangana
On-site
We are looking for a highly skilled and experienced Senior Power BI Developer to join our dynamic data and analytics team. As a Senior Power BI Developer, you will be responsible for designing, developing, and implementing robust and insightful business intelligence solutions using Power BI. Your role will involve translating complex business requirements into clear, interactive, and high-performance dashboards and reports to drive data-driven decision-making across the organization. As a Senior Power BI Developer, your responsibilities will include leading the design, development, and implementation of complex, interactive, and user-friendly dashboards and reports using Power BI. You will also be required to translate diverse business requirements into technical specifications and impactful data visualizations, as well as develop and optimize Power BI datasets, analyses, and dashboards for performance, scalability, and maintainability. Additionally, you will implement advanced Power BI features such as RBAC (role-based access control), parameters, calculated fields, custom visuals, and dynamic filtering, and ensure data accuracy, consistency, and integrity within all Power BI reports and dashboards. Your role will also involve identifying and addressing performance bottlenecks in Power BI dashboards and underlying data sources, implementing best practices for Power BI security, user access, and data governance, and monitoring Power BI usage and performance to recommend improvements as needed. Furthermore, you will ensure compliance with data security policies and governance guidelines when handling sensitive data within Power BI. To be successful in this role, you should have a Bachelor's degree in Computer Science, Information Technology, Data Analytics, or a related field, along with 8-10 years of overall IT experience, including 4-5 years of hands-on experience designing and developing complex dashboards and reports using Power BI.
You should have strong proficiency in SQL, an in-depth understanding of BI, and extensive experience with various AWS services relevant to data analytics, such as Redshift and RDS. Additionally, you should possess excellent analytical, problem-solving, and critical thinking skills, as well as strong communication and interpersonal skills to collaborate effectively with technical and non-technical stakeholders. Experience working in an Agile development methodology and the ability to work independently, manage multiple priorities, and meet tight deadlines are also essential for this role. Preferred skills that would be beneficial include experience with other BI tools (e.g., Tableau) and proficiency in Python or other scripting languages for data manipulation and automation. You should also stay updated with the latest features, releases, and best practices in Power BI to proactively identify opportunities for process improvement and automation in BI development workflows.
Posted 1 week ago
5.0 - 10.0 years
10 - 20 Lacs
Pune
Hybrid
We are seeking a highly skilled and motivated Data Engineer to join our dynamic team. The ideal candidate will have a strong background in designing, building, and optimizing data pipelines and architectures to support our growing data-driven initiatives. Knowledge of machine learning techniques and frameworks is a significant advantage and will allow you to collaborate closely with our data science team.

Key Responsibilities:
- Design, implement, and maintain scalable data pipelines for collecting, processing, and analyzing large datasets.
- Build and optimize data architectures to support business intelligence, analytics, and machine learning models.
- Collaborate with data scientists, analysts, and software engineers to ensure seamless data integration and accessibility.
- Develop and maintain ETL (Extract, Transform, Load) workflows and tools.
- Monitor and troubleshoot data systems to ensure high availability and performance.
- Implement and enforce best practices for data security, governance, and quality.
- Evaluate and integrate new technologies to enhance data engineering capabilities.

Qualifications:
- Bachelor's or Master's degree in Computer Science, Data Engineering, or a related field.
- Proven experience as a Data Engineer or in a similar role.
- Proficiency in programming languages such as Python, Java, or Scala.
- Hands-on experience with data pipeline tools (e.g., Apache Airflow, AWS Glue).
- Strong knowledge of SQL and database systems (e.g., PostgreSQL, MySQL, MongoDB).
- Experience with cloud platforms (e.g., AWS, Azure, GCP) and big data technologies (e.g., Hadoop, Spark).
- Familiarity with data modeling, schema design, and data warehousing concepts.
- Understanding of CI/CD pipelines and version control systems like Git.

Preferred Skills:
- Familiarity with machine learning frameworks (e.g., TensorFlow, PyTorch, scikit-learn).
- Experience deploying machine learning models and working with MLOps tools.
- Knowledge of distributed systems and real-time data processing (e.g., Kafka, Flink).
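The extract-transform-load flow this listing describes can be sketched in plain Python. The record shape and field names below are made up for illustration; in production each function would typically be a task in an orchestrator such as Airflow or AWS Glue, and the load step would write to a real warehouse:

```python
def extract():
    # Stand-in for reading from an API, S3 object, or source database.
    return [{"user": "a", "amount": "10.5"}, {"user": "b", "amount": "bad"}]

def transform(rows):
    # Cast types and drop rows that fail validation.
    clean = []
    for row in rows:
        try:
            clean.append({"user": row["user"], "amount": float(row["amount"])})
        except ValueError:
            continue  # a real pipeline would route these to a dead-letter store
    return clean

def load(rows, target):
    # Stand-in for a warehouse write (e.g. a PostgreSQL COPY or Redshift load).
    target.extend(rows)
    return len(rows)

warehouse = []
loaded = load(transform(extract()), warehouse)
print(loaded)  # → 1 (the malformed row was dropped)
```

Splitting the stages this way is what lets an orchestrator retry, monitor, and parallelize each step independently.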
Posted 1 week ago
0 years
0 Lacs
Gurgaon, Haryana, India
On-site
dunnhumby is the global leader in Customer Data Science, empowering businesses everywhere to compete and thrive in the modern data-driven economy. We always put the Customer First. Our mission: to enable businesses to grow and reimagine themselves by becoming advocates and champions for their Customers. With deep heritage and expertise in retail – one of the world’s most competitive markets, with a deluge of multi-dimensional data – dunnhumby today enables businesses all over the world, across industries, to be Customer First. dunnhumby employs nearly 2,500 experts in offices throughout Europe, Asia, Africa, and the Americas working for transformative, iconic brands such as Tesco, Coca-Cola, Meijer, Procter & Gamble and Metro. We are seeking a talented Engineering Manager with MLOps expertise to lead a team of engineers in developing products that help retailers transform their Retail Media business in a way that maximizes ad revenue and enables massive scale. As an Engineering Manager, you will play a pivotal role in designing and delivering high-quality software solutions. You will be responsible for leading a team, mentoring engineers, contributing to system architecture, and ensuring adherence to engineering best practices. Your technical expertise, leadership skills, and ability to drive results will be key to the success of our products. What you will be doing: You will lead the charge in ensuring operational efficiency and delivering high-value solutions. You’ll mentor and develop a high-performing team of Big Data and MLOps engineers, driving best practices in software development, data management, and model deployment. With a focus on robust technical design, you’ll ensure solutions are secure, scalable, and efficient. Your role will involve hands-on development to tackle complex challenges, collaborating across teams to define requirements, and delivering innovative solutions.
You’ll keep stakeholders and senior management informed on progress, risks, and opportunities while staying ahead of advancements in AI/ML technologies and driving their application. With an agile mindset, you will overcome challenges and deliver impactful solutions that make a difference.

Technical Expertise
- Proven experience in microservices architecture, with hands-on knowledge of Docker and Kubernetes for orchestration.
- Proficiency in MLOps and machine learning workflows using tools like Spark.
- Strong command of SQL and PySpark programming.
- Expertise in Big Data solutions such as Spark and Hive, with advanced Spark optimization and tuning skills.
- Hands-on experience with Big Data orchestrators like Airflow.
- Proficiency in Python programming, particularly with frameworks like FastAPI or equivalent API development tools.
- Experience in unit testing, code quality assurance, and the use of Git or other version control systems.

Cloud And Infrastructure
- Practical knowledge of cloud-based data stores, such as Redshift and BigQuery (preferred).
- Experience in cloud solution architecture, especially with GCP and Azure.
- Familiarity with GitLab CI/CD pipelines is a bonus.

Monitoring And Scalability
- Solid understanding of logging, monitoring, and alerting systems for production-level big data pipelines.
- Prior experience with scalable architectures and distributed processing frameworks.

Soft Skills And Additional Plus Points
- A collaborative approach to working within cross-functional teams.
- Ability to troubleshoot complex systems and provide innovative solutions.
- Familiarity with GitLab for CI/CD and infrastructure automation tools is an added advantage.

What You Can Expect From Us
We won’t just meet your expectations. We’ll defy them. So you’ll enjoy the comprehensive rewards package you’d expect from a leading technology company. But also, a degree of personal flexibility you might not expect. Plus, thoughtful perks, like flexible working hours and your birthday off.
You’ll also benefit from an investment in cutting-edge technology that reflects our global ambition. But with a nimble, small-business feel that gives you the freedom to play, experiment and learn. And we don’t just talk about diversity and inclusion. We live it every day – with thriving networks including dh Gender Equality Network, dh Proud, dh Family, dh One, dh Enabled and dh Thrive as the living proof. We want everyone to have the opportunity to shine and perform at their best throughout our recruitment process. Please let us know how we can make this process work best for you. Our approach to Flexible Working: At dunnhumby, we value and respect difference and are committed to building an inclusive culture by creating an environment where you can balance a successful career with your commitments and interests outside of work. We believe that you will do your best at work if you have a work/life balance. Some roles lend themselves to flexible options more than others, so if this is important to you please raise this with your recruiter, as we are open to discussing agile working opportunities during the hiring process. For further information about how we collect and use your personal information, please see our Privacy Notice, which can be found here.
Posted 1 week ago
3.0 years
0 Lacs
Gurugram, Haryana, India
On-site
About Us SentiLink provides innovative identity and risk solutions, empowering institutions and individuals to transact confidently with one another. By building the future of identity verification in the United States and reinventing the currently clunky, ineffective, and expensive process, we believe strongly that the future will be 10x better. We’ve had tremendous traction and are growing extremely quickly. Already our real-time APIs have helped verify hundreds of millions of identities, beginning with financial services. In 2021, we raised a $70M Series B round, led by Craft Ventures to rapidly scale our best in class products. We’ve earned coverage and awards from TechCrunch, CNBC, Bloomberg, Forbes, Business Insider, PYMNTS, American Banker, LendIt, and have been named to the Forbes Fintech 50 list consecutively since 2023. Last but not least, we’ve even been a part of history -- we were the first company to go live with the eCBSV and testified before the United States House of Representatives. About The Role Are you passionate about creating world-class solutions that fuel product stability and continuously improve infrastructure operations? We’re looking for a driven Infrastructure Engineer to architect, implement, and maintain powerful observability systems that safeguard the performance and reliability of our most critical systems. In this role, you’ll take real ownership—collaborating with cross-functional teams to shape best-in-class observability standards, troubleshoot complex issues, and fine-tune monitoring tools to exceed SLA requirements. If you’re ready to design high-quality solutions, influence our technology roadmap, and make a lasting impact on our product’s success, we want to meet you! Responsibilities Improve alerting across SentiLink systems and services, developing high quality monitoring capabilities while actively reducing false positives. 
- Troubleshoot, debug, and resolve infrastructure issues as they arise; participate in on-call rotations for production issues.
- Define and refine Service Level Indicators (SLIs), Service Level Objectives (SLOs), and Service Level Agreements (SLAs) in collaboration with product and engineering teams.
- Develop monitoring and alerting configurations using IaC solutions such as Terraform.
- Build and maintain dashboards to provide visibility into system performance and reliability.
- Collaborate with engineering teams to improve root cause analysis processes and reduce Mean Time to Recovery (MTTR).
- Drive cost optimization for observability tools like Datadog, CloudWatch, and Sumo Logic.
- Perform capacity testing to build a deep understanding of infrastructure performance under load, and develop alerting based on the learnings.
- Oversee, develop, and operate Kubernetes and service mesh infrastructure, ensuring smooth performance and reliability.
- Investigate operational alerts, identify root causes, and compile comprehensive root cause analysis reports. Pursue action items relentlessly until they are thoroughly completed.
- Conduct in-depth examinations of database operational issues, actively developing and improving database architecture, schema, and configuration for enhanced performance and reliability.
- Develop and maintain incident response runbooks and improve processes to minimize service downtime.
- Research and evaluate new observability tools and technologies to enhance system monitoring.

Requirements
- 3 years of experience in cloud infrastructure, DevOps, or systems engineering.
- Expertise in AWS and infrastructure-as-code development.
- Experience with CI/CD pipelines and automation tools.
- Experience managing observability platforms, building monitoring dashboards, and configuring high-quality, actionable alerting.
- Strong understanding of Linux systems and networking.
- Familiarity with container orchestration tools like Kubernetes or Docker.
- Excellent analytical and problem-solving skills.
- Experience operating enterprise-size databases. Postgres, Aurora, Redshift, and OpenSearch experience is a plus.
- Experience with Python or Golang is a plus.

Perks
- Employer-paid group health insurance for you and your dependents
- 401(k) plan with employer match (or equivalent for non US-based roles)
- Flexible paid time off
- Regular company-wide in-person events
- Home office stipend, and more!

Corporate Values: Follow Through; Deep Understanding; Whatever It Takes; Do Something Smart
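The SLI/SLO work mentioned in the posting above can be pictured with a small sketch: an SLI is a measured ratio (here, availability), the SLO is its target, and the error budget is whatever failure rate the target leaves over. The request counts and the 99.9% target below are assumptions for illustration:

```python
def error_budget_remaining(total, failed, slo=0.999):
    """Return (availability SLI, fraction of the error budget unspent)."""
    sli = (total - failed) / total   # measured success ratio
    budget = 1.0 - slo               # failure rate the SLO allows
    spent = failed / total           # failure rate actually observed
    return sli, 1.0 - spent / budget

sli, remaining = error_budget_remaining(total=1_000_000, failed=400)
print(round(sli, 4), round(remaining, 2))  # → 0.9996 0.6
```

A team might page when the budget burns faster than expected, rather than on every individual failure, which is one way to reduce the false positives the posting mentions.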
Posted 1 week ago
5.0 - 10.0 years
20 - 35 Lacs
Hyderabad, Pune
Work from Office
We are seeking a Sr. Data Engineer to join our Data Engineering team within our Enterprise Data Insights organization to build data solutions, design and implement ETL/ELT processes, and manage our data platform to enable our cross-functional stakeholders. As a part of our Corporate Engineering division, our vision is to spearhead technology and data-led solutions and experiences to drive growth & innovation at scale. The ideal candidate will have a strong data engineering background, advanced Python knowledge, and experience with cloud services and SQL/NoSQL databases. You will work closely with our cross-functional stakeholders in Product, Finance and GTM along with Business and Enterprise Technology teams.

As a Senior Data Engineer, you will:
- Collaborate closely with various stakeholders to prioritize requests, identify improvements, and offer recommendations.
- Take the lead in analyzing, designing, and implementing data solutions, which involves constructing and designing data models and ETL processes.
- Cultivate collaboration with corporate engineering, product teams, and other engineering groups.
- Lead and mentor engineering discussions, advocating for best practices.
- Actively participate in design and code reviews.
- Access and explore third-party data APIs to determine the data required to meet business needs.
- Ensure data quality and integrity across different sources and systems.
- Manage data pipelines for both analytics and operational purposes.
- Continuously enhance processes and policies to improve SLA and SOX compliance.

You'll be a great addition to the team if you:
- Hold a B.S., M.S., or Ph.D. in Computer Science or a related technical field.
- Possess over 5 years of experience in Data Engineering, focusing on building and maintaining data environments.
- Demonstrate at least 5 years of experience in designing and constructing ETL/ELT processes, managing data solutions within an SLA-driven environment.
- Exhibit a strong background in developing data products, APIs, and maintaining testing, monitoring, isolation, and SLA processes.
- Possess advanced knowledge of SQL/NoSQL databases (such as Snowflake, Redshift, MongoDB).
- Are proficient in programming with Python or other scripting languages.
- Have familiarity with columnar OLAP databases and data modeling.
- Have experience in building ELT/ETL processes using tools like dbt, Airflow, Fivetran, CI/CD using GitHub, and reporting in Tableau.
- Possess excellent communication and interpersonal skills to effectively collaborate with various business stakeholders and translate requirements.

Added bonus if you also have:
- A good understanding of Salesforce & NetSuite systems
- Experience in SaaS environments
- Designed and deployed ML models
- Experience with events and streaming data
Posted 1 week ago
0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
About Soul AI Pods
Deccan AI is a pioneering company founded by IIT Bombay and IIM Ahmedabad alumni, with a strong founding team from top-tier institutes like IITs, NITs, and BITS. We’re hiring for our client servicing arm, Soul AI. Soul AI has a talent network, known as Soul AI Pods, under which highly skilled and vetted tech talent gets the opportunity to work with top-tier global tech companies. Our client list includes tech giants like Google and Snowflake. Read more here.

Responsibilities
- Design, build, and maintain ETL/ELT pipelines using tools like Airflow, DBT, or Spark
- Develop and optimize data lakes, data warehouses, and streaming pipelines
- Ensure data quality, reliability, and lineage across sources and pipelines
- Integrate structured and unstructured data from internal and third-party APIs
- Collaborate with ML teams to deliver production-ready feature pipelines, labeling data, and dataset versioning
- Implement data governance, security, and access control policies

Required Skills
- Strong SQL skills including analytical queries, CTEs, window functions, and query optimization
- Proficient in Python for data manipulation and scripting using libraries like Pandas and NumPy
- Experience with ETL orchestration tools such as Airflow, Prefect, or Luigi
- Hands-on with batch and streaming data processing using Spark, Kafka, or Flink
- Familiarity with data lakes and warehouses (S3, BigQuery, Redshift, Snowflake) and schema design
- Bonus: experience with DBT, data validation, MLOps integration, or compliance-aware data workflows

Application & Other Details
- To apply, fill the Soul AI Pods Interest Form
- You will be invited for the selection process → R1: Test, R2: AI Interview, R3: 1:1 Interview
- We are hiring for full-time or long contract (40 hrs/week) hybrid roles
- We are hiring across different seniority levels
- You will work on a key client project (top-tier tech consulting firm)
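The CTE and window-function skills this listing asks for can be sketched against Python's built-in sqlite3 (window functions require SQLite 3.25+; the table and data are invented for illustration):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (customer TEXT, amount REAL);
    INSERT INTO orders VALUES ('a', 10), ('a', 30), ('b', 20);
""")

# A CTE feeding a window function: each order next to its customer's
# running total, accumulated in amount order within each customer.
rows = conn.execute("""
    WITH ranked AS (
        SELECT customer,
               amount,
               SUM(amount) OVER (
                   PARTITION BY customer
                   ORDER BY amount
                   ROWS UNBOUNDED PRECEDING
               ) AS running_total
        FROM orders
    )
    SELECT * FROM ranked ORDER BY customer, amount
""").fetchall()

print(rows)  # → [('a', 10.0, 10.0), ('a', 30.0, 40.0), ('b', 20.0, 20.0)]
```

The same query shape carries over to warehouses like Redshift or Snowflake, where window functions avoid an extra self-join for per-group aggregates.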
Posted 1 week ago
2.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Description Come help Amazon create cutting-edge data and science-driven technologies for delivering packages to the doorstep of our customers! The Last Mile Routing & Planning organization builds the software, algorithms and tools that make the “magic” of home delivery happen: our flow, sort, dispatch and routing intelligence systems are responsible for the billions of daily decisions needed to plan and execute safe, efficient and frustration-free routes for drivers around the world. Our team supports deliveries (and pickups!) for Amazon Logistics, Prime Now, Amazon Flex, Amazon Fresh, Lockers, and other new initiatives. As part of the Last Mile Science & Technology organization, you’ll partner closely with Product Managers, Data Scientists, and Software Engineers to drive improvements in Amazon's Last Mile delivery network. You will leverage data and analytics to generate insights that accelerate the scale, efficiency, and quality of the routes we build for our drivers through our end-to-end last mile planning systems. You will present your analyses, plans, and recommendations to senior leadership and connect new ideas to drive change. Analytical ingenuity and leadership, business acumen, effective communication capabilities, and the ability to work effectively with cross-functional teams in a fast paced environment are critical skills for this role. 
Responsibilities Create actionable business insights through analytical and statistical rigor to answer business questions, drive business decisions, and develop recommendations to improve operations Collaborate with Product Managers, software engineering, data science, and data engineering partners to design and develop analytic capabilities Define and govern key business metrics, build automated dashboards and analytic self-service capabilities, and engineer data-driven processes that drive business value Navigate ambiguity to develop analytic solutions and shape work for junior team members Basic Qualifications 2+ years of experience analyzing and interpreting data with Redshift, Oracle, NoSQL, etc. Experience with data visualization using Tableau, Quicksight, or similar tools Experience with one or more industry analytics visualization tools (e.g. Excel, Tableau, QuickSight, MicroStrategy, PowerBI) and statistical methods (e.g. t-test, Chi-squared) Experience with a scripting language (e.g., Python, Java, or R) Preferred Qualifications Master's degree or advanced technical degree Knowledge of data modeling and data pipeline design Experience with statistical analysis, correlation analysis Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you’re applying in isn’t listed, please contact your Recruiting Partner. Company - ADCI - Karnataka Job ID: A2994628
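The basic qualifications above mention statistical methods such as the t-test. A hedged sketch of the underlying computation, a two-sample Welch t-statistic in pure Python (the sample data and the metric they represent are made up for illustration):

```python
from statistics import mean, variance

# Welch t-statistic for two independent samples with possibly unequal variances.
# statistics.variance() is the sample (n-1) variance, which is what we want here.
def welch_t(a, b):
    va, vb = variance(a), variance(b)
    return (mean(a) - mean(b)) / ((va / len(a) + vb / len(b)) ** 0.5)

# Invented route-duration samples (minutes) for a control vs. treatment comparison.
control = [10.1, 9.8, 10.3, 10.0, 9.9]
treatment = [9.2, 9.5, 9.1, 9.4, 9.3]

t = welch_t(control, treatment)
print(round(t, 2))
```

In practice one would use scipy.stats.ttest_ind(..., equal_var=False) to also get the p-value; the sketch just shows the statistic itself.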
Posted 1 week ago
8.0 years
30 - 38 Lacs
Gurgaon
Remote
Role: AWS Data Engineer Location: Gurugram Mode: Hybrid Type: Permanent Job Description: We are seeking a talented and motivated Data Engineer with requisite years of hands-on experience to join our growing data team. The ideal candidate will have experience working with large datasets, building data pipelines, and utilizing AWS public cloud services to support the design, development, and maintenance of scalable data architectures. This is an excellent opportunity for individuals who are passionate about data engineering and cloud technologies and want to make an impact in a dynamic and innovative environment. Key Responsibilities: Data Pipeline Development: Design, develop, and optimize end-to-end data pipelines for extracting, transforming, and loading (ETL) large volumes of data from diverse sources into data warehouses or lakes. Cloud Infrastructure Management: Implement and manage data processing and storage solutions in AWS (Amazon Web Services) using services like S3, Redshift, Lambda, Glue, Kinesis, and others. Data Modeling: Collaborate with data scientists, analysts, and business stakeholders to define data requirements and design optimal data models for reporting and analysis. Performance Tuning & Optimization: Identify bottlenecks and optimize query performance, pipeline processes, and cloud resources to ensure cost-effective and scalable data workflows. Automation & Scripting: Develop automated data workflows and scripts to improve operational efficiency using Python, SQL, or other scripting languages. Collaboration & Documentation: Work closely with data analysts, data scientists, and other engineering teams to ensure data availability, integrity, and quality. Document processes, architectures, and solutions clearly. Data Quality & Governance: Ensure the accuracy, consistency, and completeness of data. Implement and maintain data governance policies to ensure compliance and security standards are met. 
Troubleshooting & Support: Provide ongoing support for data pipelines and troubleshoot issues related to data integration, performance, and system reliability. Qualifications: Essential Skills: Experience: 8+ years of professional experience as a Data Engineer, with a strong background in building and optimizing data pipelines and working with large-scale datasets. AWS Experience: Hands-on experience with AWS cloud services, particularly S3, Lambda, Glue, Redshift, RDS, and EC2. ETL Processes: Strong understanding of ETL concepts, tools, and frameworks. Experience with data integration, cleansing, and transformation. Programming Languages: Proficiency in Python, SQL, and other scripting languages (e.g., Bash, Scala, Java). Data Warehousing: Experience with relational and non-relational databases, including data warehousing solutions like AWS Redshift, Snowflake, or similar platforms. Data Modeling: Experience in designing data models, schema design, and data architecture for analytical systems. Version Control & CI/CD: Familiarity with version control tools (e.g., Git) and CI/CD pipelines. Problem-Solving: Strong troubleshooting skills, with an ability to optimize performance and resolve technical issues across the data pipeline. Desirable Skills: Big Data Technologies: Experience with Hadoop, Spark, or other big data technologies. Containerization & Orchestration: Knowledge of Docker, Kubernetes, or similar containerization/orchestration technologies. Data Security: Experience implementing security best practices in the cloud and managing data privacy requirements. Data Streaming: Familiarity with data streaming technologies such as AWS Kinesis or Apache Kafka. Business Intelligence Tools: Experience with BI tools (Tableau, Quicksight) for visualization and reporting. Agile Methodology: Familiarity with Agile development practices and tools (Jira, Trello, etc.) 
Job Type: Permanent Pay: ₹3,000,000.00 - ₹3,800,000.00 per year Benefits: Work from home Schedule: Day shift, Monday to Friday Experience: Data Engineering: 5 years (Required) AWS Elastic MapReduce (EMR): 3 years (Required) AWS: 3 years (Required) Work Location: In person
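The ETL responsibilities in the posting above reduce to extract, transform, and load steps. A minimal stand-in sketch using only the Python standard library (the CSV payload, table name, and cleansing rule are illustrative; a real pipeline would target S3/Redshift/Glue rather than in-memory SQLite):

```python
import csv
import io
import sqlite3

# Extract: read records from a source (here, an in-memory CSV with a missing value).
raw = io.StringIO("id,amount\n1,100\n2,\n3,250\n")
records = list(csv.DictReader(raw))

# Transform: drop rows with missing amounts and cast types.
clean = [(int(r["id"]), float(r["amount"])) for r in records if r["amount"]]

# Load: write into a warehouse stand-in (in-memory SQLite).
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE sales (id INTEGER, amount REAL)")
db.executemany("INSERT INTO sales VALUES (?, ?)", clean)

total = db.execute("SELECT SUM(amount) FROM sales").fetchone()[0]
print(total)
```

Each stage here maps onto an AWS service named in the posting: extraction from S3, transformation in a Glue job, loading into Redshift.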
Posted 1 week ago
3.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Job Description A Data Engineer Extraordinaire will possess masterful proficiency in crafting scalable and efficient solutions for data processing and analysis. With expertise in database management, ETL processes, and data modelling, they design robust pipelines using cutting-edge technologies such as Apache Spark and Hadoop. Their proficiency extends to cloud platforms like AWS, Azure, or Google Cloud Platform, where they leverage scalable resources to build resilient data ecosystems. This exceptional individual possesses a deep understanding of business requirements, collaborating closely with stakeholders to ensure that data infrastructure aligns with organizational objectives. Through their technical acumen and innovative spirit, they pave the way for data-driven insights and empower organizations to thrive in the digital age. Key Responsibilities Develop and maintain cutting-edge data pipeline architecture, ensuring optimal performance and scalability. Building seamless ETL pipelines for diverse sources leveraging advanced big data technologies Craft advanced analytics tools that leverage the robust data pipeline, delivering actionable insights to drive business decisions Prototype and iterate test solutions for identified functional and technical challenges, driving innovation and problem-solving Champion ETL best practices and standards, ensuring adherence to industry-leading methodologies Collaborate closely with stakeholders across Executive, Product, Data, and Design teams, addressing data-related technical challenges and supporting their infrastructure needs Thrive in a dynamic, cross-functional environment, working collaboratively to drive innovation and deliver impactful solutions Required Skills and Qualifications Proficient in SQL, Python, Spark, and data transformation techniques Experience with Cloud Platforms AWS, Azure, or Google Cloud (GCP) for deploying and managing data services Data Orchestration Proficient in using orchestration tools such as
Apache Airflow, Azure Data Factory (ADF), or similar tools for managing complex workflows Data Platform Experience Hands-on experience with Databricks or similar platforms for data engineering workloads Familiarity with Data Lakes and Warehouses Experience working with data lakes, data warehouses (Redshift/SQL Server/Big Query), and big data processing architectures Version Control & CI/CD Proficient in Git, GitHub, or similar version control systems, and comfortable working with CI/CD pipelines Data Security Knowledge of data governance, encryption, and compliance practices within cloud environments Problem-solving Analytical thinking and problem-solving mindset, with a passion for optimizing data workflows Preferred Skills and Qualifications Bachelor's degree or equivalent degrees in computer science, Engineering, or a related field 3+ years of experience in data engineering or related roles Hands-on experience with distributed computing and parallel data processing Good to Have Streaming Tools Experience with Kafka, Event Hubs, Amazon SQS, or equivalent streaming technologies Experience in Containerization Familiarity with Docker and Kubernetes for deploying scalable data solutions Engage in peer review processes and present research findings at esteemed ML/AI conferences such as NIPS, ICML, AAAI and COLT Experiment with latest advancements in Data Engineering tools, platforms, and methodologies. Mentor peers and junior members and handle multiple projects at the same time Participate and speak at various external forums such as research conferences and technical summits Promote and support company policies, procedures, mission, values, and standards of ethics and integrity Certifications in AWS, Azure, or GCP are a plus Understanding of modern data architecture patterns, including the Lambda and Kappa architectures
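Orchestration tools like Airflow or Azure Data Factory, mentioned above, express pipelines as dependency graphs (DAGs). A toy sketch of the ordering guarantee such a tool enforces, using Python's standard-library graphlib (the task names and dependencies are invented for illustration):

```python
from graphlib import TopologicalSorter

# Each key is a task; its value is the set of tasks that must finish first.
# An orchestrator resolves this graph and runs tasks in dependency order.
dag = {
    "load_warehouse": {"transform"},
    "transform": {"extract_api", "extract_db"},
    "extract_api": set(),
    "extract_db": set(),
}

order = list(TopologicalSorter(dag).static_order())
print(order)
```

A real Airflow DAG declares the same structure with operators and `>>` dependencies, plus scheduling, retries, and monitoring on top.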
Posted 1 week ago
2.0 - 4.0 years
3 - 7 Lacs
Thiruvananthapuram
Remote
About the Company Armada is an edge computing startup that provides computing infrastructure to remote areas where connectivity and cloud infrastructure are limited, as well as areas where data needs to be processed locally for real-time analytics and AI at the edge. We’re looking to bring on the most brilliant minds to help further our mission of bridging the digital divide with advanced technology infrastructure that can be rapidly deployed anywhere. About the role We are looking for a detail-oriented and technically skilled BI Engineer to design, build, and maintain robust data pipelines and visualization tools that empower data-driven decision-making across the organization. The ideal candidate will work closely with stakeholders to translate business needs into actionable insights by developing and optimizing BI solutions. Location: This role is office-based at our Trivandrum, Kerala office. What You'll Do (Key Responsibilities) Design, develop, and maintain scalable ETL (Extract, Transform, Load) pipelines to support data integration from multiple sources. Build and optimize data models and data warehouses for business reporting and analysis. Develop dashboards, reports, and data visualizations using BI tools (e.g., Power BI, Tableau, Looker, etc.). Collaborate with data analysts, data scientists, and business stakeholders to understand reporting needs and deliver effective solutions. Ensure data accuracy, consistency, and integrity across reporting systems. Perform data validation, cleansing, and transformation as necessary. Identify opportunities to automate processes and improve reporting efficiency. Monitor BI tools and infrastructure performance, and troubleshoot issues as needed. Stay up-to-date with emerging BI technologies and best practices. Required Qualifications Bachelor’s degree in Computer Science, Information Systems, Data Science, or a related field. 2–4 years of experience as a BI Engineer, Data Engineer, or similar role.
Proficiency in SQL and experience with data modeling and data warehousing (e.g., Snowflake, Redshift, BigQuery). Experience with BI and data visualization tools (e.g., Power BI, Tableau, Qlik, Looker). Strong understanding of ETL processes and data pipeline design. Excellent problem-solving skills and attention to detail. Preferred: Experience with Python, R, or other scripting languages for data manipulation. Familiarity with cloud platforms (e.g., AWS, Azure, Google Cloud Platform). Knowledge of version control (e.g., Git) and CI/CD practices. Experience with APIs, data governance, and data cataloging tools. Compensation We offer a competitive base salary along with equity options, providing an opportunity to share in the success and growth of Armada. #LI-JV1 #LI-Onsite You're a Great Fit if You're A go-getter with a growth mindset. You're intellectually curious, have strong business acumen, and actively seek opportunities to build relevant skills and knowledge A detail-oriented problem-solver. You can independently gather information, solve problems efficiently, and deliver results with a "get-it-done" attitude Thrive in a fast-paced environment. You're energized by an entrepreneurial spirit, capable of working quickly, and excited to contribute to a growing company A collaborative team player. You focus on business success and are motivated by team accomplishment vs personal agenda Highly organized and results-driven. Strong prioritization skills and a dedicated work ethic are essential for you Equal Opportunity Statement At Armada, we are committed to fostering a work environment where everyone is given equal opportunities to thrive. As an equal opportunity employer, we strictly prohibit discrimination or harassment based on race, color, gender, religion, sexual orientation, national origin, disability, genetic information, pregnancy, or any other characteristic protected by law. 
This policy applies to all employment decisions, including hiring, promotions, and compensation. Our hiring is guided by qualifications, merit, and the business needs at the time.
Posted 1 week ago
3.0 years
7 - 8 Lacs
Hyderābād
On-site
Full-time Employee Status: Regular Role Type: Hybrid Department: Information Technology & Systems Schedule: Full Time Company Description Experian is a global data and technology company, powering opportunities for people and businesses around the world. We help to redefine lending practices, uncover and prevent fraud, simplify healthcare, create marketing solutions, and gain deeper insights into the automotive market, all using our unique combination of data, analytics and software. We also assist millions of people to realize their financial goals and help them save time and money. We operate across a range of markets, from financial services to healthcare, automotive, agribusiness, insurance, and many more industry segments. We invest in people and new advanced technologies to unlock the power of data. As a FTSE 100 Index company listed on the London Stock Exchange (EXPN), we have a team of 22,500 people across 32 countries. Our corporate headquarters are in Dublin, Ireland. Learn more at experianplc.com. Job Description What you’ll be doing The Senior Data Engineer will help build the next generation of cloud-based data tools and reporting for Experian’s MCE contact center division. Valuable, accurate, and timely information is core to our success, and this highly impactful role will be an essential part of that. Delivery pace and meeting our commitments are a primary focus to ensure that we are providing information at the speed of business. As part of this, understanding the business-side logic, environment, and workflows is important; in effect, we need someone who is an incredible problem solver. If you are a self-driven, determined engineer who loves data, creating cutting-edge tools, and moving fast, this position is for you! We are a results-oriented team that is looking to attract and reward high-performing individuals. Come join us!
Responsibilities include: Complex Dataset Construction : Construct datasets using complex, custom stored procedures, views, and queries. Strong SQL development skills are a must, preferably within Redshift and/or PostgreSQL . Full-stack Data Solutions : Develop full lifecycle data solutions from data ingestion (using custom AWS-based data movement/ETL processes via Glue with Python code) to downstream real-time and historical reports. Business Need to Execution Focus : Understand data-driven business objectives and develop solutions leveraging various technologies and solve for those needs. Along with great problem-solving skills, a strong desire to learn our operational environment is a necessity. Delivery Speed Enablement : Build reusable data-related tools, CI/CD pipelines, and automated testing. Enable DevOps model usage focused on continuous improvement, and ultimately reduce unnecessary dependencies. Shift Security Left : Ensure security components and requirements are implemented via automation up-front as part of all solutions being developed. Focus on the Future : Stay current on industry best practices and emerging technologies and proactively translate those into data platform improvements. Be a Great Team Player : Train team members in proper coding techniques, create proper documentation as needed, and be a solid leader on the team as a senior-level engineer. Support US Operations : Operate partially within US Eastern time zone to ensure appropriate alignment and coordination with the US-based teams. What your background looks like Qualifications Required: Extensive experience in modern data manipulation and preparation via SQL code and translating business requirements into usable reports. Solid automation skillset and ability to design and create solutions to drive out manual data/report assembly processes within an organization. Experience constructing reports within a BI tool while also taking ownership of upstream and downstream elements. 
Able to create CI/CD pipelines that perform code deployments and automated testing. Ability to identify business needs and proactively create reporting tools that will consistently add value. Strong ability and willingness to help others and be an engaged part of the team. Patience and a collaborative personality are a must; we need a true team player who can help strengthen our overall group. Goal-driven individual; must have a proven career track record of achievement. We want the best of the best and reward stellar performers! Skills: 3+ years developing complex SQL code required, preferably within Redshift and/or PostgreSQL 1+ years using Python, Java, C#, or other similar object-oriented language CI/CD pipeline construction, preferably using GitHub Actions Git experience General knowledge of AWS Services, with a preference in Glue and Lambda. Infrastructure-as-code (CloudFormation, Terraform, or similar product) a plus Google Looker experience a plus (not required) Qualifications We are looking for 4 to 8 years of experience overall, including the skills listed above. Additional Information Our uniqueness is that we celebrate yours. Experian's culture and people are important differentiators. We take our people agenda very seriously and focus on what matters; DEI, work/life balance, development, authenticity, collaboration, wellness, reward & recognition, volunteering... the list goes on.
Experian's people first approach is award-winning; World's Best Workplaces™ 2024 (Fortune Top 25), Great Place To Work™ in 24 countries, and Glassdoor Best Places to Work 2024 to name a few. Check out Experian Life on social or our Careers Site to understand why. Experian is proud to be an Equal Opportunity and Affirmative Action employer. Innovation is an important part of Experian's DNA and practices, and our diverse workforce drives our success. Everyone can succeed at Experian and bring their whole self to work, irrespective of their gender, ethnicity, religion, colour, sexuality, physical ability or age. If you have a disability or special need that requires accommodation, please let us know at the earliest opportunity. Experian Careers - Creating a better tomorrow together
Posted 1 week ago
6.0 years
7 - 8 Lacs
Hyderābād
On-site
Full-time Employee Status: Regular Role Type: Hybrid Department: Analytics Schedule: Full Time Company Description Experian is a global data and technology company, powering opportunities for people and businesses around the world. We help to redefine lending practices, uncover and prevent fraud, simplify healthcare, create marketing solutions, and gain deeper insights into the automotive market, all using our unique combination of data, analytics and software. We also assist millions of people to realize their financial goals and help them save time and money. We operate across a range of markets, from financial services to healthcare, automotive, agribusiness, insurance, and many more industry segments. We invest in people and new advanced technologies to unlock the power of data. As a FTSE 100 Index company listed on the London Stock Exchange (EXPN), we have a team of 22,500 people across 32 countries. Our corporate headquarters are in Dublin, Ireland. Learn more at experianplc.com. Job Description The Senior Data Engineer is responsible for designing, developing, and supporting ETL data pipeline solutions, primarily in an AWS environment. Design, develop, and maintain scaled ETL processes to deliver meaningful insights from large and complicated data sets. Work as part of a team to build out and support the data warehouse; implement solutions using PySpark to process structured and unstructured data. Play a key role in building out a semantic layer through development of ETLs and virtualized views. Collaborate with Engineering teams to discover and leverage new data being introduced into the environment. Support existing ETL processes written in SQL, or leveraging third-party APIs with Python; troubleshoot and resolve production issues. Strong SQL and data skills to understand and troubleshoot existing complex SQL.
Hands-on experience with Apache Airflow or equivalent tools (AWS MWAA) for orchestration of data pipelines. Create and maintain report specifications and process documentation as part of the required data deliverables. Serve as liaison with business and technical teams to achieve project objectives, delivering cross-functional reporting solutions. Troubleshoot and resolve data, system, and performance issues. Communicate with business partners, other technical teams, and management to collect requirements, articulate data deliverables, and provide technical designs. Qualifications Graduation completed (BE/BTech) 6 to 9 years of experience in Data Engineering development 5 years of experience in Python scripting 8 years of experience in SQL, 5+ years in data warehousing, 5 years in Agile development methodology, and 3 years with cloud 3 years of experience with the AWS ecosystem (Redshift, EMR, S3, MWAA) You will work with the team to create solutions Proficiency in CI/CD tools (Jenkins, GitLab, etc.) Additional Information Our uniqueness is that we celebrate yours. Experian's culture and people are important differentiators. We take our people agenda very seriously and focus on what matters; DEI, work/life balance, development, authenticity, collaboration, wellness, reward & recognition, volunteering... the list goes on. Experian's people first approach is award-winning; World's Best Workplaces™ 2024 (Fortune Top 25), Great Place To Work™ in 24 countries, and Glassdoor Best Places to Work 2024 to name a few. Check out Experian Life on social or our Careers Site to understand why. Experian is proud to be an Equal Opportunity and Affirmative Action employer. Innovation is an important part of Experian's DNA and practices, and our diverse workforce drives our success.
Everyone can succeed at Experian and bring their whole self to work, irrespective of their gender, ethnicity, religion, colour, sexuality, physical ability or age. If you have a disability or special need that requires accommodation, please let us know at the earliest opportunity. #LI-Onsite Benefits Experian cares for employees' work-life balance, health, safety, and wellbeing. In support of this endeavor, we offer family well-being benefits, enhanced medical benefits, and paid time off. Experian Careers - Creating a better tomorrow together
Posted 1 week ago