
1802 Redshift Jobs - Page 24

JobPe aggregates job listings for easy access, but applications are submitted directly on the original job portal.

10.0 years

0 Lacs

Greater Kolkata Area

On-site

Source: LinkedIn

Line of Service: Advisory
Industry/Sector: Not Applicable
Specialism: Data, Analytics & AI
Management Level: Manager

Job Description & Summary:
A career within Data and Analytics services will provide you with the opportunity to help organisations uncover enterprise insights and drive business results using smarter data analytics. We focus on a collection of organisational technology capabilities, including business intelligence, data management, and data assurance, that help our clients drive innovation, growth, and change within their organisations in order to keep up with the changing nature of customers and technology. We make impactful decisions by mixing mind and machine to leverage data, understand and navigate risk, and help our clients gain a competitive edge. Creating business intelligence from data requires an understanding of the business, the data, and the technology used to store and analyse that data. Using our Rapid Business Intelligence Solutions, data visualisation, and integrated reporting dashboards, we can deliver agile, highly interactive reporting and analytics that help our clients run their business more effectively and understand which business questions can be answered and how to unlock the answers.

Why PwC:
At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes, and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us.

At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences, and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm's growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.

Key Responsibilities:
- Project Management: Lead the end-to-end delivery of data projects, including Data Warehouse, Data Lake, and Lakehouse solutions. Develop detailed project plans, allocate resources, and monitor project progress to ensure timely and within-budget delivery. Identify and mitigate risks, ensuring successful project outcomes.
- Technical Leadership: Provide technical oversight and guidance on best practices in data engineering, cloud architecture, and data management. Ensure solutions are scalable, robust, and aligned with industry standards and client requirements. Oversee the design, development, and implementation of data solutions using Azure or AWS and Databricks.
- Client Engagement: Engage with clients to understand their business needs and translate them into technical requirements. Build and maintain strong relationships with key client stakeholders. Present complex technical concepts and solutions in a clear and concise manner to non-technical stakeholders.
- Team Leadership: Lead and mentor a team of data engineers, fostering a collaborative and high-performance culture. Provide guidance and support to team members in their professional development and project delivery. Ensure the team is equipped with the necessary tools and resources to succeed.
- Solution Development: Develop and implement data pipelines, ETL processes, and data integration solutions using Azure Data Factory, AWS Glue, Databricks, and other relevant tools. Optimize data storage and retrieval performance, ensuring data quality and integrity. Leverage advanced analytics and machine learning capabilities within Databricks to drive business insights.
- Continuous Improvement: Stay up to date with the latest advancements in Azure, AWS, Databricks, and data engineering technologies. Implement best practices and continuous improvement initiatives to enhance the efficiency and effectiveness of data engineering processes. Foster a culture of innovation and experimentation within the team.

Skills & Competencies:
- Strong problem-solving and analytical skills.
- Deep technical expertise in Azure, Google Cloud, or AWS, and Databricks.
- Exceptional project management and organizational abilities.
- High level of emotional intelligence and client empathy.
- Proficiency in data warehousing and data lake concepts (e.g., SCD1, SCD2, dimensional modeling, KPIs and measures, data catalogs, star and snowflake schemas, Delta Tables and Delta Live Tables).
- Data warehousing solutions (e.g., Azure Synapse, Azure SQL, ADLS Gen2, and Blob Storage for Azure; Redshift, S3, AWS Glue, and AWS Lambda for AWS; Google data management technologies).
- Data lake solutions (e.g., MS Fabric, Purview, AWS Lakehouse, BigQuery, and Bigtable).
- Lakehouse solutions (e.g., Databricks, Unity Catalog, Python, and PySpark).
- Data visualization tools (e.g., Power BI, Tableau) are a plus.

Mandatory Skill Sets: Project Management, Azure, AWS
Preferred Skill Sets: Project Management, Azure, AWS
Years of Experience Required: 10+ years
Education Qualification: BE, B.Tech, MCA, M.Tech
Degrees/Fields of Study Required: Master of Engineering, Bachelor of Engineering
Required Skills: AWS DevOps, Microsoft Azure, Waterfall Model
Optional Skills: Accepting Feedback, Active Listening, Analytical Reasoning, Analytical Thinking, Application Software, Business Data Analytics, Business Management, Business Technology, Business Transformation, Coaching and Feedback, Communication, Creativity, Documentation Development, Embracing Change, Emotional Regulation, Empathy, Implementation Research, Implementation Support, Implementing Technology, Inclusion, Intellectual Curiosity, Learning Agility, Optimism, Performance Assessment (+ 21 more)
Travel Requirements: Not Specified
Available for Work Visa Sponsorship? No
Government Clearance Required? No
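The proficiency list above calls out SCD2 handling on Delta Tables with PySpark. As a rough illustration (not PwC's methodology), here is a minimal SCD Type 2 upsert sketch using Delta Lake's MERGE API; the table and column names (dim_customer, customer_id, address, is_current, end_date) are hypothetical, and a production job would also restrict the final append to new and changed keys.

```python
from delta.tables import DeltaTable
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()  # pre-created on Databricks

# Staged source rows and the target dimension (hypothetical names)
updates = spark.read.format("delta").load("/mnt/staging/customers")
dim = DeltaTable.forName(spark, "dim_customer")

# Step 1: close out current rows whose tracked attribute changed
(dim.alias("t")
    .merge(updates.alias("s"),
           "t.customer_id = s.customer_id AND t.is_current = true")
    .whenMatchedUpdate(
        condition="t.address <> s.address",
        set={"is_current": "false", "end_date": "current_date()"})
    .execute())

# Step 2: append the incoming versions as the new current rows.
# (A production job would first filter to new and changed keys only.)
(updates
    .withColumn("is_current", F.lit(True))
    .withColumn("end_date", F.lit(None).cast("date"))
    .write.format("delta").mode("append").saveAsTable("dim_customer"))
```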

Posted 1 week ago

Apply

8.0 - 10.0 years

2 - 8 Lacs

Hyderābād

On-site

Source: Glassdoor

Join Amgen's Mission of Serving Patients

At Amgen, if you feel like you're part of something bigger, it's because you are. Our shared mission—to serve patients living with serious illnesses—drives all that we do. Since 1980, we've helped pioneer the world of biotech in our fight against the world's toughest diseases. With our focus on four therapeutic areas (Oncology, Inflammation, General Medicine, and Rare Disease), we reach millions of patients each year. As a member of the Amgen team, you'll help make a lasting impact on the lives of patients as we research, manufacture, and deliver innovative medicines to help people live longer, fuller, happier lives. Our award-winning culture is collaborative, innovative, and science-based. If you have a passion for challenges and the opportunities that lie within them, you'll thrive as part of the Amgen team. Join us and transform the lives of patients while transforming your career.

Senior Manager Technology – US Commercial Data & Analytics

What you will do:
Let's do this. Let's change the world. In this vital role you will lead the engagement model between Amgen's Technology organization and our global business partners in Commercial Data & Analytics. We seek a technology leader with a passion for innovation and a collaborative working style that partners effectively with business and technology leaders. Are you interested in building a team that consistently delivers business value in an agile model using technologies such as AWS, Databricks, Airflow, and Tableau? Come join our team!

Roles & Responsibilities:
- Establish an effective engagement model to collaborate with senior leaders on the Sales Insights product team within the Commercial Data & Analytics organization, focused on operations within the United States
- Serve as the technology product owner for an agile product team committed to delivering business value to Commercial stakeholders via data pipeline buildout for sales data
- Lead and mentor junior team members to deliver on the needs of the business
- Interact with business clients and technology management to create technology roadmaps and business cases, and drive DevOps to achieve the roadmaps
- Help mature Agile operating principles through deployment of creative and consistent practices for user story development, robust testing and quality oversight, and a focus on user experience
- Connect and transform our vast array of Commercial and other functional data sources, including Sales, Activity, and Digital data, into consumable and user-friendly modes (e.g., dashboards, reports, mobile) for key decision makers such as executives, brand leads, account managers, and field representatives
- Become the lead subject matter expert in reporting technology capabilities by researching and implementing new tools, features, and internal and external methodologies

What we expect of you:
We are all different, yet we all use our unique contributions to serve patients.

Basic Qualifications:
- Master's degree with 8-10 years of experience in Information Systems, OR
- Bachelor's degree with 10-14 years of experience in Information Systems, OR
- Diploma with 14-18 years of experience in Information Systems

Must-Have Skills:
- Excellent problem-solving skills and a passion for tackling complex challenges in data and analytics with technology
- Experience leading data and analytics teams in a Scaled Agile Framework (SAFe)
- Excellent interpersonal skills, strong attention to detail, and ability to influence based on data and business value
- Ability to build compelling business cases with accurate cost and effort estimations
- Experience writing user requirements and acceptance criteria in agile project management systems such as Jira
- Ability to explain sophisticated technical concepts to non-technical clients
- Strong understanding of sales and incentive compensation value streams

Preferred Qualifications:
- Jira Align and Confluence experience
- Experience with DevOps, Continuous Integration, and Continuous Delivery methodology
- Understanding of software systems strategy, governance, and infrastructure
- Experience managing product features for PI planning and developing product roadmaps and user journeys
- Familiarity with low-code/no-code test automation software
- Technical thought leadership

Soft Skills:
- Able to work effectively across multiple geographies (primarily India, Portugal, and the United States) under minimal supervision
- Demonstrated proficiency in written and verbal communication in English
- Skilled in providing oversight and mentoring team members; demonstrated ability to delegate work effectively
- Intellectual curiosity and the ability to question partners across functions
- Ability to prioritize successfully based on business value
- High degree of initiative and self-motivation
- Ability to manage multiple priorities successfully across virtual teams
- Team-oriented, with a focus on achieving team goals
- Strong presentation and public speaking skills

Technical Skills:
- ETL tools: experience with Databricks, Redshift, or an equivalent cloud-based database
- Big Data, analytics, reporting, Data Lake, and data integration technologies
- S3 or an equivalent storage system
- AWS or similar cloud-based platforms
- BI tools (Tableau and Power BI preferred)

What you can expect of us:
As we work to develop treatments that take care of others, we also work to care for your professional and personal growth and well-being. From our competitive benefits to our collaborative culture, we'll support your journey every step of the way. In addition to the base salary, Amgen offers competitive and comprehensive Total Rewards Plans that are aligned with local industry standards. Apply now and make a lasting impact with the Amgen team. careers.amgen.com

As an organization dedicated to improving the quality of life for people around the world, Amgen fosters an inclusive environment of diverse, ethical, committed, and highly accomplished people who respect each other and live the Amgen values to continue advancing science to serve patients. Together, we compete in the fight against serious disease. Amgen is an Equal Opportunity employer and will consider all qualified applicants for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, disability status, or any other basis protected by applicable law.
We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.
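The role above centers on Airflow-orchestrated data pipelines for sales data. Below is a minimal sketch of such a DAG (Airflow 2.x); the DAG and task names are hypothetical and the extract/load bodies are placeholders, not Amgen's pipeline.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract_sales(**context):
    # Placeholder: pull raw sales files from the upstream source
    print("extracting sales data for", context["ds"])

def load_to_warehouse(**context):
    # Placeholder: load the transformed partition into the warehouse
    print("loading partition", context["ds"])

with DAG(
    dag_id="us_sales_daily",          # hypothetical DAG name
    start_date=datetime(2025, 1, 1),
    schedule="@daily",                # Airflow 2.4+ keyword
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract_sales",
                             python_callable=extract_sales)
    load = PythonOperator(task_id="load_to_warehouse",
                          python_callable=load_to_warehouse)
    extract >> load
```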

Posted 1 week ago

Apply

5.0 years

6 - 9 Lacs

Hyderābād

On-site

Source: Glassdoor

Job Description

Our company is an innovative, global healthcare leader committed to improving health and well-being around the world with a diversified portfolio of prescription medicines, vaccines, and animal health products. We continue to focus our research on conditions that affect millions of people around the world - diseases like Alzheimer's, diabetes, and cancer - while expanding our strengths in areas like vaccines and biologics. Our ability to excel depends on the integrity, knowledge, imagination, skill, diversity, and teamwork of an individual like you. To this end, we strive to create an environment of mutual respect, encouragement, and teamwork. As part of our global team, you'll have the opportunity to collaborate with talented and dedicated colleagues while developing and expanding your career.

As a Digital Supply Chain Data Modeler/Engineer, you will work as a member of the Digital Manufacturing Division team supporting the Enterprise Orchestration Platform. You will be responsible for identifying, assessing, and solving complex business problems related to manufacturing and supply chain. You will receive training to achieve this, and you'll be amazed at the diversity of opportunities to develop your potential and grow professionally. You will collaborate with business stakeholders to determine the analytical capabilities that enable the creation of insights-focused solutions aligned to business needs, and ensure that delivery of these solutions meets quality requirements.

The Opportunity

Based in Hyderabad, join a global healthcare biopharma company and be part of a 130-year legacy of success backed by ethical integrity, forward momentum, and an inspiring mission to achieve new milestones in global healthcare. Be part of an organization driven by digital technology and data-backed approaches that support a diversified portfolio of prescription medicines, vaccines, and animal health products. Drive innovation and execution excellence. Be a part of a team with a passion for using data, analytics, and insights to drive decision-making, and which creates custom software, allowing us to tackle some of the world's greatest health threats.

Our Technology Centers focus on creating a space where teams can come together to deliver business solutions that save and improve lives. An integral part of our company's IT operating model, Tech Centers are globally distributed locations where each IT division has employees to enable our digital transformation journey and drive business outcomes. These locations, in addition to our other sites, are essential to supporting our business and strategy. A focused group of leaders in each Tech Center helps ensure we can manage and improve each location, from investing in the growth, success, and well-being of our people, to making sure colleagues from each IT division feel a sense of belonging, to managing critical emergencies. Together, we leverage the strength of our team to collaborate globally, optimize connections, and share best practices across the Tech Centers.

Role Responsibilities

As Data Modeler lead, you will be responsible for the following (but not limited to):
- Deliver divisional analytics initiatives with a primary focus on data modeling for all analytics, advanced analytics, and AI/ML use cases, e.g., self-service, business intelligence and analytics, data exploration, and data wrangling.
- Host and lead requirement/process workshops to understand data modeling requirements.
- Analyze business requirements and work with the architecture team to deliver and contribute to feasibility analyses, implementation plans, and high-level estimates.
- Based on business processes and analysis of data sources, deliver detailed ETL designs with data model mappings covering all areas of data warehousing for all analytics use cases.
- Create data models and transformation mappings in a modeling tool and deploy them to databases, including creation of scheduled orchestration jobs.
- Deploy data modeling configurations to target systems (SIT, UAT, and Prod).
- Understand product ownership and management; lead the data model as a product for focus areas of the digital supply chain domain.
- Create the required SDLC documentation as per project requirements.
- Optimize and industrialize existing database and data transformation solutions.
- Prepare and update data modeling and data warehousing best practices along with foundational platforms.
- Work very closely with foundational product teams, business, vendors, and technology support teams to deliver business initiatives.

Position Qualifications

Education Minimum Requirement:
- B.S. or M.S. in IT, Engineering, Computer Science, or related fields.

Required Experience and Skills:
- 5+ years of relevant work experience, with demonstrated expertise in data modeling in DWH, Data Mesh, or other analytics-related implementations; experience implementing end-to-end DWH solutions, including designing the DWH and deploying the solution.
- 3+ years of experience creating logical and physical data models in a modeling tool (SAP PowerDesigner, WhereScape, etc.).
- Experience creating data modeling standards, best practices, and implementation processes.
- High proficiency in information management, data analysis, and reporting requirement elicitation.
- Experience extracting business rules to develop transformations, data lineage, and dimensional data models.
- Experience validating legacy and newly developed data model outputs.
- Development experience using WhereScape and similar ETL/data modeling tools.
- Exposure to Qlik or similar BI dashboarding applications.
- Advanced knowledge of SQL and data transformation practices.
- Deep understanding of data modeling and preparation of optimal data structures.
- Able to communicate with business, data transformation, and reporting teams.
- Knowledge of ETL methods and a willingness to learn ETL technologies.
- Fluent communication in English.
- Experience in Redshift or similar databases, including DDL, DML, query optimization, schema management, security, etc.
- Experience with Airflow or similar orchestration tools.
- Exposure to CI/CD tools.
- Exposure to AWS services such as S3, the AWS Console, Glue, and Spectrum.
- Able to independently support business discussions and analyze, develop, and deliver code.
- Able to work with US and European teams, with overlapping work hours.

Preferred Experience and Skills:
- Experience working on projects where Agile methodology is leveraged.
- Understanding of data management best practices and data analytics.
- Ability to lead requirements sessions with clients and project teams.
- Strong leadership and verbal and written communication skills, with the ability to articulate results and issues to internal and client teams.
- Demonstrated experience in the life science space.
- Exposure to SAP and RapidResponse domain data is a plus.

Current Employees apply HERE
Current Contingent Workers apply HERE

Search Firm Representatives, Please Read Carefully: Merck & Co., Inc., Rahway, NJ, USA, also known as Merck Sharp & Dohme LLC, Rahway, NJ, USA, does not accept unsolicited assistance from search firms for employment opportunities. All CVs/resumes submitted by search firms to any employee at our company without a valid written search agreement in place for this position will be deemed the sole property of our company. No fee will be paid in the event a candidate is hired by our company as a result of an agency referral where no pre-existing agreement is in place. Where agency agreements are in place, introductions are position specific. Please, no phone calls or emails.

Employee Status: Regular
Travel Requirements: Flexible
Flexible Work Arrangements: Not Applicable
Required Skills: Agile Data Warehousing, Agile Methodology, Animal Vaccination, Business, Business Communications, Business Intelligence (BI), Computer Science, Database Administration, Data Engineering, Data Management, Data Modeling, Data Visualization, Data Warehousing (DW), Design Applications, Digital Supply Chain, Digital Supply Chain Management, Digital Transformation, Information Management, Information Technology Operations, Physical Data Models, Software Development, Software Development Life Cycle (SDLC), Supply Chain Optimization, Supply Management, System Designs
Job Posting End Date: 06/30/2025. A job posting is effective until 11:59:59 PM on the day before the listed end date; please apply no later than the day before the job posting end date.
Requisition ID: R351878
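The requirements above name Redshift DDL and schema management. As a sketch of what that looks like in practice, here is dimensional-model DDL issued from Python via psycopg2; the cluster endpoint, credentials, and the dim_product design (with DISTKEY/SORTKEY choices) are illustrative assumptions, not details from the posting.

```python
import psycopg2

DDL = """
CREATE TABLE IF NOT EXISTS dim_product (
    product_key   BIGINT IDENTITY(1,1),
    product_id    VARCHAR(32) NOT NULL,
    product_name  VARCHAR(256),
    valid_from    DATE,
    valid_to      DATE,
    is_current    BOOLEAN
)
DISTSTYLE KEY
DISTKEY (product_id)
SORTKEY (product_id, valid_from);
"""

# Connection details are placeholders; exiting the block commits the DDL.
with psycopg2.connect(host="example-cluster.abc123.redshift.amazonaws.com",
                      port=5439, dbname="dev", user="etl_user",
                      password="<redacted>") as conn:
    with conn.cursor() as cur:
        cur.execute(DDL)
```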

Posted 1 week ago

Apply

5.0 - 10.0 years

7 - 10 Lacs

Hyderābād

On-site

Source: Glassdoor

At EY, you'll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture, and technology to become the best version of you. And we're counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

AWS Data Engineer - Senior

We are seeking a highly skilled and motivated hands-on AWS Data Engineer with 5-10 years of experience in AWS Glue, PySpark, AWS Redshift, S3, and Python to join our dynamic team. As a Data Engineer, you will be responsible for designing, developing, and optimizing data pipelines and solutions that support business intelligence, analytics, and large-scale data processing. You will work closely with data scientists, analysts, and other engineering teams to ensure seamless data flow across our systems.

Technical Skills (must have):
- Strong experience in AWS data services such as Glue, Lambda, EventBridge, Kinesis, S3/EMR, Redshift, RDS, Step Functions, Airflow, and PySpark
- Strong exposure to IAM, CloudTrail, cluster optimization, Python, and SQL
- Expertise in data design, STTM, understanding of data models, data component design, automated testing, code coverage, UAT support, deployment, and go-live
- Experience with version control systems like SVN and Git
- Ability to create and manage AWS Glue crawlers and jobs to automate data cataloging and ingestion across various structured and unstructured data sources
- Strong experience with AWS Glue: building ETL pipelines, managing crawlers, and working with the Glue Data Catalog
- Proficiency in AWS Redshift: designing and managing Redshift clusters, writing complex SQL queries, and optimizing query performance
- Ability to enable data consumption from reporting and analytics business applications using AWS services (e.g., QuickSight, SageMaker, JDBC/ODBC connectivity)

Behavioural skills:
- Willing to work 5 days a week from the ODC/client location (depending on the project, this can be hybrid with 3 days a week on-site)
- Ability to lead developers and engage with client stakeholders to drive technical decisions
- Ability to produce technical designs and POCs: help build and analyse logical data models, required entities, relationships, data constraints, and dependencies focused on enabling reporting and analytics business use cases
- Able to work in an Agile environment
- Strong communication skills

Good to have:
- Exposure to Financial Services, Wealth and Asset Management
- Exposure to data science and full-stack technologies
- GenAI exposure will be an added advantage

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people, and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform, and operate. Working across assurance, consulting, law, strategy, tax, and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
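The posting above emphasizes AWS Glue ETL jobs that read from the Glue Data Catalog. Here is a minimal Glue PySpark job sketch, assuming a hypothetical sales_db catalog database (populated by a crawler) and an example S3 bucket:

```python
import sys

from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext.getOrCreate())
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Read a table that a Glue crawler has already cataloged
orders = glue_context.create_dynamic_frame.from_catalog(
    database="sales_db", table_name="raw_orders")

# Write the curated output back to the data lake as Parquet
glue_context.write_dynamic_frame.from_options(
    frame=orders,
    connection_type="s3",
    connection_options={"path": "s3://example-bucket/curated/orders/"},
    format="parquet",
)
job.commit()
```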

Posted 1 week ago

Apply

2.0 - 5.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Source: LinkedIn

We're looking for a talented and motivated 3D Designer to join a leading 360-degree advertising agency and help elevate our creative vision. In this role, you'll have the opportunity to create stunning 3D visuals for various media, including print, digital, and experiential advertising campaigns.

Key Responsibilities:
- Design and produce high-quality 3D models, assets, and animations for a wide range of advertising campaigns, from print and digital to experiential and social media.
- Collaborate with creative directors, art directors, and other design teams to ensure seamless integration of 3D designs into 360-degree campaigns.
- Develop engaging 3D visuals for brand activations, product launches, virtual environments, and digital storytelling.
- Create photorealistic renderings, animations, and interactive 3D content that aligns with the brand's message and visual identity.
- Assist in conceptualising and visualising design ideas from initial sketches to final 3D presentations.
- Participate in brainstorming sessions to generate creative ideas and provide 3D design solutions for a variety of advertising needs.
- Work with other departments (motion graphics, video production, digital teams, etc.) to ensure consistency and high quality across all mediums.
- Prepare and deliver assets for print, digital media, social campaigns, and video content while maintaining brand guidelines.
- Stay updated on the latest trends in 3D design, motion graphics, and advertising technologies.
- Meet project deadlines while maintaining the highest quality standards and attention to detail.

Qualifications:
- 2-5 years of experience as a 3D Designer in a creative or advertising environment.
- Strong proficiency in 3D design software such as 3ds Max, Autodesk Maya, Cinema 4D, Blender, or equivalent.
- Experience with rendering software (V-Ray, Arnold, Redshift, etc.) for photorealistic output.
- Proficient in Adobe Creative Suite, especially Photoshop, Illustrator, After Effects, and Premiere Pro.
- A solid understanding of 3D modelling, texturing, lighting, and animation principles.
- Experience working with both static and animated visuals across multiple platforms (digital, print, experiential, etc.).
- Strong communication skills, with the ability to present and explain design concepts to clients and team members.
- Ability to work independently and collaboratively within a fast-paced environment.
- Detail-oriented, with a strong sense of creativity, innovation, and problem-solving.

Preferred Skills:
- Experience in augmented reality (AR) or virtual reality (VR) design.
- Knowledge of motion graphics, video editing, and interactive media.
- Familiarity with the full creative process, from concept development to final delivery.
- Strong understanding of experiential marketing and the ability to bring physical installations into the digital space.

Posted 1 week ago

Apply

3.0 years

0 - 0 Lacs

Gurgaon

On-site

Source: Glassdoor

Job description

At Dreamstead Interactive, we are looking for a mid-level Lighting and Rendering Artist to play a vital role in our productions and successfully deliver all our films. It is their responsibility to interpret and implement the creative aims of the client within schedule and budget.

Job Responsibilities:
- Produce high-quality photoreal 3D content for digital and television.
- Drive production with the rest of the team through good communication, planning, etc.
- Advanced knowledge and hands-on experience in producing photoreal 3D content, including texturing, shading, lighting, and rendering (Redshift/Arnold).
- Expertise in Maya and Blender.
- Ability to optimize render settings for quick renders while maintaining high visual quality output.
- Ability to reduce complex problems to their simplest solutions.
- General compositing principles (After Effects/Fusion/Nuke).
- Complete high-quality work on time, on budget, and to specification.
- Unreal Engine experience is a good add-on.

Must-Have:
- At least 3 years of experience in the production of high-end 3D visuals.
- Experience working on high-quality CG commercial projects.
- Excellent communicator.
- Excellent creative eye, aesthetic judgment, and interest in visual exploration of abstract concepts.
- Superior attention to detail; highly organized.
- Interest in emerging technologies and R&D.
- Self-motivated problem solver.
- Technically and creatively accomplished.

A PORTFOLIO IS MANDATORY TO BE CONSIDERED FOR THIS POSITION. APPLICANTS WITHOUT A PORTFOLIO WILL NOT BE CONSIDERED.

Job Types: Full-time, Permanent
Pay: ₹20,000.00 - ₹40,000.00 per month
Schedule: Day shift, Monday to Friday, morning shift
Experience: lighting artist: 3 years (required)
Work Location: In person

Posted 1 week ago

Apply

Experience not specified

6 - 9 Lacs

Pune

Remote

Source: Glassdoor

Capgemini Invent

Capgemini Invent is the digital innovation, consulting, and transformation brand of the Capgemini Group, a global business line that combines market-leading expertise in strategy, technology, data science, and creative design to help CxOs envision and build what's next for their businesses.

Your Role:
- Data pipeline implementation experience with any of these cloud providers: AWS, Azure, GCP.
- Experience with cloud storage, cloud databases, cloud data warehousing, and Data Lake solutions like Snowflake, BigQuery, AWS Redshift, ADLS, and S3.
- Good knowledge of cloud compute services and load balancing.
- Good knowledge of cloud identity management, authentication, and authorization.
- Proficiency in using cloud utility functions such as AWS Lambda, AWS Step Functions, Cloud Run, Cloud Functions, and Azure Functions.
- Experience using cloud data integration services for structured, semi-structured, and unstructured data, such as Azure Databricks, Azure Data Factory, Azure Synapse Analytics, AWS Glue, AWS EMR, Dataflow, and Dataproc.

Your Profile:
- Good knowledge of infra capacity sizing and costing of cloud services to drive optimized solution architecture, leading to optimal infra investment versus performance and scaling.
- Able to contribute to making architectural choices using various cloud services and solution methodologies.
- Expertise in programming using Python.
- Very good knowledge of cloud DevOps practices such as infrastructure as code, CI/CD components, and automated deployments on cloud.
- Must understand networking, security, design principles, and best practices in cloud.

What you will love about working here:
We recognize the significance of flexible work arrangements to provide support. Be it remote work or flexible work hours, you will get an environment to maintain a healthy work-life balance. At the heart of our mission is your career growth. Our array of career growth programs and diverse professions are crafted to support you in exploring a world of opportunities. Equip yourself with valuable certifications in the latest technologies such as Generative AI.

About Capgemini:
Capgemini is a global business and technology transformation partner, helping organizations to accelerate their dual transition to a digital and sustainable world, while creating tangible impact for enterprises and society. It is a responsible and diverse group of 340,000 team members in more than 50 countries. With its strong over 55-year heritage, Capgemini is trusted by its clients to unlock the value of technology to address the entire breadth of their business needs. It delivers end-to-end services and solutions leveraging strengths from strategy and design to engineering, all fueled by its market-leading capabilities in AI, cloud, and data, combined with its deep industry expertise and partner ecosystem. The Group reported 2023 global revenues of €22.5 billion.
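The role above names cloud utility functions such as AWS Lambda. Here is a minimal Python Lambda handler sketch for an S3-triggered event; the processing step is a placeholder, not anything specific to this role.

```python
import json
import urllib.parse

def lambda_handler(event, context):
    # S3 put events carry the bucket and object key in the Records payload
    record = event["Records"][0]["s3"]
    bucket = record["bucket"]["name"]
    key = urllib.parse.unquote_plus(record["object"]["key"])

    # Placeholder processing step
    print(f"new object: s3://{bucket}/{key}")

    return {"statusCode": 200, "body": json.dumps({"processed": key})}
```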

Posted 1 week ago

Apply

10.0 years

5 - 10 Lacs

Bengaluru

On-site

Source: Glassdoor

Location: Bangalore - Karnataka, India - EOIZ Industrial Area
Worker Type Reference: Regular - Permanent
Pay Rate Type: Salary
Career Level: T4(A)
Job ID: R-45392-2025

Description & Requirements

Introduction: A Career at HARMAN - HARMAN Technology Services (HTS)
We're a global, multi-disciplinary team that's putting the innovative power of technology to work and transforming tomorrow. At HARMAN HTS, you solve challenges by creating innovative solutions:
- Combine the physical and digital, making technology a more dynamic force to solve challenges and serve humanity's needs
- Work at the convergence of cross-channel UX, cloud, insightful data, IoT, and mobility
- Empower companies to create new digital business models, enter new markets, and improve customer experiences

About the Role
We are seeking an experienced Azure Data Architect who will develop and implement data engineering projects, including enterprise data hubs, data lakehouses, and Big Data platforms.

What You Will Do
- Create data pipelines for more efficient and repeatable data science projects
- Design and implement data architecture solutions that support business requirements and meet organizational needs
- Collaborate with stakeholders to identify data requirements and develop data models and data flow diagrams
- Work with cross-functional teams to ensure that data is integrated, transformed, and loaded effectively across different platforms and systems
- Develop and implement data governance policies and procedures to ensure that data is managed securely and efficiently
- Develop and maintain a deep understanding of data platforms, technologies, and tools, and evaluate new technologies and solutions to improve data management processes
- Ensure compliance with regulatory and industry standards for data management and security
- Develop and maintain data models, data warehouses, data lakes, and data marts to support data analysis and reporting
- Ensure data quality, accuracy, and consistency across all data sources

Skills and Experience
- Knowledge of ETL and data integration tools such as Informatica, Qlik Talend, and Apache NiFi
- Experience with data modeling and design tools such as ERwin, PowerDesigner, or ER/Studio
- Knowledge of data governance, data quality, and data security best practices
- Experience with cloud computing platforms such as AWS, Azure, or Google Cloud Platform
- Familiarity with programming languages such as Python, Java, or Scala
- Experience with data visualization tools such as Tableau, Power BI, or QlikView
- Understanding of analytics and machine learning concepts and tools
- Knowledge of project management methodologies and tools to manage and deliver complex data projects
- Skilled in relational database technologies such as MySQL, PostgreSQL, and Oracle, as well as NoSQL databases such as MongoDB and Cassandra
- Strong expertise in cloud-based data services such as AWS S3, AWS Glue, and AWS Redshift, and the Iceberg/Parquet file formats
- Knowledge of big data technologies such as Hadoop, Spark, Snowflake, Databricks, and Kafka to process and analyze large volumes of data
- Proficient in data integration techniques to combine data from various sources into a centralized location
- Strong data modeling, data warehousing, and data integration skills

What You Need
- 10+ years of experience in the information technology industry with a strong focus on data engineering and architecture, preferably as a data engineering lead
- 8+ years of data engineering or data architecture experience successfully launching, planning, and executing advanced data projects
- Experience working on RFPs/proposals, presales activities, business development, and overseeing delivery of data projects is highly desired
- A master's or bachelor's degree in computer science, data science, information systems, operations research, statistics, applied mathematics, economics, engineering, or physics
- Demonstrated ability to manage data projects and diverse teams
- Experience creating data and analytics solutions
- Experience building data solutions in one or more domains: Industrial, Healthcare, Retail, Communication
- Problem-solving, communication, and collaboration skills
- Good knowledge of data visualization and reporting tools
- Ability to normalize and standardize data as per key KPIs and metrics

What is Nice to Have
- Knowledge of Azure Purview is a must
- Knowledge of Azure Data Fabric
- Ability to define a reference data architecture
- Snowflake SnowPro Advanced certification
- Cloud-native data platform experience on the AWS or Microsoft stack
- Knowledge of the latest data trends, including data fabric and data mesh
- Robust knowledge of ETL, data transformation, and data standardization approaches
- Key contributor to the growth of the COE, influencing client revenues through data and analytics solutions
- Lead the selection, deployment, and management of data tools, platforms, and infrastructure
- Ability to technically guide a team of data engineers
- Oversee the design, development, and deployment of data solutions
- Define, differentiate, and strategize new data services/offerings and create reference architecture assets
- Drive partnerships with vendors on collaboration, capability building, go-to-market strategies, etc.
- Guide and inspire the organization about the business potential and opportunities around data
- Network with domain experts
- Collaborate with client teams to understand their business challenges and needs
- Develop and propose data solutions tailored to client-specific requirements
- Influence client revenues through innovative solutions and thought leadership
- Lead client engagements from project initiation to deployment
- Build and maintain strong relationships with key clients and stakeholders
- Build reusable methodologies, pipelines, and models

What Makes You Eligible
- Build and manage a high-performing team of data engineers and other specialists
- Foster a culture of innovation and collaboration within the data team and across the organization
- Demonstrated ability to work in diverse, cross-functional teams in a dynamic business environment
- Confident, energetic self-starter with strong communication skills
- Superior presentation skills and the ability to present compelling solutions that guide and inspire
- Provide technical guidance and mentorship to the data team
- Collaborate with other stakeholders across the company to align vision and goals
- Communicate and present data capabilities and achievements to clients and partners
- Stay updated on the latest trends and developments in the data domain

What We Offer
- Access to employee discounts on world-class HARMAN/Samsung products (JBL, Harman Kardon, AKG, etc.)
- Professional development opportunities through HARMAN University's business and leadership academies
- An inclusive and diverse work environment that fosters and encourages professional and personal development
- "Be Brilliant" employee recognition and rewards program

You Belong Here
HARMAN is committed to making every employee feel welcomed, valued, and empowered. No matter what role you play, we encourage you to share your ideas, voice your distinct perspective, and bring your whole self with you, all within a support-minded culture that celebrates what makes each of us unique. We also recognize that learning is a lifelong pursuit and want you to flourish. We proudly offer added opportunities for training, development, and continuing education, further empowering you to live the career you want.

About HARMAN: Where Innovation Unleashes Next-Level Technology
Ever since the 1920s, we've been amplifying the sense of sound. Today, that legacy endures, with integrated technology platforms that make the world smarter, safer, and more connected. Across automotive, lifestyle, and digital transformation solutions, we create innovative technologies that turn ordinary moments into extraordinary experiences. Our renowned automotive and lifestyle solutions can be found everywhere, from the music we play in our cars and homes to venues that feature today's most sought-after performers, while our digital transformation solutions serve humanity by addressing the world's ever-evolving needs and demands. Marketing our award-winning portfolio under 16 iconic brands, such as JBL, Mark Levinson, and Revel, we set ourselves apart by exceeding the highest engineering and design standards for our customers, our partners, and each other. If you're ready to innovate and do work that makes a lasting impact, join our talent community today!

Important Notice: Recruitment Scams
Please be aware that HARMAN recruiters will always communicate with you from an '@harman.com' email address. We will never ask for payments, banking, credit card, or personal financial information, or access to your LinkedIn/email account during the screening, interview, or recruitment process. If you are asked for such information or receive communication from an email address not ending in '@harman.com' about a job with HARMAN, please cease communication immediately and report the incident to us through: harmancareers@harman.com.

HARMAN is proud to be an Equal Opportunity / Affirmative Action employer. All qualified applicants will receive consideration for employment without regard to race, religion, color, national origin, gender (including pregnancy, childbirth, or related medical conditions), sexual orientation, gender identity, gender expression, age, status as a protected veteran, status as an individual with a disability, or other applicable legally protected characteristics.
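Among the big data technologies listed above is Kafka. Here is a minimal Python consumer sketch using the confluent-kafka client; the broker address, consumer group, and topic name are illustrative assumptions.

```python
from confluent_kafka import Consumer

consumer = Consumer({
    "bootstrap.servers": "localhost:9092",  # assumption: local broker
    "group.id": "telemetry-aggregator",     # hypothetical consumer group
    "auto.offset.reset": "earliest",
})
consumer.subscribe(["device-telemetry"])    # hypothetical topic

try:
    while True:
        msg = consumer.poll(timeout=1.0)    # wait up to 1s for a message
        if msg is None:
            continue
        if msg.error():
            print("consumer error:", msg.error())
            continue
        print(f"{msg.topic()}[{msg.partition()}] @ {msg.offset()}: "
              f"{msg.value().decode('utf-8')}")
finally:
    consumer.close()
```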

Posted 1 week ago

Apply

10.0 years

7 - 10 Lacs

Vadodara

On-site

Source: Glassdoor

About Rearc

Founded in 2016, we pride ourselves on fostering an environment where creativity flourishes, bureaucracy is non-existent, and individuals are encouraged to challenge the status quo. We're not just a company; we're a community of problem-solvers dedicated to improving the lives of fellow software engineers. Our commitment is simple: finding the right fit for our team and cultivating a desire to make things better. If you're a cloud professional intrigued by our problem space and eager to make a difference, you've come to the right place. Join us, and let's solve problems together!

As a Lead Data Engineer at Rearc, you'll play a pivotal role in establishing and maintaining technical excellence within our data engineering team. Your deep expertise in data architecture, ETL processes, and data modeling will be instrumental in optimizing data workflows for efficiency, scalability, and reliability. You'll collaborate closely with cross-functional teams to design and implement robust data solutions that meet business objectives and adhere to best practices in data management. Building strong partnerships with both technical teams and stakeholders will be essential as you drive data-driven initiatives and ensure their successful implementation.

What You Bring
- 10+ years of experience in data engineering, data architecture, or related fields, offering a wealth of expertise in managing and optimizing data pipelines and architectures.
- Extensive experience in writing and testing Java and/or Python.
- Proven experience with data pipeline orchestration using platforms such as Airflow, Databricks, dbt, or AWS Glue.
- Hands-on experience with data analysis tools and libraries like PySpark, NumPy, Pandas, or Dask; proficiency with Spark and Databricks is highly desirable.
- A proven track record of leading complex data engineering projects, including designing and implementing scalable data solutions.
- Hands-on experience with ETL processes, data warehousing, and data modeling tools, allowing you to deliver efficient and robust data pipelines.
- In-depth knowledge of data integration tools and best practices.
- A strong understanding of cloud-based data services and technologies (e.g., AWS Redshift, Azure Synapse Analytics, Google BigQuery).
- Strong strategic and analytical skills, enabling you to solve intricate data challenges and drive data-driven decision-making.
- Proven proficiency in implementing and optimizing data pipelines using modern tools and frameworks, including Databricks for data processing and Delta Lake for managing large-scale data lakes.
- Exceptional communication and interpersonal skills that facilitate collaboration with cross-functional teams and effective stakeholder engagement at all levels.

What You'll Do
As a Lead Data Engineer at Rearc, your role is pivotal in driving the success of our data engineering initiatives. You will lead by example, fostering trust and accountability within your team while leveraging your technical expertise to optimize data processes and deliver exceptional data solutions. Here's what you'll be doing:
- Understand Requirements and Challenges: Collaborate with stakeholders to deeply understand their data requirements and challenges, enabling the development of robust data solutions tailored to the needs of our clients.
- Implement with a DataOps Mindset: Embrace a DataOps mindset and utilize modern data engineering tools and frameworks, such as Apache Airflow, Apache Spark, or similar, to build scalable and efficient data pipelines and architectures.
- Lead Data Engineering Projects: Take the lead in managing and executing data engineering projects, providing technical guidance and oversight to ensure successful project delivery.
- Mentor Data Engineers: Share your extensive knowledge and experience in data engineering with junior team members, guiding and mentoring them to foster their growth and development in the field.
- Promote Knowledge Sharing: Contribute to our knowledge base by writing technical blogs and articles, promoting best practices in data engineering, and contributing to a culture of continuous learning and innovation.

At Rearc, we're committed to empowering engineers to build awesome products and experiences. Success as a business hinges on our people's ability to think freely, challenge the status quo, and speak up about alternative problem-solving approaches. If you're an engineer driven by the desire to solve problems and make a difference, you're in the right place! Our approach is simple: empower engineers with the best tools possible to make an impact within their industry. We're on the lookout for engineers who thrive on ownership and freedom, possessing not just technical prowess but also exceptional leadership skills. Our ideal candidates are hands-on-keyboard leaders who don't just talk the talk but also walk the walk, designing and building solutions that push the boundaries of cloud computing.
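The stack above features PySpark-based pipelines. Here is a small PySpark sketch of a representative pipeline step (deduplicate raw events, then aggregate daily counts); the S3 paths and column names are hypothetical, not Rearc's.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("daily_event_counts").getOrCreate()

# Cleanse: drop duplicate events and rows missing a timestamp
events = (spark.read.json("s3://example-bucket/raw/events/")
          .dropDuplicates(["event_id"])
          .filter(F.col("event_ts").isNotNull()))

# Aggregate: daily counts per event type
daily = (events
         .withColumn("event_date", F.to_date("event_ts"))
         .groupBy("event_date", "event_type")
         .count())

daily.write.mode("overwrite").partitionBy("event_date").parquet(
    "s3://example-bucket/curated/daily_event_counts/")
```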

Posted 1 week ago

Apply

8.0 years

7 - 10 Lacs

Vadodara

On-site

Source: Glassdoor

At Rearc, we're committed to empowering engineers to build awesome products and experiences. Success as a business hinges on our people's ability to think freely, challenge the status quo, and speak up about alternative problem-solving approaches. If you're an engineer driven by the desire to solve problems and make a difference, you're in the right place! Our approach is simple: empower engineers with the best tools possible to make an impact within their industry. We're on the lookout for engineers who thrive on ownership and freedom, possessing not just technical prowess but also exceptional leadership skills. Our ideal candidates are hands-on leaders who don't just talk the talk but also walk the walk, designing and building solutions that push the boundaries of cloud computing.

As a Senior Data Engineer at Rearc, you will be at the forefront of driving technical excellence within our data engineering team. Your expertise in data architecture, cloud-native solutions, and modern data processing frameworks will be essential in designing workflows that are optimized for efficiency, scalability, and reliability. You'll leverage tools like Databricks, PySpark, and Delta Lake to deliver cutting-edge data solutions that align with business objectives. Collaborating with cross-functional teams, you will design and implement scalable architectures while adhering to best practices in data management and governance. Building strong relationships with both technical teams and stakeholders will be crucial as you lead data-driven initiatives and ensure their seamless execution.

What You Bring
- 8+ years of experience in data engineering, showcasing expertise in diverse architectures, technology stacks, and use cases.
- Strong expertise in designing and implementing data warehouse and data lake architectures, particularly in AWS environments.
- Extensive experience with Python for data engineering tasks, including familiarity with libraries and frameworks commonly used in Python-based data engineering workflows.
- Proven experience with data pipeline orchestration using platforms such as Airflow, Databricks, dbt, or AWS Glue.
- Hands-on experience with data analysis tools and libraries like PySpark, NumPy, Pandas, or Dask; proficiency with Spark and Databricks is highly desirable.
- Experience with SQL and NoSQL databases, including PostgreSQL, Amazon Redshift, Delta Lake, Iceberg, and DynamoDB.
- In-depth knowledge of data architecture principles and best practices, especially in cloud environments.
- Proven experience with AWS services, including expertise in using the AWS CLI, SDKs, and Infrastructure as Code (IaC) tools such as Terraform, CloudFormation, or AWS CDK.
- Exceptional communication skills, capable of clearly articulating complex technical concepts to both technical and non-technical stakeholders.
- Demonstrated ability to quickly adapt to new tasks and roles in a dynamic environment.

What You'll Do
- Strategic Data Engineering Leadership: Provide strategic vision and technical leadership in data engineering, guiding the development and execution of advanced data strategies that align with business objectives.
- Architect Data Solutions: Design and architect complex data pipelines and scalable architectures, leveraging advanced tools and frameworks (e.g., Apache Kafka, Kubernetes) to ensure optimal performance and reliability.
- Drive Innovation: Lead the exploration and adoption of new technologies and methodologies in data engineering, driving innovation and continuous improvement across data processes.
- Technical Expertise: Apply deep expertise in ETL processes, data modeling, and data warehousing to optimize data workflows and ensure data integrity and quality.
- Collaboration and Mentorship: Collaborate closely with cross-functional teams to understand requirements and deliver impactful data solutions; mentor and coach junior team members, fostering their growth and development in data engineering practices.
- Thought Leadership: Contribute to thought leadership in the data engineering domain through technical articles, conference presentations, and participation in industry forums.

Some More About Us
Founded in 2016, we pride ourselves on fostering an environment where creativity flourishes, bureaucracy is non-existent, and individuals are encouraged to challenge the status quo. We're not just a company; we're a community of problem-solvers dedicated to improving the lives of fellow software engineers. Our commitment is simple: finding the right fit for our team and cultivating a desire to make things better. If you're a cloud professional intrigued by our problem space and eager to make a difference, you've come to the right place. Join us, and let's solve problems together!
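The requirements above include IaC tools such as AWS CDK. Here is a minimal CDK (v2, Python) stack sketch that defines a versioned, encrypted data-lake bucket; the stack and bucket names are hypothetical.

```python
from aws_cdk import App, RemovalPolicy, Stack, aws_s3 as s3

class DataLakeStack(Stack):
    def __init__(self, scope, construct_id, **kwargs):
        super().__init__(scope, construct_id, **kwargs)
        # Versioned, server-side-encrypted raw zone for the data lake
        s3.Bucket(
            self, "RawZone",
            versioned=True,
            encryption=s3.BucketEncryption.S3_MANAGED,
            removal_policy=RemovalPolicy.RETAIN,  # keep data if the stack is deleted
        )

app = App()
DataLakeStack(app, "ExampleDataLakeStack")
app.synth()
```

Running `cdk deploy` against this app would synthesize the stack to CloudFormation and provision the bucket.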

Posted 1 week ago

Apply

3.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Source: LinkedIn

Description

The Amazon Pricing & Promotions Analytics team seeks a Senior Data Engineer to build automated pricing solutions that deliver competitive prices to customers. In this role, you will design and implement data analytics solutions that drive pricing decisions while creating automated anomaly detection systems to monitor pricing effectiveness. Your responsibilities include analyzing customer behavior patterns to inform pricing strategy and developing actionable insights for senior leadership to guide business decisions. You will partner with stakeholders to implement data-driven pricing recommendations. Your work will directly impact Amazon's pricing strategy and customer experience. The position requires expertise in data analysis and a track record of translating complex data into clear business recommendations.

Key job responsibilities
- Design and implement modern data pipelines and business intelligence solutions
- Configure and optimize AWS services including EC2, RDS, Redshift, Kinesis, EMR, and Lambda
- Create scalable data architecture to support analytics, data science, and customer reporting needs
- Collaborate with technology teams to develop ETL processes across multiple data sources
- Automate reporting systems and enhance self-service analytics capabilities for customers

About the team
The Pricing and Promotions team drives Amazon's success by developing automated pricing systems and enabling data-driven decisions at scale. We optimize pricing strategies to benefit customers and sellers while considering both immediate and long-term business impacts. Our scope spans the entire customer journey, from product sourcing to customer experience, ensuring pricing decisions enhance Amazon's competitive advantage.

Basic Qualifications
- 3+ years of data engineering experience
- 4+ years of SQL experience
- Experience with data modeling, warehousing, and building ETL pipelines

Preferred Qualifications
- Experience with AWS technologies like Redshift, S3, AWS Glue, EMR, Kinesis, Firehose, Lambda, and IAM roles and permissions
- Experience with non-relational databases/data stores (object storage, document or key-value stores, graph databases, column-family databases)
- Bachelor's degree

Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you're applying in isn't listed, please contact your Recruiting Partner.

Company: ADCI - Karnataka - A66
Job ID: A2966634
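The role above involves automated anomaly detection for pricing. As a toy sketch of one common approach (robust z-scores per item, not Amazon's actual method), here is a pandas example with fabricated data and a placeholder threshold:

```python
import pandas as pd

# Fabricated example data: one clearly out-of-range price for item "a"
prices = pd.DataFrame({
    "item":  ["a", "a", "a", "a", "b", "b", "b", "b"],
    "price": [10.0, 10.2, 9.9, 19.9, 5.0, 5.1, 4.9, 5.0],
})

def flag_anomalies(group: pd.DataFrame, threshold: float = 3.0) -> pd.DataFrame:
    median = group["price"].median()
    mad = (group["price"] - median).abs().median() or 1e-9  # guard against 0
    group = group.copy()
    group["robust_z"] = 0.6745 * (group["price"] - median) / mad
    group["anomaly"] = group["robust_z"].abs() > threshold
    return group

flagged = prices.groupby("item", group_keys=False).apply(flag_anomalies)
print(flagged[flagged["anomaly"]])  # surfaces the 19.9 outlier for item "a"
```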

Posted 1 week ago

Apply

1.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Source: LinkedIn

Description
Amazon is a place where data drives most of our decision-making. The Analytics, Operations & Programs (AOP) team is looking for a dynamic data engineer who is innovative, a strong problem solver, and able to lead the implementation of the analytical data infrastructure that guides decision-making. As a Data Engineer, you think like an entrepreneur, constantly innovating and driving positive change, but more importantly, you consistently deliver mind-boggling results. You're a leader who uses both quantitative and qualitative methods to get things done. And on top of it all, you're someone who wonders "What if?" and then seeks out the solution.

This position offers exceptional opportunities to grow your technical and non-technical skills. You have the opportunity to really make a difference to our business by inventing, enhancing, and building world-class systems, delivering results, and working on exciting and challenging projects.

As a Data Engineer, you are responsible for analyzing large amounts of business data, solving real-world problems, and developing metrics and business cases that will enable us to continually delight our customers worldwide. This is done by leveraging data from platforms such as Jira, Portal, and Salesforce. You will work with a team of Product Managers, Software Engineers, and Business Intelligence Engineers to automate and scale the analysis, and to make the data more actionable for managing the business at scale. You will own many large datasets and implement new data pipelines that feed into or from critical data systems at Amazon. You must be able to prioritize and work well in an environment with competing demands.

Successful candidates will bring strong technical abilities combined with a passion for delivering results for customers, internal and external. This role requires a high degree of ownership and a drive to solve some of the most challenging data and analytics problems in retail. Candidates must have a demonstrated ability to manage large-scale data modeling projects, identify requirements and tools, and build data warehousing solutions that are explainable and scalable. In addition to the technical skills, a successful candidate will possess strong written and verbal communication skills and high intellectual curiosity, with the ability to rapidly learn new concepts, frameworks, and technologies as changes arise.

Key job responsibilities
- Design, implement, and support an analytical data infrastructure
- Manage AWS resources including EC2, EMR, S3, Glue, Redshift, etc.
- Interface with other technology teams to extract, transform, and load data from a wide variety of data sources using SQL and AWS big data technologies
- Explore and learn the latest AWS technologies to provide new capabilities and increase efficiency
- Collaborate with Data Scientists and Business Intelligence Engineers (BIEs) to recognize and help adopt best practices in reporting and analysis
- Help continually improve ongoing reporting and analysis processes, automating or simplifying self-service support for customers
- Maintain internal reporting platforms/tools, including troubleshooting and development
- Interact with internal users to establish and clarify requirements in order to develop report specifications
- Work with Engineering partners to help shape and implement the development of BI infrastructure, including data warehousing, reporting, and analytics platforms
- Contribute to the development of BI tools, skills, culture, and impact
- Write advanced SQL queries and Python code to develop solutions

A day in the life
This role requires you to live at the intersection of data, software, and analytics. We leverage a comprehensive suite of AWS technologies, with key tools including S3, Redshift, DynamoDB, Lambda, APIs, and Glue. You will drive the development process from design to release, including:
- Managing data ingestion from heterogeneous data sources, with automated data quality checks (a minimal Redshift load sketch follows this posting)
- Creating scalable data models for effective data processing, storage, retrieval, and archiving
- Using scripting for automation and tool development that is scalable, reusable, and maintainable
- Providing infrastructure for self-serve analytics and science use cases
- Using industry best practices in building CI/CD pipelines

About The Team
The AOP (Analytics Operations and Programs) team's mission is to standardize BI and analytics capabilities and reduce repeat analytics/reporting/BI workload for operations across the IN, AU, BR, MX, SG, AE, EG, and SA marketplaces. AOP is responsible for providing visibility into operations performance and implementing programs to improve network efficiency and reduce defects. The team has a diverse mix of strong engineers, analysts, and scientists who champion customer obsession. We enable operations to make data-driven decisions by developing near-real-time dashboards, self-serve dive-deep capabilities, and advanced analytics capabilities. We identify and implement data-driven metric improvement programs in collaboration (co-owning) with Operations teams.

Basic Qualifications
- 1+ years of data engineering experience
- Experience with data modeling, warehousing, and building ETL pipelines
- Experience with one or more query languages (e.g., SQL, PL/SQL, DDL, MDX, HiveQL, SparkSQL, Scala)
- Experience with one or more scripting languages (e.g., Python, KornShell)

Preferred Qualifications
- Experience with big data technologies such as Hadoop, Hive, Spark, and EMR
- Experience with an ETL tool such as Informatica, ODI, SSIS, BODI, or DataStage

Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you're applying in isn't listed, please contact your Recruiting Partner.

Company - ASSPL - Karnataka
Job ID: A2904529
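The bulk-load workflow this posting describes (S3 into Redshift via SQL from Python) typically revolves around the COPY command. A minimal sketch, assuming the psycopg2 package (redshift_connector exposes the same DB-API shape); the cluster endpoint, credentials, table, bucket path, and IAM role below are hypothetical placeholders:

    import psycopg2  # assumes the psycopg2 package is installed

    # Hypothetical connection details; replace with real cluster values.
    conn = psycopg2.connect(
        host="example-cluster.abc123.us-east-1.redshift.amazonaws.com",
        port=5439,
        dbname="analytics",
        user="etl_user",
        password="change-me",
    )

    # COPY is Redshift's bulk-load path: it ingests S3 files in parallel
    # across slices, which is far faster than row-by-row INSERTs.
    copy_sql = """
        COPY sales_staging
        FROM 's3://example-bucket/sales/2024-06-01/'
        IAM_ROLE 'arn:aws:iam::123456789012:role/example-redshift-load'
        FORMAT AS CSV
        IGNOREHEADER 1;
    """

    with conn, conn.cursor() as cur:
        cur.execute(copy_sql)  # runs inside a transaction, committed on exit
    conn.close()

The same pattern is usually wrapped in a scheduled job (Glue, Lambda, or Airflow) with a data-quality check after the load.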

Posted 1 week ago

Apply

9.0 years

0 Lacs

Andhra Pradesh, India

On-site

Skill: Data Engineer
Role: T2, T1

Key Responsibilities: Data Engineer
Must have 9+ years of experience in the skills mentioned below.
Must Have: Big data concepts, Python (core Python; able to write code), SQL, shell scripting, AWS S3
Good to Have: Event-driven architecture/AWS SQS, microservices, API development, Kafka, Kubernetes, Argo, Amazon Redshift, Amazon Aurora

Posted 1 week ago

Apply

6.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Position: Sr. Tableau Developer
Location: Pune/Indore
Full-time opportunity
Experience: 6+ years of experience in Tableau development

JD / Key Responsibilities:
- Build and maintain complex Tableau dashboards with drill-down capabilities, filters, actions, and KPI indicators.
- Write advanced calculations, such as Level of Detail (LOD) expressions, to address business logic like aggregations at different dimensions.
- Design and implement table calculations for running totals, percent change, rankings, etc. (the same logic is sketched in pandas after this posting).
- Perform data blending and joins across multiple sources, ensuring data accuracy and integrity.
- Optimize Tableau workbook performance by managing extracts, minimizing dashboard load time, and tuning calculations.
- Use parameters, dynamic filters, and action filters for interactive user experiences.
- Design dashboard wireframes and prototypes using Tableau or other tools like Figma.
- Manage publishing, scheduling, and permissions in Tableau Server/Cloud.
- Collaborate with data engineering to design performant, scalable data sources.
- Document data logic, dashboard specs, and technical workflows for governance.
- Provide mentorship and technical guidance to junior Tableau developers.
- Experience with another BI reporting tool such as Power BI, Looker, QuickSight, or Alteryx is a plus.

Qualifications:
- Bachelor's or Master's degree in Computer Science, Information Systems, Analytics, or a related field
- 6+ years of experience in Tableau development
- Tableau Desktop Certified Professional (preferred)
- Experience with enterprise BI projects and stakeholder engagement
- SQL proficiency: ability to write complex joins, CTEs, subqueries, and window functions
- Experience working with large datasets in tools like Snowflake, Amazon Redshift, Google BigQuery, Azure Synapse, or SQL Server
- Data preparation tools experience (preferred but not required): Tableau Prep, Alteryx, dbt, or equivalent
- Knowledge of Tableau Server/Cloud administration (publishing, permissions, data source refreshes)
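Tableau table calculations such as running totals and percent difference are window-style computations over an ordered partition. A minimal pandas sketch of that same logic, with hypothetical column names, for readers more at home in code than in Tableau's calculation editor:

    import pandas as pd

    df = pd.DataFrame({
        "month": ["Jan", "Feb", "Mar", "Apr"],
        "sales": [100, 120, 90, 150],
    })

    # Running total: the equivalent of Tableau's RUNNING_SUM(SUM([Sales])).
    df["running_total"] = df["sales"].cumsum()

    # Change from the previous row: Tableau's "Percent Difference" quick
    # table calculation, computed along the table (down).
    df["pct_change"] = df["sales"].pct_change()

    print(df)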

Posted 1 week ago

Apply

0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

About Gameskraft
Established in 2017, Gameskraft has become one of India's fastest-growing companies. We are building the world's most-loved online gaming ecosystem, one game at a time. Started by a group of passionate gamers, we have grown from a small team of five members to a large family of 600+ Krafters, working out of our office in Prestige Tech Park, Bangalore. Our short-term success lies in the fact that we strive to focus on building a safe, secure, and responsible gaming environment for everyone. Our vision is to create unmatched experiences every day, everywhere. We set the highest benchmarks in the industry in terms of design, technology, and intuitiveness. We are also the industry's only ISO 27001 and ISO 9001 certified gaming company.

About the role
We are hiring a Senior Data Engineer at Gameskraft, one of India's fastest-growing gaming companies, to build and scale a robust data platform. The role involves designing and optimizing data pipelines, developing scalable infrastructure, and ensuring seamless data accessibility for business insights.

Key Responsibilities:
- Building and optimizing big data pipelines, architectures, and datasets to handle large-scale data.
- Enhancing infrastructure for scalability, automation, and data delivery improvements.
- Developing real-time and batch processing solutions using Kafka, Spark, and Airflow (see the Airflow sketch after this posting).
- Ensuring data governance, security compliance, and high availability.
- Collaborating with product, business, and analytics teams to support data needs.

Tech Stack:
- Big data tools: Spark, Kafka, Databricks (Delta tables), ScyllaDB, Redshift
- Data pipelines & workflow: Airflow, EMR, Glue, Athena
- Programming: Java, Scala, Python
- Cloud & storage: AWS
- Databases: SQL, NoSQL (ScyllaDB, OpenSearch)
- Backend: Spring Boot

What we expect you will bring to the table:
1. Cutting-Edge Technology & Scale
At Gameskraft, you will work on some of the most advanced big data technologies, including Databricks Delta tables, ScyllaDB, Spark, Kafka, Airflow, and Spring Boot. Our systems handle billions of data points daily, ensuring real-time analytics and high-scale performance. If you're passionate about big data, real-time streaming, and cloud computing, this role offers the perfect challenge.
2. Ownership & Impact
Unlike rigid corporate structures, Gameskraft gives engineers complete freedom and ownership to design, build, and optimize large-scale data pipelines. Your work directly impacts business decisions, game fairness, and player experience, ensuring data is actionable and insightful.
3. High-Growth, Fast-Paced Environment
We are one of India's fastest-growing gaming companies, scaling rapidly since 2017. You will be part of a dynamic team that moves fast, innovates continuously, and disrupts the industry with cutting-edge solutions.
4. Strong Engineering Culture
We value technical excellence, continuous learning, and deep problem-solving. We encourage engineers to experiment, contribute, and grow, making this an ideal place for those who love tackling complex data engineering challenges.

Why Join Gameskraft?
- Work on high-scale, real-time data processing challenges.
- Own end-to-end design and implementation of data pipelines.
- Collaborate with top-tier engineers and data scientists.
- Enjoy a fast-growing and financially stable company.
- Freedom to innovate and contribute at all levels.
Work Culture
- A true startup culture: young and fast-paced, where you are driven by personal ownership of solving challenges that help you grow fast
- Focus on innovation, data orientation, being results-driven, taking on big goals, and adapting fast
- A high-performance, meritocratic environment, where we share ideas, debate, and grow together with each new product
- Massive and direct impact of the work you do; growth through solving dynamic challenges
- Leveraging technology and analytics to solve large-scale challenges
- Working with cross-functional teams to create great products and take them to market
- Rub shoulders with some of the brightest and most passionate people in the gaming and consumer internet industry

Compensation & Benefits
- Attractive compensation and ESOP packages
- INR 5 lakh medical insurance cover for yourself and your family
- Fair and transparent performance appraisals
- An attractive car lease policy
- Relocation benefits
- A vibrant office space with fully stocked pantries. And your lunch is on us!
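For orientation on the batch side of the Kafka/Spark/Airflow stack mentioned above: a minimal Airflow DAG sketch, not Gameskraft's actual pipeline code. The DAG id, task names, and callables are hypothetical, and the schedule parameter assumes Airflow 2.4+ (older releases call it schedule_interval):

    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator

    def ingest_events():
        # Placeholder: a real task might consume a Kafka topic or pull a
        # batch file from S3.
        print("ingesting events")

    def transform_events():
        # Placeholder: a real task might submit a Spark job to EMR/Databricks.
        print("transforming events")

    with DAG(
        dag_id="example_events_pipeline",  # hypothetical name
        start_date=datetime(2024, 1, 1),
        schedule="@hourly",  # Airflow 2.4+; earlier versions use schedule_interval
        catchup=False,
    ) as dag:
        ingest = PythonOperator(task_id="ingest", python_callable=ingest_events)
        transform = PythonOperator(task_id="transform", python_callable=transform_events)
        ingest >> transform  # transform runs only after ingest succeeds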

Posted 1 week ago

Apply

3.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

It's fun to work in a company where people truly BELIEVE in what they are doing! We're committed to bringing passion and customer focus to the business.

Job Description: Data Engineer (AWS)
Role: DE Engineer (AWS)
Experience: 3-5 years (3+ years of experience with AWS cloud)
Education: BE/B.Tech/M.Tech
Location: Bangalore/India

We are currently seeking an experienced Data Support Engineer with a focus on AWS, Snowflake, Hadoop, Spark, and Python to join our Support team. The ideal candidate will have a solid technical background, strong problem-solving skills, and hands-on experience in troubleshooting and supporting data engineering systems.

Responsibilities Include:
- Hands-on experience with Hadoop and Spark with Python on AWS.
- Provide technical support for data engineering systems, addressing user queries and resolving issues related to data pipelines, AWS services, Snowflake, Hadoop, and Spark.
- Investigate and troubleshoot issues in data pipelines, identifying root causes and implementing solutions to prevent recurrence.
- Experience with a range of big data architectures, such as Hadoop, Spark, Kafka, Hive, or other big data technologies.
- Effectively manage and resolve incidents related to data processing, ensuring minimal downtime and optimal system performance.
- Collaborate with cross-functional teams to prioritize and address critical issues promptly.
- Experience in tuning and optimizing Spark jobs (two common levers are sketched after this posting).
- Knowledge of Terraform templates for infrastructure provisioning on AWS (or CloudFormation templates).
- A minimum of 3+ years of BI/DW development experience with data model architecture/design.
- A good understanding of functional programming concepts.
- Good knowledge of Python, with experience on production-grade Python projects.
- Continuous integration, branching and merging, pair programming, code reviews, unit testing, agile methodologies (Scrum), design patterns.
- Knowledge of CI/CD implementation, such as AWS CodeCommit and CodeDeploy for CI/CD pipelines (Git knowledge preferable).
- Knowledge of scheduling tools and techniques on Hadoop/EMR.
- Excellent written and verbal communication skills.
- Strong analytical and project management skills.

Technical Essentials:
- Proven experience in providing technical support for data engineering systems.
- Strong understanding of AWS services, including S3, Glue, Redshift, EMR, Lambda, Athena, and Step Functions.
- Hands-on experience supporting Snowflake, Hadoop, Spark, and Python in a production environment.
- Familiarity with data modeling, optimization, and performance tuning.
- Excellent problem-solving skills and the ability to analyze and diagnose complex technical issues.
- Experience with incident management, including prioritization and resolution procedures.
- Strong communication and collaboration skills for working with cross-functional teams.
- Knowledge of best practices in cloud-based data engineering and support.

Preferred: AWS Certified Solutions Architect - Associate

Personal Specifications:
Self-motivated team player with strong analytical and relationship-management skills and effective written and oral communication skills.

If you like wild growth and working with happy, enthusiastic over-achievers, you'll enjoy your career with us! Not the right fit? Let us know you're interested in a future opportunity by clicking Introduce Yourself in the top-right corner of the page, or create an account to set up email alerts as new job postings become available that meet your interest!
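On the Spark-tuning point above: a minimal PySpark sketch of two common levers, shuffle-partition sizing and broadcast joins. The S3 paths, table names, and join column are hypothetical, and the right shuffle-partition count always depends on data volume and cluster size:

    from pyspark.sql import SparkSession
    from pyspark.sql.functions import broadcast

    spark = (
        SparkSession.builder
        .appName("example-tuning")
        # Shuffle partition count is a frequent first tuning lever; the
        # default of 200 is often wrong for very small or very large jobs.
        .config("spark.sql.shuffle.partitions", "200")
        .getOrCreate()
    )

    # Hypothetical inputs; replace with real S3 locations.
    orders = spark.read.parquet("s3://example-bucket/orders/")
    small_dim = spark.read.parquet("s3://example-bucket/dim_country/")

    # Broadcasting a small dimension table ships it to every executor,
    # turning a full shuffle join into a cheap map-side join.
    joined = orders.join(broadcast(small_dim), "country_code")
    joined.write.mode("overwrite").parquet("s3://example-bucket/orders_enriched/")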

Posted 1 week ago

Apply

5.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Project Role: Software Development Lead
Project Role Description: Develop and configure software systems either end-to-end or for a specific stage of the product lifecycle. Apply knowledge of technologies, applications, methodologies, processes and tools to support a client, project or entity.
Must have skills: Python (Programming Language)
Good to have skills: AWS Administration
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary:
As a Software Engineer with Python expertise, you will develop data-driven applications on AWS, responsible for the creation of scalable data pipelines and algorithms to process and deliver actionable vehicle data insights.

Roles & Responsibilities:
1. Lead the design and development of Python-based applications and services
2. Architect and implement cloud-native solutions using AWS services
3. Mentor and guide the Python development team, promoting best practices and code quality
4. Collaborate with data scientists and analysts to implement data processing pipelines
5. Participate in architecture discussions and contribute to technical decision-making
6. Ensure the scalability, reliability, and performance of Python applications on AWS
7. Stay current with Python ecosystem developments, AWS services, and industry best practices

Professional & Technical Skills:
1. Python programming
2. Web framework expertise (Django, Flask, or FastAPI)
3. Data processing and analysis
4. Database technologies (SQL and NoSQL)
5. API development
6. Significant experience working with AWS Lambda (a minimal handler sketch follows this posting)
7. AWS services (e.g., EC2, S3, RDS, Lambda, SageMaker, EMR); any AWS certification is a plus
8. Infrastructure as Code (e.g., AWS CloudFormation, Terraform)
9. Test-Driven Development (TDD)
10. DevOps practices
11. Agile methodologies
12. Experience with big data technologies and data warehousing solutions on AWS (e.g., Redshift, EMR, Athena)
13. Strong knowledge of the AWS platform and services (e.g., EC2, S3, RDS, Lambda, API Gateway, VPC, IAM)

Additional Information:
1. The candidate should have a minimum of 5 years of experience in Python programming.
2. This position is based at our Hyderabad office.
3. 15 years of full-time education is required (Bachelor of Computer Science or any related stream; master's degree preferred).
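Several of the postings on this page (this one and the similar roles below) lean heavily on AWS Lambda for data pipelines. A minimal sketch of an S3-triggered handler, assuming the triggering objects are JSON arrays; the bucket and keys come from the standard S3 event shape, and the transform step is a hypothetical placeholder:

    import json
    import urllib.parse

    import boto3  # preinstalled in the AWS Lambda Python runtime

    s3 = boto3.client("s3")

    def lambda_handler(event, context):
        # Each record in an S3 event names the bucket and the object key.
        for record in event.get("Records", []):
            bucket = record["s3"]["bucket"]["name"]
            key = urllib.parse.unquote_plus(record["s3"]["object"]["key"])
            obj = s3.get_object(Bucket=bucket, Key=key)
            payload = json.loads(obj["Body"].read())
            # Placeholder transform; a real pipeline might write to
            # Redshift, another S3 prefix, or a queue here.
            print(f"processed {key} from {bucket}: {len(payload)} records")
        return {"status": "ok"}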

Posted 1 week ago

Apply

7.5 years

0 Lacs

Hyderabad, Telangana, India

On-site

Project Role: Software Development Engineer
Project Role Description: Analyze, design, code and test multiple components of application code across one or more clients. Perform maintenance, enhancements and/or development work.
Must have skills: Cloud Data Architecture
Good to have skills: NA
Minimum 7.5 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary:
As a Solutions Architect - Lead, you will analyze, design, code, and test multiple components of application code. You will perform maintenance, enhancements, and/or development work, contributing to the overall success of the projects.

Roles & Responsibilities:
- Design and develop the overall architecture of our digital data platform using AWS services.
- Create and maintain cloud infrastructure designs and architectural diagrams.
- Collaborate with stakeholders to understand business requirements and translate them into scalable AWS-based solutions.
- Evaluate and recommend AWS technologies, services, and tools for the platform.
- Ensure the scalability, performance, security, and cost-effectiveness of the AWS-based platform.
- Lead and mentor the technical team in implementing architectural decisions and AWS best practices.
- Develop and maintain architectural documentation and standards for AWS implementations.
- Stay current with emerging AWS technologies, services, and industry trends.
- Optimize existing AWS infrastructure for performance and cost.
- Implement and manage disaster recovery and business continuity plans.

Professional & Technical Skills:
- Minimum 8 years of experience in IT architecture, with at least 5 years in a solutions architect role.
- Strong knowledge of the AWS platform and services (e.g., EC2, S3, RDS, Lambda, API Gateway, VPC, IAM).
- Experience with big data technologies and data warehousing solutions on AWS (e.g., Redshift, EMR, Athena).
- Experience with Infrastructure as Code (e.g., CloudFormation, Terraform).
- Exposure to Continuous Integration/Continuous Deployment (CI/CD) pipelines.
- Experience with containerization technologies (e.g., Docker, Kubernetes).
- Proficiency in multiple programming languages and frameworks.
- AWS Certified Solutions Architect - Professional certification required.

Additional Information:
- The candidate should have a minimum of 5 years of experience in a solutions architect role.
- This position is based at our Hyderabad office.
- 15 years of full-time education is required (Bachelor of Engineering in Electronics/Computer Science, or any related stream).

Posted 1 week ago

Apply

5.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Project Role: Software Development Lead
Project Role Description: Develop and configure software systems either end-to-end or for a specific stage of the product lifecycle. Apply knowledge of technologies, applications, methodologies, processes and tools to support a client, project or entity.
Must have skills: Python (Programming Language)
Good to have skills: AWS Architecture
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary:
As a Software Engineer with Python expertise, you will develop data-driven applications on AWS, responsible for the creation of scalable data pipelines and algorithms to process and deliver actionable vehicle data insights.

Roles & Responsibilities:
1. Lead the design and development of Python-based applications and services
2. Architect and implement cloud-native solutions using AWS services
3. Mentor and guide the Python development team, promoting best practices and code quality
4. Collaborate with data scientists and analysts to implement data processing pipelines
5. Participate in architecture discussions and contribute to technical decision-making
6. Ensure the scalability, reliability, and performance of Python applications on AWS
7. Stay current with Python ecosystem developments, AWS services, and industry best practices

Professional & Technical Skills:
1. Python programming
2. Web framework expertise (Django, Flask, or FastAPI)
3. Data processing and analysis
4. Database technologies (SQL and NoSQL)
5. API development
6. Significant experience working with AWS Lambda
7. AWS services (e.g., EC2, S3, RDS, Lambda, SageMaker, EMR); any AWS certification is a plus
8. Infrastructure as Code (e.g., AWS CloudFormation, Terraform)
9. Test-Driven Development (TDD)
10. DevOps practices
11. Agile methodologies
12. Experience with big data technologies and data warehousing solutions on AWS (e.g., Redshift, EMR, Athena)
13. Strong knowledge of the AWS platform and services (e.g., EC2, S3, RDS, Lambda, API Gateway, VPC, IAM)

Additional Information:
1. The candidate should have a minimum of 5 years of experience in Python programming.
2. This position is based at our Hyderabad office.
3. 15 years of full-time education is required (Bachelor of Computer Science or any related stream; master's degree preferred).

Posted 1 week ago

Apply

7.5 years

0 Lacs

Hyderabad, Telangana, India

On-site

Project Role: Software Development Lead
Project Role Description: Develop and configure software systems either end-to-end or for a specific stage of the product lifecycle. Apply knowledge of technologies, applications, methodologies, processes and tools to support a client, project or entity.
Must have skills: Python (Programming Language)
Good to have skills: NA
Minimum 7.5 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary:
As a Software Engineer with Python expertise, you will develop data-driven applications on AWS, responsible for the creation of scalable data pipelines and algorithms to process and deliver actionable vehicle data insights.

Roles & Responsibilities:
1. Lead the design and development of Python-based applications and services
2. Architect and implement cloud-native solutions using AWS services
3. Mentor and guide the Python development team, promoting best practices and code quality
4. Collaborate with data scientists and analysts to implement data processing pipelines
5. Participate in architecture discussions and contribute to technical decision-making
6. Ensure the scalability, reliability, and performance of Python applications on AWS
7. Stay current with Python ecosystem developments, AWS services, and industry best practices

Professional & Technical Skills:
1. Python programming
2. Web framework expertise (Django, Flask, or FastAPI)
3. Data processing and analysis
4. Database technologies (SQL and NoSQL)
5. API development
6. Significant experience working with AWS Lambda
7. AWS services (e.g., EC2, S3, RDS, Lambda, SageMaker, EMR); any AWS certification is a plus
8. Infrastructure as Code (e.g., AWS CloudFormation, Terraform)
9. Test-Driven Development (TDD)
10. DevOps practices
11. Agile methodologies
12. Experience with big data technologies and data warehousing solutions on AWS (e.g., Redshift, EMR, Athena)
13. Strong knowledge of the AWS platform and services (e.g., EC2, S3, RDS, Lambda, API Gateway, VPC, IAM)

Additional Information:
1. The candidate should have a minimum of 5 years of experience in Python programming.
2. This position is based at our Hyderabad office.
3. 15 years of full-time education is required (Bachelor of Computer Science or any related stream; master's degree preferred).

Posted 1 week ago

Apply

3.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Project Role: Software Development Engineer
Project Role Description: Analyze, design, code and test multiple components of application code across one or more clients. Perform maintenance, enhancements and/or development work.
Must have skills: Python (Programming Language)
Good to have skills: NA
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary:
As a Software Engineer with Python expertise, you will develop data-driven applications on AWS, responsible for the creation of scalable data pipelines and algorithms to process and deliver actionable vehicle data insights.

Roles & Responsibilities:
1. Lead the design and development of Python-based applications and services
2. Architect and implement cloud-native solutions using AWS services
3. Mentor and guide the Python development team, promoting best practices and code quality
4. Collaborate with data scientists and analysts to implement data processing pipelines
5. Participate in architecture discussions and contribute to technical decision-making
6. Ensure the scalability, reliability, and performance of Python applications on AWS
7. Stay current with Python ecosystem developments, AWS services, and industry best practices

Professional & Technical Skills:
1. Python programming
2. Web framework expertise (Django, Flask, or FastAPI)
3. Data processing and analysis
4. Database technologies (SQL and NoSQL)
5. API development
6. Significant experience working with AWS Lambda
7. AWS services (e.g., EC2, S3, RDS, Lambda, SageMaker, EMR); any AWS certification is a plus
8. Infrastructure as Code (e.g., AWS CloudFormation, Terraform)
9. Test-Driven Development (TDD)
10. DevOps practices
11. Agile methodologies
12. Experience with big data technologies and data warehousing solutions on AWS (e.g., Redshift, EMR, Athena)
13. Strong knowledge of the AWS platform and services (e.g., EC2, S3, RDS, Lambda, API Gateway, VPC, IAM)

Additional Information:
1. The candidate should have a minimum of 3 years of experience in Python programming.
2. This position is based at our Hyderabad office.
3. 15 years of full-time education is required (Bachelor of Computer Science or any related stream; master's degree preferred).

Posted 1 week ago

Apply

3.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Project Role: Software Development Engineer
Project Role Description: Analyze, design, code and test multiple components of application code across one or more clients. Perform maintenance, enhancements and/or development work.
Must have skills: Python (Programming Language)
Good to have skills: NA
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary:
As a Software Engineer with Python expertise, you will develop data-driven applications on AWS, responsible for the creation of scalable data pipelines and algorithms to process and deliver actionable vehicle data insights.

Roles & Responsibilities:
1. Lead the design and development of Python-based applications and services
2. Architect and implement cloud-native solutions using AWS services
3. Mentor and guide the Python development team, promoting best practices and code quality
4. Collaborate with data scientists and analysts to implement data processing pipelines
5. Participate in architecture discussions and contribute to technical decision-making
6. Ensure the scalability, reliability, and performance of Python applications on AWS
7. Stay current with Python ecosystem developments, AWS services, and industry best practices

Professional & Technical Skills:
1. Python programming
2. Web framework expertise (Django, Flask, or FastAPI)
3. Data processing and analysis
4. Database technologies (SQL and NoSQL)
5. API development
6. Significant experience working with AWS Lambda
7. AWS services (e.g., EC2, S3, RDS, Lambda, SageMaker, EMR); any AWS certification is a plus
8. Infrastructure as Code (e.g., AWS CloudFormation, Terraform)
9. Test-Driven Development (TDD)
10. DevOps practices
11. Agile methodologies
12. Experience with big data technologies and data warehousing solutions on AWS (e.g., Redshift, EMR, Athena)
13. Strong knowledge of the AWS platform and services (e.g., EC2, S3, RDS, Lambda, API Gateway, VPC, IAM)

Additional Information:
1. The candidate should have a minimum of 3 years of experience in Python programming.
2. This position is based at our Hyderabad office.
3. 15 years of full-time education is required (Bachelor of Computer Science or any related stream; master's degree preferred).

Posted 1 week ago

Apply

3.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Key Responsibilities:
- Partner with product managers, engineers, and business stakeholders to define KPIs and success metrics for Creator Success
- Create comprehensive dashboards and self-service analytics tools using QuickSight, Tableau, or similar BI platforms
- Perform deep-dive analysis on customer behavior, content performance, and livestream engagement patterns
- Design, build, and maintain robust ETL/ELT pipelines to process large volumes of streaming and batch data from the Creator Success platform
- Develop and optimize data warehouses, data lakes, and real-time analytics systems using AWS services (Redshift, S3, Kinesis, EMR, Glue)
- Implement data quality frameworks and monitoring systems to ensure data accuracy and reliability
- Build automated data validation and alerting mechanisms for critical business metrics (a minimal validation sketch follows this posting)
- Generate actionable insights from complex datasets to drive product roadmap and business strategy

Required Qualifications:
- Bachelor's degree in Computer Science, Engineering, Mathematics, Statistics, or a related quantitative field
- 3+ years of experience in business intelligence/analytics roles with proficiency in SQL, Python, and/or Scala
- Strong experience with AWS cloud services (Redshift, S3, EMR, Glue, Lambda, Kinesis)
- Expertise in building and optimizing ETL pipelines and data warehousing solutions
- Proficiency with big data technologies (Spark, Hadoop) and distributed computing frameworks
- Experience with business intelligence tools (QuickSight, Tableau, Looker) and data visualization best practices
- Collaborative approach with cross-functional teams including product, engineering, and business teams
- Customer-obsessed mindset with a focus on delivering high-quality, actionable insights

Non-Negotiable Skills:
- High proficiency in SQL and Python
- Expertise in building and optimizing ETL pipelines and data warehousing solutions
- Experience with business intelligence tools (QuickSight, Tableau, Looker) and data visualization best practices
- Experience working with cross-functional teams including product, engineering, and business teams
- Experience with AWS cloud services (Redshift, S3, EMR)
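On the automated validation and alerting point above: the core of such a mechanism is a small gate that inspects each load and raises issues. A minimal plain-Python sketch with a placeholder alert hook; the metric names and row shape are hypothetical:

    def validate_daily_metrics(rows):
        """Minimal data-quality gate: flag empty loads, nulls, and negatives.

        `rows` is assumed to be a list of dicts with 'metric' and 'value' keys.
        """
        issues = []
        if not rows:
            issues.append("no rows loaded for the day")
        for row in rows:
            if row.get("value") is None:
                issues.append(f"null value for metric {row.get('metric')}")
            elif row["value"] < 0:
                issues.append(f"negative value for metric {row['metric']}")
        return issues

    issues = validate_daily_metrics([
        {"metric": "stream_minutes", "value": 1250},
        {"metric": "active_creators", "value": None},
    ])
    if issues:
        # Placeholder alert hook; a real system might publish to SNS,
        # CloudWatch, or a Slack webhook instead of printing.
        print("ALERT:", "; ".join(issues))

In production the same check would typically run as a post-load task in the pipeline, failing the run before bad data reaches dashboards.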

Posted 1 week ago

Apply

12.0 years

0 Lacs

Bengaluru, Karnataka, India

Remote

Position: Data Engineer

About Cimpress:
Led by founder and CEO Robert Keane, Cimpress invests in and helps build customer-focused, entrepreneurial mass customization businesses. Through the personalized physical (and digital) products these companies create, we empower over 17 million global customers to make an impression. Last year, Cimpress generated $3.5B in revenue through customized print products, signage, apparel, packaging and more. The Cimpress family includes a dynamic, international group of businesses and central teams, all working to solve problems, build businesses, innovate and improve.

Business Unit: BuildASign
BuildASign is a leading online provider of canvas wall décor, signage and other large-format products. Since the company's inception in 2005, BuildASign has set out to empower every individual and business to connect with those that matter most to them. Their focus is making it easy and affordable for people to share their message or tell their story with custom and personalized products. Over the past 12 years, the Austin, Texas-based company has grown to over 400 employees.

Roles & Responsibilities:
We're looking for a skilled Data Engineer with solid experience in managing data pipelines and delivering scalable data solutions using modern cloud and CI/CD tools.
- 2+ years of experience handling large data volumes and orchestrating, scheduling, and monitoring automated batch and near-real-time ETL/data pipelines using CI/CD and cloud technologies (preferred tools: DBT and DBT Cloud)
- Strong expertise in data modelling and understanding of data warehousing best practices
- Experience with different data ingestion techniques
- Experience programming in Python and strong expertise in SQL
- Experience with DataOps processes using GitLab CI/CD
- Experience understanding, building, and maintaining data pipelines
- Experience with business intelligence or reporting tools such as Looker, Tableau, Power BI, Qlik
- Strong experience with MPP data warehouse systems (such as Amazon Redshift, Snowflake, etc.)
- Experience with cloud services in AWS (preferred), Azure or GCP
- Experience adopting data solution development best practices (e.g., modularization, testing, refactoring, etc.)
- Strong problem-solving skills and an understanding of data structures and algorithms
- Experience working with business stakeholders and requirement discussions
- Experience contributing to the wider technical community through collaboration, coaching, and mentoring of other technologists
- Excellent communication and collaboration skills, with the ability to work effectively with cross-functional teams

Remote-First Culture:
In 2020, Cimpress adopted a Remote-First operating model and culture. We heard from our team members that having the freedom, autonomy and trust in each other to work from home, and the ability to operate when they are most productive, empowers everyone to be their best and most brilliant self. Cimpress also provides collaboration spaces for team members to work physically together when it's safe to do so, or where we believe in-office working will deliver the best results. Currently we are able to hire remote team members in over 20 US states as well as several countries in Europe: Spain, Germany, UK, Czech Republic, the Netherlands and Switzerland.

More information about the organization can be found at the links below:
https://cimpress.com
https://www.linkedin.com/company/cimpress/
https://twitter.com/Cimpress

Want to explore more about our brands? Please visit: https://cimpress.com/brands/explore-our-brands/

Posted 1 week ago

Apply

20.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Job Title: Senior Data Solution Architect

Job Summary:
The Senior Data Solution Architect is a visionary and technical leader responsible for designing and guiding enterprise-scale data solutions. Leveraging 20+ years of experience, this individual works closely with business and IT stakeholders to deliver scalable, secure, and high-performing data architectures that support strategic goals, data-driven innovation, and digital transformation. This role encompasses solution design, platform modernization, cloud data architecture, and deep integration with enterprise systems.

Key Responsibilities:

Solution Architecture & Design
- Lead the end-to-end architecture of complex data solutions across domains including analytics, AI/ML, MDM, and real-time processing.
- Design robust, scalable, and future-ready data architectures using modern technologies (e.g., cloud data platforms, streaming, NoSQL, graph databases).
- Deliver solutions that balance performance, scalability, security, and cost-efficiency.

Enterprise Data Integration
- Architect seamless data integration across legacy systems, SaaS platforms, IoT, APIs, and third-party data sources.
- Define and implement enterprise-wide ETL/ELT strategies using tools like Informatica, Talend, DBT, Azure Data Factory, or AWS Glue.
- Support real-time and event-driven architecture with tools such as Kafka, Spark Streaming, or Flink (a minimal producer sketch follows this posting).

Cloud Data Platforms & Infrastructure
- Design cloud-native data solutions on AWS, Azure, or GCP (e.g., Redshift, Snowflake, BigQuery, Databricks, Synapse).
- Lead cloud migration strategies from legacy systems to modern, cloud-based data architectures.
- Define standards for cloud data governance, cost management, and performance optimization.

Data Governance, Security & Compliance
- Partner with governance teams to enforce enterprise data governance frameworks.
- Ensure solutions comply with regulations such as GDPR, HIPAA, CCPA, and industry-specific mandates.
- Embed security and privacy by design in data architectures (encryption, role-based access, masking, etc.).

Technical Leadership & Stakeholder Engagement
- Serve as a technical advisor to CIOs, CDOs, and senior business executives on data strategy and platform decisions.
- Mentor architecture and engineering teams; provide guidance on solution patterns and best practices.
- Facilitate architecture reviews, proof-of-concepts (POCs), and technology evaluations.

Innovation & Continuous Improvement
- Stay abreast of emerging trends in data engineering, AI, data mesh, data fabric, and edge computing.
- Evaluate and introduce innovative tools and patterns (e.g., serverless data pipelines, federated data access).
- Drive architectural modernization, legacy decommissioning, and platform simplification.

Qualifications:
Education: Bachelor's degree in Computer Science, Engineering, Information Systems, or a related field; Master's or MBA preferred.
Experience:
- 20+ years in IT, with at least 10 years in data architecture or solution architecture roles.
- Demonstrated experience in large-scale, complex data platform architecture and enterprise transformations.
- Deep experience with multiple database technologies (SQL, NoSQL, columnar, time series).
- Strong programming/scripting background (e.g., Python, Scala, Java, SQL).
- Proven experience architecting on at least one major cloud provider (AWS, Azure, GCP).
- Familiarity with DevOps, CI/CD, and DataOps practices.

Preferred Certifications:
- AWS/Azure/GCP Solution Architect (Professional level preferred)
- TOGAF or Zachman Framework certification
- Snowflake/Databricks Certified Architect
- CDMP (Certified Data Management Professional) or DGSP

Key Competencies:
- Strategic and conceptual thinking, with the ability to translate business needs into technical solutions.
- Exceptional communication, presentation, and negotiation skills.
- Leadership in cross-functional teams and matrix environments.
- Deep understanding of business processes, data monetization, and digital strategy.

Success Indicators:
- Delivery of transformative data platforms that enhance analytics and decision-making.
- Improved data integration, quality, and access across the enterprise.
- Successful migration to cloud-native or hybrid architectures.
- Reduction of technical debt and legacy system dependencies.
- Increased reuse of solution patterns, accelerators, and frameworks.
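On the event-driven architecture point in this posting: the producer side of a Kafka-based pipeline is small. A minimal sketch, assuming the confluent-kafka Python client; the broker address, topic name, and payload are hypothetical:

    import json

    from confluent_kafka import Producer  # assumes the confluent-kafka package

    producer = Producer({"bootstrap.servers": "broker-1.example.com:9092"})

    def delivery_report(err, msg):
        # Invoked once per message to confirm delivery or surface an error.
        if err is not None:
            print(f"delivery failed: {err}")

    event = {"order_id": 42, "status": "shipped"}  # hypothetical payload
    producer.produce(
        "orders-events",  # hypothetical topic
        value=json.dumps(event).encode("utf-8"),
        callback=delivery_report,
    )
    producer.flush()  # block until the broker acknowledges the message

Downstream, a Spark Streaming or Flink job would consume this topic and land the events in the warehouse or lake.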

Posted 1 week ago

Apply

Exploring Redshift Jobs in India

The job market for Redshift professionals in India is growing rapidly as more companies adopt cloud data warehousing solutions. Amazon Redshift, a data warehouse service provided by Amazon Web Services, is in high demand due to its scalability, performance, and cost-effectiveness. Job seekers with Redshift expertise can find opportunities across a wide range of industries throughout the country.

Top Hiring Locations in India

  1. Bangalore
  2. Hyderabad
  3. Mumbai
  4. Pune
  5. Chennai

Average Salary Range

The average salary range for Redshift professionals in India varies based on experience and location. Entry-level positions can expect a salary in the range of INR 6-10 lakhs per annum, while experienced professionals can earn upwards of INR 20 lakhs per annum.

Career Path

In the field of Redshift, a typical career path may progress through roles such as:

  1. Junior Developer
  2. Data Engineer
  3. Senior Data Engineer
  4. Tech Lead
  5. Data Architect

Related Skills

Apart from expertise in Redshift, proficiency in the following skills can be beneficial:

  • SQL
  • ETL tools
  • Data modeling
  • Cloud computing (AWS)
  • Python/R programming

Interview Questions

  • What is Amazon Redshift and how does it differ from traditional databases? (basic)
  • How does data distribution work in Amazon Redshift? (medium)
  • Explain the difference between SORTKEY and DISTKEY in Redshift. (medium, see the DDL sketch after this list)
  • How do you optimize query performance in Amazon Redshift? (advanced)
  • What is the COPY command in Redshift used for? (basic)
  • How do you handle large data sets in Redshift? (medium)
  • Explain the concept of Redshift Spectrum. (advanced)
  • What is the difference between Redshift and Redshift Spectrum? (medium)
  • How do you monitor and manage Redshift clusters? (advanced)
  • Can you describe the architecture of Amazon Redshift? (medium)
  • What are the best practices for data loading in Redshift? (medium)
  • How do you handle concurrency in Redshift? (advanced)
  • Explain the concept of vacuuming in Redshift. (basic)
  • What are Redshift's limitations and how do you work around them? (advanced)
  • How do you scale Redshift clusters for performance? (medium)
  • What are the different node types available in Amazon Redshift? (basic)
  • How do you secure data in Amazon Redshift? (medium)
  • Explain the concept of Redshift Workload Management (WLM). (advanced)
  • What are the benefits of using Redshift over traditional data warehouses? (basic)
  • How do you optimize storage in Amazon Redshift? (medium)
  • How do you troubleshoot performance issues in Amazon Redshift? (advanced)
  • Can you explain the concept of columnar storage in Redshift? (basic)
  • How do you automate tasks in Redshift? (medium)
  • What are the different types of Redshift nodes and their use cases? (basic)
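
Several of the questions above (distribution, sort keys, the COPY command) reduce to DDL choices made when a table is created. A minimal sketch of such DDL; the table, columns, and key choices are hypothetical, and it can be executed through any SQL client or the psycopg2 pattern shown in the postings above:

    # Redshift DDL illustrating DISTKEY and SORTKEY for the questions above.
    ddl = """
    CREATE TABLE sales (
        sale_id     BIGINT,
        customer_id BIGINT,
        sale_date   DATE,
        amount      DECIMAL(12,2)
    )
    DISTKEY (customer_id)   -- rows with the same customer_id land on the same
                            -- slice, so joins on customer_id avoid redistribution
    SORTKEY (sale_date);    -- zone maps let date-range filters skip most blocks
    """
    print(ddl)

Being able to explain why you would pick each key (join patterns for DISTKEY, common filter predicates for SORTKEY) is usually what interviewers are probing for.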

Conclusion

As the demand for Redshift professionals continues to rise in India, job seekers should focus on honing their skills and knowledge in this area to stay competitive in the job market. By preparing thoroughly and showcasing their expertise, candidates can secure rewarding opportunities in this fast-growing field. Good luck with your job search!
