Jobs
Interviews

25 AWS Athena Jobs

Set Up a Job Alert
JobPe aggregates listings for easy access, but you apply directly on the original job portal.

3.0 - 7.0 years

0 Lacs

pune, maharashtra

On-site

Role Overview: YASH Technologies is seeking AWS professionals with expertise in AWS services such as Glue, PySpark, SQL, Databricks, Python, and more. As an AWS Data Engineer, you will design, develop, test, and support data pipelines and applications. The role requires a degree in computer science, engineering, or a related field, along with strong experience in data integration and pipeline development.

Key Responsibilities:
- Design, develop, test, and support data pipelines and applications using AWS services such as Glue, PySpark, SQL, Databricks, and Python.
- Work with a mix of Apache Spark, Glue, Kafka, Kinesis, and Lambda in S3, Redshift, RDS, and MongoDB/DynamoDB ecosystems.
- Use SQL in the development of data warehouse projects/applications (Oracle and SQL Server).
- Develop in Python, especially PySpark, in an AWS cloud environment.
- Work with SQL and NoSQL databases such as MySQL, Postgres, DynamoDB, and Elasticsearch.
- Manage workflows using tools like Airflow.
- Use AWS cloud services such as RDS, AWS Lambda, AWS Glue, AWS Athena, and EMR.
- Familiarity with Snowflake and Palantir Foundry is a plus.

Qualifications Required:
- Bachelor's degree in computer science, engineering, or a related field.
- 3+ years of experience in data integration and pipeline development.
- Proficiency in Python, PySpark, SQL, and AWS.
- Strong experience with data integration using AWS cloud technologies.
- Experience with Apache Spark, Glue, Kafka, Kinesis, Lambda, S3, Redshift, RDS, and MongoDB/DynamoDB ecosystems.
- Hands-on experience with SQL in data warehouse projects/applications.
- Familiarity with SQL and NoSQL databases.
- Knowledge of workflow management tools like Airflow.
- Experience with AWS cloud services such as RDS, AWS Lambda, AWS Glue, AWS Athena, and EMR.
Note: The JD also highlights YASH Technologies' empowering work environment, which promotes career growth, continuous learning, and a positive, inclusive team culture grounded in flexibility, trust, transparency, and support for achieving business goals.
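The Glue/Athena pipeline work described above usually starts with how data is laid out in S3. As a minimal, illustrative sketch (the bucket and table names here are hypothetical, not from the posting), Athena and Glue can prune partitions when objects follow a Hive-style key layout:

```python
from datetime import date

def partition_path(bucket: str, table: str, day: date) -> str:
    """Build a Hive-style partitioned S3 prefix (year=/month=/day=),
    the layout Glue crawlers and Athena expect for partition pruning."""
    return (
        f"s3://{bucket}/{table}/"
        f"year={day.year}/month={day.month:02d}/day={day.day:02d}/"
    )

print(partition_path("analytics-lake", "events", date(2024, 3, 7)))
# s3://analytics-lake/events/year=2024/month=03/day=07/
```

Registering such prefixes as partitions (for example via a Glue crawler or `MSCK REPAIR TABLE`) lets Athena scan only the days a query actually filters on.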

Posted 3 days ago

Apply

5.0 - 9.0 years

0 Lacs

haryana

On-site

You will be joining the Visualization Centre of Excellence (CoE) at the Bain Capability Network (BCN), working closely with global Bain case teams, Bain Partners, and end clients to provide data analytics and business intelligence support using advanced tools such as SQL, Python, Azure, AWS, Tableau, Power BI, and Alteryx. The CoE serves as a central hub for all case requests related to converting data into insightful visualizations.

**Key Responsibilities:**
- Handle the entire process end to end, including requirement gathering, data cleaning, processing, and automation
- Design, build, and maintain infrastructure and systems for extraction, transformation, and storage of large datasets for analysis
- Work as an expert on specific platforms/tools/languages (Snowflake/Azure/AWS/Python/SQL), either individually or by leading teams, to design and deliver impactful insights
- Gather requirements and business process knowledge to transform data according to end users' needs
- Investigate data to identify issues within ETL pipelines, propose solutions, and ensure scalable and maintainable data architecture
- Apply knowledge of data analysis tools like SnowPark, Azure Databricks, AWS Athena, and Alteryx to support case teams with KPI analysis
- Prepare documentation for reference and support product development by building scalable and automated pipelines and algorithms
- Manage internal and external stakeholders, provide expertise in data management and tool proficiency, and lead client/case team calls to communicate insights effectively
- Stay updated on statistical, database, and data warehousing tools and techniques
- Provide feedback, conduct performance discussions, and assist in team management activities

**Qualifications Required:**
- Graduation/post-graduation from a top-tier college with 5-7 years of relevant work experience in Data Management, Business Intelligence, or Business Analytics
- Concentration in a quantitative discipline such as Statistics, Mathematics, Engineering, Computer Science, Econometrics, Business Analytics, or Market Research preferred
- Minimum 5+ years of database development experience on the cloud-based platform Snowflake
- Hands-on experience with ETL processing via SnowPark and Snowflake
- Proficiency in Python, advanced SQL queries, Azure, AWS, and data modeling principles
- Motivated, collaborative team player with excellent communication skills and the ability to prioritize projects and drive them to completion under tight deadlines
- Ability to generate realistic answers, recommend solutions, and manage multiple competing priorities effectively

**Good to Have:**
- Experience in building custom GPTs and AI agents
- Knowledge of environment creation and management
- Familiarity with CI/CD pipelines: GitHub, Docker, and containerization

Please note that Bain & Company is consistently recognized as one of the world's best places to work, fostering diversity, inclusion, and collaboration to build extraordinary teams where individuals can thrive both professionally and personally.

Posted 5 days ago

Apply

4.0 - 8.0 years

0 Lacs

haryana

On-site

You will be working at Paras Twin Tower, Gurgaon, as a full-time employee of Falcon, a Series-A funded, cloud-native, AI-first banking technology and processing platform. Falcon specializes in helping banks, NBFCs, and PPIs efficiently launch cutting-edge financial products such as credit cards, credit lines on UPI, prepaid cards, fixed deposits, and loans. Since its inception in 2022, Falcon has processed over USD 1 billion in transactions, collaborated with 12 of India's top financial institutions, and generated revenue exceeding USD 15 million. The company is backed by prominent investors from Japan, the USA, and leading Indian ventures and banks. To learn more about Falcon, visit https://falconfs.com/.

As an Intermediate Data Engineer with 5-7 years of experience, your responsibilities will include:
- Designing, developing, and maintaining scalable ETL processes using open source tools and data frameworks such as AWS Glue, AWS Athena, Redshift, Apache Kafka, Apache Spark, Apache Airflow, and Pentaho Data Integration (PDI)
- Designing, creating, and managing data lakes and data warehouses on the AWS cloud
- Optimizing data pipeline architecture and formulating complex SQL queries for big data processing
- Collaborating with product and engineering teams to develop a platform for data modeling and machine learning operations
- Implementing data structures and algorithms to meet functional and non-functional requirements
- Ensuring data privacy and compliance
- Developing processes for monitoring and alerting on data quality issues
- Staying updated with the latest data engineering trends by evaluating new open source technologies

To qualify for this role, you must have:
- A Bachelor's or Master's degree in Computer Science, or an MCA, from a reputable institute
- At least 4 years of experience in a data engineering role
- Proficiency in Python, Java, or Scala for data processing (Python preferred)
- A deep understanding of SQL and analytical data warehouses
- Experience with database frameworks such as PostgreSQL, MySQL, and MongoDB
- Knowledge of AWS technologies such as Lambda, Athena, Glue, and Redshift
- Experience implementing ETL or ELT best practices at scale
- Familiarity with data pipeline tools like Airflow, Luigi, Azkaban, and dbt
- Proficiency with Git for version control
- Familiarity with Linux-based systems and cloud services (preferably AWS)
- Strong analytical skills and the ability to work in an agile, collaborative team environment

Preferred skills include certification in any open source big data technology; expertise in Apache Hadoop, Apache Hive, and other open source big data technologies; familiarity with data visualization tools like Apache Superset, Grafana, and Tableau; experience with CI/CD processes; and knowledge of containerization technologies like Docker or Kubernetes. If you have these skills and this experience, we encourage you to explore the opportunity further.
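One of the duties above, monitoring and alerting on data quality, can be sketched as a small framework-free check. The record layout below is invented for illustration; a real pipeline would run a check like this (or a dedicated tool such as Great Expectations) before loading a warehouse:

```python
def data_quality_report(rows, key_field, required_fields):
    """Flag duplicate keys and missing required fields -- the kind of
    gate a pipeline runs before a warehouse load."""
    seen, duplicates, missing = set(), [], []
    for row in rows:
        key = row.get(key_field)
        if key in seen:
            duplicates.append(key)
        seen.add(key)
        for field in required_fields:
            if row.get(field) in (None, ""):
                missing.append((key, field))
    return {"duplicates": duplicates, "missing": missing}

rows = [
    {"txn_id": 1, "amount": 100, "currency": "INR"},
    {"txn_id": 2, "amount": None, "currency": "INR"},
    {"txn_id": 2, "amount": 50, "currency": "INR"},
]
print(data_quality_report(rows, "txn_id", ["amount", "currency"]))
# {'duplicates': [2], 'missing': [(2, 'amount')]}
```

A non-empty report would feed the alerting side, for example a CloudWatch metric or a failed Airflow task.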

Posted 1 week ago

Apply

4.0 - 10.0 years

0 Lacs

chennai, tamil nadu

On-site

As a Data Scientist / Analyst at Capgemini in Chennai, you will be responsible for modeling vessel behavior, analyzing fleet health, and supporting sustainability metrics through advanced analytics and machine learning. Working with large datasets and cloud-based tools, you will deliver predictive insights to enhance operational efficiency and achieve environmental goals. Your main responsibilities will include modeling vessel behavior and operational patterns, analyzing fleet health to identify trends and areas for improvement, developing predictive maintenance and anomaly detection models, and supporting sustainability initiatives by tracking relevant environmental metrics. You will utilize AWS services like SageMaker and Athena for scalable data processing and analysis, and write efficient data queries and scripts using Python, R, and SQL. Collaboration with cross-functional teams to translate business needs into analytical solutions will also be a key aspect of your role. To be successful in this position, you should have 4 to 10 years of experience in data science, analytics, or a related field. Proficiency in Python, R, and SQL for data analysis and modeling is essential, along with experience in machine learning techniques including predictive maintenance and anomaly detection. Hands-on experience with AWS SageMaker and Athena, a strong understanding of sustainability metrics, excellent analytical thinking, problem-solving skills, and the ability to communicate insights clearly to technical and non-technical stakeholders are also required. Working at Capgemini, you will enjoy flexible work arrangements, a supportive work-life balance, an inclusive and collaborative culture with opportunities for growth, access to cutting-edge technologies and certifications, and the opportunity to contribute to sustainability-focused and high-impact projects. 
Capgemini is a global business and technology transformation partner with a 55-year heritage and deep industry expertise. With a diverse team of 360,000 members in over 50 countries, Capgemini is focused on unleashing human energy through technology for an inclusive and sustainable future. By leveraging its capabilities in digital, cloud, and data, Capgemini helps organizations accelerate their dual transformation to meet the evolving needs of customers and citizens.
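A baseline for the anomaly-detection work this role describes can be as simple as a z-score filter over a fleet metric. This is a hedged sketch, not Capgemini's method, and the readings are made up; production models would use learned baselines served from something like SageMaker:

```python
from statistics import mean, stdev

def zscore_anomalies(values, threshold=2.0):
    """Return indices of readings more than `threshold` sample standard
    deviations from the mean -- a baseline anomaly detector."""
    mu, sigma = mean(values), stdev(values)
    return [i for i, v in enumerate(values) if abs(v - mu) / sigma > threshold]

fuel_burn = [10, 11, 10, 12, 11, 30]  # hypothetical daily vessel readings
print(zscore_anomalies(fuel_burn))  # [5]
```

The same shape of function works for predictive-maintenance signals: flag the outlier, then let an engineer (or a heavier model) decide whether it is sensor noise or a real fault.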

Posted 1 week ago

Apply

10.0 - 14.0 years

0 Lacs

hyderabad, telangana

On-site

As a Data Architect, you will lead data projects in reporting and analytics. With over 10 years of relevant work experience, you will design, build, and maintain a scalable data lake and data warehouse in the cloud, particularly on Google Cloud Platform (GCP). You will gather and analyze business requirements, define the BI/DW architecture, and deliver technical solutions for complex business and technical needs. You will create solution prototypes, participate in technology selection, and perform POCs and technical presentations.

In this role, you will architect, develop, and test scalable data warehouse and data pipeline architectures using cloud technologies on GCP. Experience with SQL and NoSQL DBMSs such as MS SQL Server, MySQL, PostgreSQL, DynamoDB, Cassandra, and MongoDB is essential. You will design and develop scalable ETL processes, including error handling, and demonstrate proficiency in query and programming languages such as T-SQL, PostgreSQL, MySQL, Python, and R. Additionally, you will prepare data structures for advanced analytics and self-service reporting using tools like MS SQL, SSIS, and SSRS. Experience with cloud-based technologies such as Power BI, Tableau, Azure Data Factory, Azure Synapse, Azure Data Lake, AWS Redshift, Glue, Athena, AWS QuickSight, and Google Cloud Platform is beneficial.

Familiarity with Agile development environments, pairing DevOps with CI/CD pipelines, and an AI/ML background are considered good to have for this role.
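The "scalable ETL processes, including error handling" requirement above often comes down to wrapping each pipeline step in retry-with-backoff logic. A minimal sketch; the retry policy and step shape are illustrative assumptions, not a prescribed design:

```python
import logging
import time

def run_with_retry(step, retries=3, backoff_s=1.0):
    """Run one ETL step, retrying transient failures with linear backoff
    and logging each attempt; re-raise once retries are exhausted."""
    for attempt in range(1, retries + 1):
        try:
            return step()
        except Exception as exc:
            logging.warning("step failed (attempt %d/%d): %s",
                            attempt, retries, exc)
            if attempt == retries:
                raise
            time.sleep(backoff_s * attempt)
```

Orchestrators like Airflow or Cloud Composer provide the same behavior declaratively (task retries and retry delay), but the wrapper shows what the error-handling contract is.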

Posted 2 weeks ago

Apply

5.0 - 9.0 years

0 Lacs

noida, uttar pradesh

On-site

This role will involve being the primary SharePoint designer for the General Counsel's Organization (GCO). You will manage SharePoint development, including the use of MS Power Platform tools such as Power Apps, Power Automate, and Power BI. Your duties will also involve maintaining a small number of business-developed MS Access sites and assisting in their transition to more supportable technology. Additionally, you will perform system administration for a new ideation tool called BrightIdea and for other GCO applications.

Your key responsibilities will include developing, maintaining, and supporting the GCO inventory of SharePoint sites across the organization. You will collaborate closely with teams within GCO to understand business requirements and drive the MS SharePoint/Power Platform solution to meet those needs efficiently. You will maintain current support for approximately 5 Access databases and provide support and analysis for transitioning them to a Technology or SharePoint solution. You will also be involved in the ongoing maintenance and development of the Service Provider and Oversight Tool, built on the MS Power Platform, which handles scheduling, tracking, and reporting of compliance assessments. Moreover, you will provide system administration for the BrightIdea tool, an enterprise tool crucial to GCO innovation. You will coach and troubleshoot with GCO partners on SharePoint, Power BI, and Power Apps while staying current on technology and information technology standards.

The required qualifications include a Bachelor's degree from a reputed university with 5-8 years of relevant experience, along with expert knowledge of and experience in developing with SharePoint, including SharePoint Designer and InfoPath. You must also have expert knowledge of the MS Power Platform suite (Power BI, Power Apps, and Power Automate) and DAX, plus strong experience with MS Excel and Access. Knowledge of the application development lifecycle and strong systems analysis skills are essential, as are strong interpersonal and communication skills, good organization, and the ability to work on multiple priorities. Preferred qualifications include knowledge of technology standards and controls; experience developing Power BI reporting from data sources such as relational databases, Excel, and SharePoint; knowledge of Ameriprise Data Lake environments, SQL, AWS Athena, and other development protocols; and familiarity with at least one programming language, preferably Python.

Ameriprise India LLP is a U.S.-based financial planning company with a global presence, providing client-based financial solutions for over 125 years. The firm's focus areas include Asset Management and Advice, Retirement Planning, and Insurance Protection. Join an inclusive and collaborative culture that rewards contributions and offers opportunities to make a difference. If you are talented, driven, and seek to work for an ethical company that cares, consider creating a career at Ameriprise India LLP. This is a full-time position with working hours from 4:45 pm to 1:15 am, in the AWMP&S President's Office, within the Technology job family group.

Posted 2 weeks ago

Apply

5.0 - 9.0 years

0 Lacs

haryana

On-site

This role requires you to be the primary SharePoint designer for the General Counsel's Organization (GCO), overseeing SharePoint development utilizing MS Power Tools (MS Power Apps, Automate, and Power BI). You will also be responsible for managing a small number of business-developed MS Access sites and facilitating their transition to more sustainable technology. Additionally, you will be involved in the system administration of the BrightIdea ideation tool and other GCO applications. Your responsibilities will include developing, maintaining, and supporting the GCO inventory of SharePoint sites, collaborating with various teams across GCO to understand business requirements and efficiently drive the MS SharePoint/Power Platform solutions. You will also maintain current support for approximately 5 Access databases and provide analysis for transitioning to a Technology or SharePoint solution. Furthermore, you will engage in the ongoing maintenance and development of the Service Provider and Oversight Tool and provide system administration for the BrightIdea tool, crucial for GCO innovation. You will be expected to coach and troubleshoot with GCO partners on SharePoint, Power BI, and MS Power Apps, while staying updated on the latest technology and IT standards. 
Qualifications:
- Bachelor's degree from a reputed university with 5-8 years of relevant experience
- Expertise in developing and working with SharePoint, including SharePoint Designer and InfoPath
- Proficiency in the MS Power Tool suite (Power BI, Apps, and Automate) and DAX
- Strong experience with MS Excel and Access
- Knowledge of the application development lifecycle and strong systems analysis skills
- Excellent interpersonal and communication skills
- Highly organized and capable of managing multiple priorities

Preferred Qualifications:
- Familiarity with technology standards and controls
- Experience in developing Power BI reporting using various data sources such as relational databases, Excel, and SharePoint
- Knowledge of Ameriprise Data Lake environments
- Familiarity with SQL, AWS Athena, and other development protocols
- Proficiency in at least one programming language, with a preference for Python

About Ameriprise India LLP: Ameriprise India LLP has been offering client-based financial solutions for 125 years, helping clients plan for and achieve their financial goals. Headquartered in Minneapolis, we are a U.S.-based financial planning company with a global presence, focusing on Asset Management and Advice, Retirement Planning, and Insurance Protection. Join our inclusive and collaborative culture that values your contributions, working alongside talented individuals who share your commitment to excellence. This is an opportunity to excel in your role and make a positive impact in both the workplace and the community. If you are talented, driven, and seeking to work for an ethical company that values its employees, consider building your career at Ameriprise India LLP.

Please note:
- Job Type: Full-time
- Working Hours: 4:45 pm - 1:15 am
- Business Unit: AWMPO AWMP&S President's Office
- Job Family Group: Technology

Posted 2 weeks ago

Apply

2.0 - 6.0 years

0 Lacs

karnataka

On-site

As a professional services firm affiliated with KPMG International Limited, KPMG entities in India have been serving national and international clients since August 1993. With offices across India in Ahmedabad, Bengaluru, Chandigarh, Chennai, Gurugram, Hyderabad, Jaipur, Kochi, Kolkata, Mumbai, Noida, Pune, Vadodara, and Vijayawada, our professionals leverage a global network of firms while maintaining expertise in local laws, regulations, markets, and competition. We are committed to delivering rapid, performance-based, industry-focused, and technology-enabled services that reflect our extensive knowledge of global and local industries and our deep understanding of the Indian business environment.

For this role, we are looking for individuals with expertise in the following technologies and tools: Python, SQL, AWS Lambda, AWS Glue, AWS RDS, AWS S3, AWS Athena, AWS Redshift, AWS EventBridge, PySpark, Snowflake, Git, Azure DevOps, JIRA, cloud computing, Agile methodologies, automation, and Talend.

If you are passionate about working in a dynamic environment that values equal employment opportunity and embraces diverse perspectives, we invite you to join our team at KPMG in India.

Posted 2 weeks ago

Apply

2.0 - 7.0 years

6 - 16 Lacs

chennai, coimbatore

Hybrid

We are seeking an experienced AWS PySpark Developer with 2-8 years of experience to design, build, and optimize our data pipelines and analytics architecture. The ideal candidate will have a strong background in data wrangling and analysis, with a deep understanding of AWS data services.

Key Responsibilities:
- Design, build, and optimize robust data pipelines and data architecture on the AWS cloud platform.
- Wrangle, explore, and analyze large datasets to identify trends, answer business questions, and pinpoint areas for improvement.
- Develop and maintain a next-generation analytics environment, providing a self-service, centralized platform for all data-centric activities.
- Formulate and implement distributed algorithms for effective data processing and trend identification.
- Configure and manage Identity and Access Management (IAM) on the AWS platform.
- Collaborate with stakeholders to understand data requirements and deliver effective solutions.

Required Skills & Experience:
- 2-8 years of experience as a Data Engineer or Developer.
- Proven experience building and optimizing data pipelines on AWS.
- Proficiency in scripting with Python.
- Strong working knowledge of:
  - Big data tools: AWS Athena.
  - Relational and NoSQL databases: AWS Redshift and PostgreSQL.
  - Data pipeline tools: AWS Glue, AWS Data Pipeline, or AWS Lake Formation.
  - Container orchestration: Kubernetes, Docker, Amazon ECR/ECS/EKS.
- Experience wrangling, exploring, and analyzing data.
- Strong organizational and problem-solving skills.

Preferred Skills:
- Experience with machine learning tools (SageMaker, TensorFlow).
- Working knowledge of stream processing (Kinesis, Spark Streaming).
- Experience with analytics and visualization tools (Tableau, Power BI).
- Knowledge of optimizing AWS Redshift performance.
- Familiarity with SAP BusinessObjects (BO).
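Because Athena bills by data scanned, the "optimize" part of a role like this is largely about projecting only the needed columns and filtering on partition keys. A minimal illustration; the table and partition column (`dt`) are hypothetical:

```python
def build_pruned_query(table: str, columns: list[str], day: str) -> str:
    """Compose an Athena SELECT that limits scanned bytes: project only
    the needed columns and filter on the partition column `dt`."""
    return f"SELECT {', '.join(columns)} FROM {table} WHERE dt = '{day}'"

q = build_pruned_query("events", ["user_id", "amount"], "2024-03-07")
print(q)  # SELECT user_id, amount FROM events WHERE dt = '2024-03-07'
```

With columnar storage such as Parquet or ORC, the column projection matters too: Athena reads only the referenced columns, while `SELECT *` over an unpartitioned table scans everything.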

Posted 4 weeks ago

Apply

3.0 - 7.0 years

0 Lacs

ahmedabad, gujarat

On-site

We are seeking a detail-driven and highly analytical AWS Cost Analyst & Leader with over 3 years of experience for a full-time night-shift role aligned with the Eastern Standard Time (EST) zone. Your primary responsibility will be to monitor, analyze, and report on AWS usage and billing data using tools like AWS Cost Explorer and the Cost and Usage Report (CUR). You will also build and maintain advanced Google Sheets dashboards and models to track cloud expenses and usage trends.

As an AWS Cost Analyst, you will identify cost anomalies, recommend optimizations, and assist the System/Infra team in implementing strategies for cost efficiency. Collaboration with the DevOps and Engineering teams is essential to ensure accurate cost attribution. You will develop scripts or use automation tools to extract, clean, and analyze cloud billing data. Ensuring that tagging policies and organizational standards are followed for cost accountability is also part of the role, along with assisting in forecasting, budgeting, and monthly reconciliation of cloud spend. Clear communication of cost reports and findings through presentations or documentation is crucial in this position.

To be successful in this role, you must have at least 3 years of hands-on experience in cloud cost analysis, preferably in an AWS environment. Expert-level proficiency in Google Sheets, including formulas, pivot tables, charts, and data visualization, is required, as are strong analytical and mathematical skills and excellent written and verbal communication skills for explaining complex data clearly to stakeholders. Familiarity with AWS billing tools such as Cost Explorer, CUR, Budgets, and Tags is necessary. Attention to detail and the ability to work independently on a night shift are key attributes for this position.

Preferred qualifications include hands-on experience with AWS Cost Explorer and AWS Billing & Cost Management, familiarity with tools like AWS Athena and Amazon CloudWatch, solid experience creating and understanding Google Sheets formulas, and scripting skills (e.g., JavaScript/TypeScript, SQL) for automation. This role requires working night shifts in the EST time zone, with flexibility to align with US-based teams; you must be available during core EST business hours. By joining our team, you will play a key role in cloud cost efficiency and financial governance, work with a collaborative global team, and enjoy competitive compensation and growth opportunities.
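The cost-anomaly duty described above can be approximated with a trailing-average rule before reaching for anything heavier. The daily figures are invented and the 1.5x threshold is an assumption, not an AWS default:

```python
def flag_cost_anomalies(daily_cost, threshold=1.5):
    """Flag days whose spend exceeds `threshold` times the trailing
    average of all prior days -- a simplified anomaly check."""
    flagged = []
    for i in range(1, len(daily_cost)):
        day, cost = daily_cost[i]
        prior = [c for _, c in daily_cost[:i]]
        if cost > threshold * (sum(prior) / len(prior)):
            flagged.append(day)
    return flagged

costs = [("Mon", 100.0), ("Tue", 110.0), ("Wed", 300.0)]
print(flag_cost_anomalies(costs))  # ['Wed']
```

In practice the daily figures would come from the CUR queried via Athena or from the Cost Explorer API, and AWS also offers a managed Cost Anomaly Detection service for the same purpose.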

Posted 1 month ago

Apply

13.0 - 17.0 years

0 Lacs

haryana

On-site

As a Technical Support Analyst (Data-Focused) at our company, you will play a crucial role in our growing analytics/tech-support team, applying your strong analytical mindset, Python and SQL skills, and experience with data analysis tools. Your primary responsibilities will include investigating and resolving data inconsistencies, writing and debugging Python scripts, running SQL queries for data investigations, troubleshooting technical problems, and collaborating with various teams to ensure data quality and integrity. Additionally, you will build internal tools and maintain documentation of findings and processes.

The ideal candidate has strong Python skills for scripting and automation, proficiency in SQL, hands-on experience with AWS Athena or similar query engines, coding/debugging experience, and keen attention to detail. A strong analytical and problem-solving mindset is essential, as is comfort working with structured and unstructured data in CLI/terminal environments. Preferred, nice-to-have skills include experience with AWS tools, familiarity with Git and ETL pipelines, exposure to dashboards, and the ability to work with REST APIs. A bachelor's degree in a technical field or equivalent practical experience, along with at least 3 years in a technical analyst or data support role, is required. You should be a strong communicator capable of explaining technical findings to both technical and non-technical audiences, possess a team-first attitude, and be proactive, detail-oriented, and solution-driven. Comfort with multitasking between support tickets, tooling, and analysis is a plus.

Join our team at Affle, a global technology company focused on delivering consumer recommendations and conversions through mobile advertising. Affle's platform aims to enhance marketing ROI through contextual mobile ads while reducing digital ad fraud. Affle powers consumer journeys through its Affle2.0 Consumer Platforms Stack, which includes Appnext, Jampp, MAAS, mediasmart, RevX, Vizury, and YouAppi. YouAppi is a programmatic mobile app retargeting company that focuses on app marketing funnel approaches to activate valuable users and grow app revenue, offering tailored retargeting solutions with flexible integrations, optimizations, and transparent reporting. For more information, visit www.affle.com and https://youappi.com/.
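A first pass at the "investigate data inconsistencies" duty in this listing is usually a key-based reconciliation between two systems. A minimal sketch with invented records, where each side might in reality be the result set of an Athena or warehouse query:

```python
def reconcile(source_rows, target_rows, key):
    """Compare two datasets by key; report ids present on only one
    side -- a typical first step when chasing an inconsistency."""
    src = {r[key] for r in source_rows}
    tgt = {r[key] for r in target_rows}
    return {
        "missing_in_target": sorted(src - tgt),
        "missing_in_source": sorted(tgt - src),
    }

events_db = [{"id": 1}, {"id": 2}, {"id": 3}]
events_warehouse = [{"id": 2}, {"id": 3}, {"id": 4}]
print(reconcile(events_db, events_warehouse, "id"))
# {'missing_in_target': [1], 'missing_in_source': [4]}
```

The ids that surface on one side only become the candidates for log analysis: were they dropped by the pipeline, filtered intentionally, or written late?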

Posted 1 month ago

Apply

3.0 - 7.0 years

0 Lacs

maharashtra

On-site

The role involves working with customer segments and behavioral data to identify opportunities for enhancing approval rates, reducing friction, and balancing risk across different user profiles and markets. You should have strong technical proficiency in analytics tools such as Tableau and Superset, and in SQL/NoSQL databases. Advanced SQL skills for complex queries, performance tuning, and transaction-safe data manipulation are essential, as is proficiency with AWS Athena for querying large datasets in S3 using optimized, cost-effective practices. Additionally, you should bring business-oriented thinking that connects analytical work with broader business outcomes, understanding how data impacts approval rates, revenue, and customer experience. Collaboration and cross-functionality are key, as you will work across teams such as Marketing, Monetization, Engineering, Finance, and Customer Care to foster shared understanding and collective problem-solving.

Nice-to-have qualifications include experience with online subscription business models; knowledge of recurring revenue dynamics, churn management, and retention strategies; and an understanding of key metrics such as LTV and MRR, which are crucial for sustainable subscription growth. External communication skills are also beneficial for articulating technical and performance issues clearly, collaborating on investigations, and driving resolution while maintaining productive, professional relationships.

Working with the company offers impactful work that directly shapes the future of the organization. You'll be part of an innovative environment that encourages trying new things and pushing the envelope in EdTech. The role provides flexibility, with remote work or a hybrid setup from one of the company's offices in Cyprus or Poland. Health benefits include a health insurance package for hybrid mode and a health corner in the Cyprus office. AI tools such as Cursor, Claude Code, and a ChatGPT subscription are part of the work environment. A competitive salary, flexible paid time off with 21 days of annual leave and 10 bank holidays, and a collaborative culture alongside passionate professionals complete the benefits package. We look forward to reviewing your CV for this role.
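A concrete way to reason about the "cost-effective Athena practices" this listing asks for: Athena's pricing model is per volume of data scanned (commonly quoted as $5 per TB; verify against current AWS pricing), so a quick estimator helps compare query designs. The rate below is an assumption baked into the default:

```python
def athena_scan_cost(bytes_scanned: int, price_per_tb: float = 5.0) -> float:
    """Estimate an Athena query's cost from bytes scanned; the default
    $5/TiB rate is an assumption -- check current AWS pricing."""
    return round(bytes_scanned / 1024**4 * price_per_tb, 6)

print(athena_scan_cost(1024**4))       # 5.0 -- a full 1 TiB scan
print(athena_scan_cost(10 * 1024**3))  # 0.048828 -- 10 GiB after pruning
```

Seen this way, partition pruning and Parquet column projection are not just performance tricks: dropping a scan from 1 TiB to 10 GiB cuts the query's cost by roughly two orders of magnitude.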

Posted 1 month ago

Apply

5.0 - 9.0 years

0 Lacs

maharashtra

On-site

The main responsibilities for this role include working with customer segments and behavioral data to identify opportunities for enhancing approval rates, reducing friction, and managing risk across various user profiles and markets. To be successful in this position, you should have strong technical proficiency in analytics tools such as Tableau, SuperSet, as well as SQL/NoSQL databases. Additionally, advanced SQL skills are required for executing complex queries, optimizing performance, and ensuring transaction-safe data manipulation. Proficiency with AWS Athena for querying large datasets in S3 using cost-effective practices is also essential. A key aspect of this role involves having a business-oriented mindset to connect analytical work with broader business outcomes, understanding how data influences approval rates, revenue, and customer experience. Collaboration and cross-functionality are crucial, as you will be working across teams such as Marketing, Monetization, Engineering, Finance, and Customer Care to foster shared understanding and solve problems collectively. It would be beneficial to have experience with Online Subscription Business Models, including knowledge of recurring revenue dynamics, churn management, and retention strategies. Understanding key metrics such as LTV and MRR is important for driving sustainable subscription growth. Additionally, having strong external communication skills to articulate technical issues, collaborate on investigations, and maintain professional relationships is a plus. Working in this role offers the opportunity to make a direct impact on the future of the company. The environment is innovative, encouraging employees to explore new ideas and push the boundaries in EdTech. The role provides flexibility, allowing for remote or hybrid work options from offices in Cyprus or Poland. Health benefits include a health insurance package for hybrid mode and a health corner in the Cyprus office. 
AI tools such as Cursor, Claude Code, and a ChatGPT subscription are available, among others. A competitive salary, flexible paid time off, a collaborative culture, and a team of passionate professionals await you. If you meet these qualifications and are interested in this position, we look forward to receiving your CV.
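For illustration, the "cost-effective practices" with Athena mentioned in this listing mostly come down to partition pruning and column selection, since Athena bills by bytes scanned. A minimal sketch; the table, column, and partition names are hypothetical:

```python
# Athena bills by data scanned, so restricting a query to specific
# partitions (and selecting only needed columns) keeps costs down.
# Table, column, and partition names here are hypothetical.

def build_partitioned_query(table, columns, year, month):
    """Build a SELECT restricted to one year/month partition so Athena
    scans a single S3 prefix instead of the whole table."""
    cols = ", ".join(columns)
    return (
        f"SELECT {cols} FROM {table} "
        f"WHERE year = '{year}' AND month = '{month:02d}'"
    )

query = build_partitioned_query("events", ["user_id", "status"], 2024, 3)
print(query)
# SELECT user_id, status FROM events WHERE year = '2024' AND month = '03'
```

Selecting named columns instead of `*` further reduces scanned bytes when the data is stored in a columnar format such as Parquet, since Athena then reads only the referenced columns.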

Posted 1 month ago

Apply

13.0 - 17.0 years

0 Lacs

haryana

On-site

You will be joining our team as a Technical Support Analyst (Data-Focused), working remotely to support our growing Analytics/tech-support team. In this hybrid role, you will be responsible for bridging data analysis, technical investigation, and hands-on coding to resolve data issues, maintain script-based tools, and provide technical insights to our internal stakeholders. The ideal candidate should possess a strong analytical mindset, excellent Python and SQL skills, experience with Athena or large-scale datasets, and a keen attention to detail.

Your primary responsibilities will include investigating and resolving data inconsistencies or anomalies reported by business or engineering teams, writing and debugging Python scripts for automation and data analysis, running SQL queries for data investigations, analyzing logs and data flows to troubleshoot technical issues, building and supporting internal tools for monitoring and reporting, and collaborating with product, data, and engineering teams to ensure data quality and integrity. Additionally, you will be expected to maintain detailed documentation of your findings, tools, and processes.

To excel in this role, you should have strong Python skills for scripting and basic data processing, proficiency in SQL including joins and analytical queries, hands-on experience with AWS Athena or similar distributed query engines, general coding/debugging experience, familiarity with CLI/terminal environments and structured/unstructured data, a proven attention to detail, strong analytical and problem-solving skills, and technical curiosity.
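As a sketch of the kind of data-inconsistency investigation described above, the following pure-Python check flags duplicate IDs and missing required fields; the record layout is a hypothetical example:

```python
# Minimal data-consistency check: flag records with duplicate IDs or
# missing required fields. Field names ("id", "amount") are illustrative.

def find_anomalies(records):
    seen, anomalies = set(), []
    for rec in records:
        rid = rec.get("id")
        if rid is None or rec.get("amount") is None:
            anomalies.append(("missing_field", rec))   # incomplete record
        elif rid in seen:
            anomalies.append(("duplicate_id", rec))    # ID seen before
        else:
            seen.add(rid)
    return anomalies

rows = [
    {"id": 1, "amount": 10.0},
    {"id": 1, "amount": 12.5},   # duplicate id
    {"id": 2, "amount": None},   # missing amount
]
print(find_anomalies(rows))
```

In practice a check like this would run over query results pulled from Athena or a relational database, with the anomaly list feeding a report or ticket.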
Preferred skills that would be a bonus include experience with AWS tools such as S3, Glue, Lambda, and CloudWatch, familiarity with Git and basic version control workflows, understanding of ETL pipelines, data validation, or schema management, exposure to dashboards like Metabase, Looker, or Tableau, ability to work with REST APIs for debugging or automation, and experience in cross-functional, collaborative teams.

In terms of education and experience, a Bachelor's degree in a technical field (Computer Science, Engineering, Data Science) or equivalent practical experience is required, along with at least 3 years of experience in a technical analyst, data support, or operational engineering role. We are looking for a candidate with strong communication skills to explain findings to both technical and non-technical audiences, a team-first attitude willing to help others and adapt to changing priorities, a proactive, detail-oriented, and solution-driven approach, and the ability to multitask between support tickets, tooling, and analysis.

Joining our team means being part of Affle, a global technology company focused on delivering consumer recommendations and conversions through relevant mobile advertising. Affle aims to enhance returns on marketing investment by reducing digital ad fraud and powering unique consumer journeys for marketers. Affle Holdings is the Singapore-based promoter for Affle 3i Limited, with investors including Microsoft and Bennett Coleman & Company (BCCL). If you are interested in being part of a dynamic team and making a difference in the mobile advertising industry, please visit our website at www.affle.com for more information.

Posted 1 month ago

Apply

3.0 - 7.0 years

0 Lacs

maharashtra

On-site

We have an exciting and rewarding opportunity for you to take your software engineering career to the next level. As a Software Engineer III - Java Full Stack Developer + React + AWS at JPMorgan Chase within the Commercial & Investment Bank team, you'll serve as a member of an agile team to design and deliver trusted market-leading technology products in a secure, stable, and scalable way. You'll be responsible for carrying out critical technology solutions across multiple technical areas within various business functions in support of the firm's business objectives.

Execute software solutions, design, development, and technical troubleshooting with the ability to think beyond routine or conventional approaches to build solutions or break down technical problems. Create secure and high-quality production code and maintain algorithms that run synchronously with appropriate systems. Create architectural and design documentation for complex applications, ensuring that the software code development adheres to the specified design constraints. Gather, analyze, synthesize, and develop visualizations and reporting from large, diverse data sets in service of continuous improvement of software applications and systems. Proactively identify hidden problems and patterns in data and use these insights to drive improvements to coding hygiene and system architecture. Contribute to software engineering communities of practice and events that explore new and emerging technologies. Add to the team culture of diversity, equality, inclusion, and respect.

Required qualifications, capabilities, and skills include formal training or certification on software engineering concepts and 3+ years of applied experience. Hands-on practical experience in system design, application development, testing, and operational stability. Strong experience in the latest Java versions, Spring Boot and the Spring Framework, JDBC, and JUnit. Experience with RDBMS and NoSQL databases.
Demonstrated knowledge of software applications and technical processes within a technical discipline (e.g., cloud, artificial intelligence, machine learning, mobile, etc.). Proficiency in Java/J2EE, REST APIs, and Web Services, and experience in building event-driven microservices. Experience in developing UI applications using React and Angular. Working proficiency in development toolsets like Git/Bitbucket, JIRA, and Maven. Proficiency in automation and continuous delivery methods. Strong analytical skills and problem-solving ability. Working knowledge of AWS is required, and AWS certification is a must. Solid understanding of agile methodologies such as CI/CD, Application Resiliency, and Security. Design, deploy, and manage AWS cloud infrastructure using services such as EC2, S3, RDS, Kubernetes, Terraform, Lambda, and VPC. Working knowledge of AWS Glue, AWS Athena, and AWS S3. Experience in developing, debugging, and maintaining code in a large corporate environment with one or more modern programming languages and database querying languages. Hands-on practical experience delivering system design, application development, testing, and operational stability. Collaborate with development teams to create scalable, reliable, and secure cloud architectures.

Preferred qualifications, capabilities, and skills include exposure to the latest Python libraries, knowledge of AWS Lake Formation, familiarity with modern front-end technologies, experience with big data technologies such as Hadoop, and experience with caching solutions such as Hazelcast and Redis.

Posted 1 month ago

Apply

6.0 - 10.0 years

0 Lacs

kolkata, west bengal

On-site

You must have knowledge of Azure Data Lake, Azure Functions, Azure Databricks, Azure Data Factory, and PostgreSQL. Working knowledge of Azure DevOps and Git flow would be an added advantage. Alternatively, you should have working knowledge of AWS Kinesis, AWS EMR, AWS Glue, AWS RDS, AWS Athena, and AWS Redshift. Demonstrable expertise in working with time-series data is essential. Experience in delivering data engineering/data science projects in Industry 4.0 is an added advantage. Knowledge of Palantir is required.

You must possess strong problem-solving skills with a focus on sustainable and reusable development. Proficiency in statistical computing languages like Python/PySpark and libraries such as Pandas, NumPy, and seaborn/matplotlib is necessary. Knowledge of Streamlit is a plus. Familiarity with Scala, GoLang, Java, and big data tools such as Hadoop, Spark, and Kafka is beneficial. Experience with relational databases like Microsoft SQL Server, MySQL, PostgreSQL, and Oracle, and NoSQL databases including Hadoop, Cassandra, and MongoDB, is expected. Proficiency in data pipeline and workflow management tools like Azkaban, Luigi, and Airflow is required. Experience in building and optimizing big data pipelines, architectures, and data sets is crucial. You should possess strong analytical skills related to working with unstructured datasets.

Provide innovative solutions to data engineering problems, document technology choices and integration patterns, and apply best practices for project delivery with clean code. Demonstrate innovation and proactiveness in meeting project requirements.

Reporting to: Director - Intelligent Insights and Data Strategy. Travel: Must be willing to be deployed at client locations worldwide for long and short terms, and flexible for shorter durations within India and abroad.
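The time-series expertise called for above often starts with simple smoothing of sensor or telemetry readings. A pure-Python sketch of a trailing moving average; in practice this would typically be done with Pandas or PySpark over much larger data:

```python
# Trailing (backward-looking) moving average over a series of readings.
# The first window-1 points have no full window and are skipped.

def moving_average(values, window):
    out = []
    for i in range(window - 1, len(values)):
        out.append(sum(values[i - window + 1 : i + 1]) / window)
    return out

readings = [10.0, 12.0, 11.0, 13.0, 12.0]   # illustrative sensor values
print(moving_average(readings, 3))           # [11.0, 12.0, 12.0]
```

The same idea scales to distributed engines via windowed aggregations (e.g., window functions in Spark SQL), where the window is defined over event timestamps rather than list positions.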

Posted 1 month ago

Apply

9.0 - 13.0 years

0 Lacs

haryana

On-site

You will be joining Research Partnership (part of Inizio Advisory) as a Dashboard Developer - Senior Manager based in Gurgaon, India. Research Partnership is a leading pharma market research and consulting agency with a global presence across London, Lyon, New York, Philadelphia, San Francisco, Singapore, and Delhi. The company's work focuses on making a difference to human health, celebrating progress through innovation, and putting people at the core of all activities. As part of the Data Delivery & Dashboards Team within the Data Management & Delivery division, your primary responsibility will involve leading the design, development, and delivery of impactful dashboards and visualizations. You will be leading a team of dashboard developers, ensuring alignment with stakeholder expectations, and driving innovation in dashboard solutions. Collaboration with researchers, analysts, and business leaders globally will be essential to ensure that the visual outputs provide clarity, impact, and value. Your key responsibilities will include developing interactive dashboards using tools such as PowerBI or Tableau, managing and mentoring a team of dashboard developers, translating complex project requirements into scalable dashboards, collaborating with internal stakeholders to align outputs with business needs, ensuring data accuracy and security, and staying updated on BI and visualization trends to implement improvements. To excel in this role, you should have extensive experience in BI/dashboard development and data engineering, along with significant experience in people management and team leadership. Strong engagement with senior stakeholders, a track record of delivering enterprise-grade dashboards, and a background in healthcare or market research are highly desirable qualifications. 
The ideal candidate for this position is a visionary thinker who can lead and inspire dashboard teams, possesses excellent communication and stakeholder management skills, has a deep understanding of data storytelling and visual best practices, and is a hands-on leader capable of driving innovation while ensuring delivery excellence.

Research Partnership offers a dynamic and supportive work environment that encourages continuous learning and innovation. The company provides comprehensive training and development opportunities for all employees, including international travel and collaboration, within a relaxed and friendly setting. Research Partnership is part of Inizio Advisory, a strategic advisor to pharmaceutical and life science companies, offering market research, insights, strategy, consulting, and commercial benchmarking services. Inizio Advisory aims to support clients at every stage of the product and patient journey, creating long-term value through sector-specific solutions and intelligence.

If you are passionate about the role but do not meet every job requirement, you are encouraged to apply, as Research Partnership values diversity, inclusion, and authenticity in the workplace. Your unique experience and perspective may be the perfect fit for this role or other opportunities within the company.

Posted 2 months ago

Apply

7.0 - 11.0 years

0 Lacs

haryana

On-site

As a Dashboard Developer - Manager at Research Partnership in Gurgaon, India, you will be responsible for designing, building, and maintaining high-impact dashboards and data visualizations that translate raw market research data into actionable insights. Collaborating with researchers, analysts, and engineers, you will ensure seamless data flow and visual storytelling. Your primary role will involve developing and maintaining interactive dashboards using Power BI, Tableau, or similar BI tools. You will need to translate project requirements into intuitive visual stories, collaborate with scripting and data processing teams to streamline workflows, ensure data accuracy and security adherence, automate reporting processes, and stay updated on BI trends.

For this role, you should have at least 6 years of hands-on experience in BI/dashboard development and a proven track record across the data-to-dashboard lifecycle. A background in healthcare or market research is preferred. Technical expertise required includes backend development skills in PHP (6+ years) with frameworks like Laravel or CodeIgniter, REST and SOAP API design, proficiency in databases like PostgreSQL, MySQL, and MS SQL, and experience with big data engines such as Google BigQuery and AWS Athena. Frontend/visualization skills in HTML, CSS, JavaScript, React, Vue.js, and jQuery, and visual libraries like Chart.js, D3.js, Highcharts, and Google Charts, are necessary. Experience with cloud deployment (AWS and Google Cloud), containers (Docker, Vagrant, VirtualBox), CI/CD pipelines (Jenkins, CircleCI, GitHub Actions), caching technologies (Redis, Memcached), and security protocols is also essential. You should be familiar with data access control, role-based permissions, PHP unit testing, version control, and Agile collaboration tools.

The ideal candidate for this role is a detail-oriented visual storyteller with a problem-solving mindset, strong communication skills, and a collaborative approach to work.
Research Partnership offers a supportive environment with comprehensive training programs, international travel opportunities, and a relaxed working atmosphere. Inizio Advisory, of which Research Partnership is a part, is dedicated to providing strategic support to pharmaceutical and life science companies, helping them navigate the product and patient journey. The company values diversity, inclusivity, and authenticity in the workplace, encouraging candidates to apply even if they do not meet all qualifications.

Posted 2 months ago

Apply

9.0 - 23.0 years

0 Lacs

haryana

On-site

You will be joining Research Partnership, a leading pharma market research and consulting agency, as a Dashboard Developer - Senior Manager based in Gurgaon, India. As part of the Data Delivery & Dashboards Team, your main responsibility will be to oversee the design, development, and delivery of impactful dashboards and visualizations using tools like Power BI and Tableau. You will also lead a team of developers, ensuring alignment with stakeholder expectations and driving innovation in dashboard solutions. Your role will involve translating complex project requirements into user-friendly dashboards, collaborating with internal stakeholders to meet business needs, and maintaining high standards of data accuracy, security, and responsiveness. You will stay updated on BI and visualization trends to implement improvements proactively. To excel in this role, you should have at least 10 years of experience in BI/dashboard development and data engineering, with a background in healthcare or market research being advantageous. Strong leadership and team management skills are essential, along with the ability to engage with senior stakeholders globally. You should be a visionary thinker, an excellent communicator, and have a deep understanding of data storytelling and user experience. Your technical expertise should cover backend development using PHP and frameworks like Laravel, frontend technologies including HTML, CSS, and JavaScript, database proficiency in PostgreSQL and MySQL, cloud deployment skills with AWS and Google Cloud, and experience with CI/CD tools like Jenkins and GitHub Actions. Knowledge of caching mechanisms, security protocols, and agile collaboration tools is also required. At Research Partnership, you will work in a dynamic and innovative environment that encourages continuous learning and offers opportunities for international travel and collaboration. 
The company values diversity and inclusivity, so even if you don't meet every job requirement, your enthusiasm and potential may make you the right fit for the role. Apply now to be a part of a team that values creativity, growth, and excellence in delivering market research solutions.

Posted 2 months ago

Apply

2.0 - 7.0 years

20 - 25 Lacs

Bengaluru

Work from Office

Experience: 2+ years. Shift: (GMT+05:30) Asia/Kolkata (IST). Opportunity type: Hybrid (Bengaluru). Must-have skills: AWS, GoLang, Python.

Requirements: We are looking for a Backend Engineer to help us through the next level of technology changes needed to revolutionize healthcare for India. We are seeking individuals who can understand real-world scenarios and come up with scalable tech solutions for millions of patients to make healthcare accessible. The role comes with a good set of challenges to solve and offers an opportunity to build new systems that will be rolled out at scale. You have 2 to 4 years or more of software development experience with expertise in designing and implementing high-performance web applications. Very strong understanding of and experience with any of Java, Scala, GoLang, or Python. Experience writing optimized queries in relational databases like MySQL, Redshift, or Postgres. You have exposure to basic data engineering concepts like data pipelines, Hadoop, or Spark. You write clean and testable code, and you love to build platforms that enable other teams to build on top of them.

Some of the challenges we solve include: Clinical decision support. Early Detection: digitally assist doctors in identifying high-risk patients for early intervention. Track & Advise: analyze patients' vitals/test values across visits to assist doctors in personalizing chronic care. Risk Prevention: assist doctors in monitoring the progression of chronic disease by drawing attention to additional symptoms and side effects. EMR (Electronic Medical Records): clinical software to write prescriptions and manage clinical records. AI-powered features. Adapts to the doctor's practice: learns from doctors' prescribing preferences and provides relevant auto-fill recommendations for faster prescriptions. Longitudinal patient journey: AI analyses the longitudinal journey of patients to assist doctors in early detection.
Medical language processing: AI-driven automatic digitization of printed prescriptions and test reports. Core platform: a pharma advertising platform reaching doctors at the moment of truth, and real-world evidence to generate market insights for B2B consumption. Virtual store: online pharmacy and diagnostic solutions helping patients with one-click ordering.

Technologies we use: Distributed tech: Kafka, Elasticsearch. Databases: MongoDB, RDS. Cloud platform: AWS. Languages: GoLang, Python, PHP. UI tech: React, React Native. Caching: Redis. Big data: AWS Athena, Redshift. APM: New Relic.

Responsibilities: Develop testable and reusable services with structured, granular, and well-commented code. Contribute in the areas of API building, data pipeline setup, and new tech initiatives needed for a core platform. Acclimate to new technologies and situations as the company demands, with the vision of providing the best customer experience. Meet expected deliverables and quality standards with every release. Collaborate with teams to design, develop, test, and refine deliverables that meet the objectives. Perform code reviews and implement improvement plans.

Additional responsibilities: Pitch in during the design and architectural solutioning of business problems. Organize, lead, and motivate the development team to meet expected timelines and quality standards across releases. Actively contribute to development process improvement plans. Assist peers through code reviews and juniors through mentoring.

Must-have skills: Sound understanding of computer science fundamentals, including data structures and space and time complexity. Excellent problem-solving skills. Solid understanding of any of the modern object-oriented programming languages (like Java, Ruby, or Python) and/or functional languages (like Scala or GoLang). Understanding of MPP (massively parallel processing) and frameworks like Spark. Experience working with databases (RDBMS: MySQL, Redshift, etc.; NoSQL: Couchbase, MongoDB, Cassandra, etc.). Experience working with open-source libraries and frameworks. Strong command of versioning tools like Git/Bitbucket.

Good-to-have skills: Knowledge of microservices architecture. Experience working with Kafka. Experience with or exposure to ORM frameworks (like ActiveRecord or SQLAlchemy). Working knowledge of full-text search (like Elasticsearch or Solr).

Skills: AWS, GoLang, Python
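The data structures and complexity fundamentals listed above, in one concrete contrast: membership tests against a list cost O(n) per lookup, while a set averages O(1), turning an O(n*m) scan into O(n + m):

```python
# Same result, different complexity: intersecting two ID collections.

def common_ids_slow(a, b):
    return [x for x in a if x in b]          # O(len(a) * len(b)): list scan per lookup

def common_ids_fast(a, b):
    lookup = set(b)                          # O(len(b)) to build the hash set
    return [x for x in a if x in lookup]     # O(len(a)): O(1) average per lookup

a, b = [1, 2, 3, 4], [3, 4, 5]
assert common_ids_slow(a, b) == common_ids_fast(a, b) == [3, 4]
```

On two collections of a million elements each, the difference is roughly a trillion comparisons versus a couple of million hash operations, which is the kind of reasoning an interviewer probing "space and time complexity" is after.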

Posted 2 months ago

Apply

10.0 - 15.0 years

12 - 16 Lacs

Hyderabad

Work from Office

Job type: Contract to hire.

Job Description:

Tableau Administration & Infrastructure Management: Install, configure, and maintain Tableau Server across multi-node environments. Manage Tableau Cloud (Online) and its integrations with enterprise systems. Monitor server activity and usage statistics to identify performance enhancements. Troubleshoot and resolve Tableau Server issues using logs, the Tableau Repository, and monitoring tools. Manage Tableau Server upgrades and patching to ensure system stability and security.

Performance Optimization & Data Integrations: Identify and resolve Tableau dashboard performance issues, optimizing extract refreshes and queries. Integrate Tableau with diverse cloud (GCP, AWS Athena) and on-premises data sources. Work with development teams to ensure new features and integrations align with infrastructure capabilities.

Security, Authentication & Access Management: Configure and manage SSO, Azure AD authentication, SCIM provisioning, and MFA. Implement and enforce role-based access control (RBAC) and security policies.

User Support, Training & Governance: Provide training and support to end users on Tableau functionality and best practices. Create and manage custom administrative views using Tableau Repository data for user activity, license management, and monitoring. Collaborate with stakeholders to ensure Tableau governance and best practices are followed.

Vendor & Stakeholder Collaboration: Work closely with the Tableau vendor on complex issue resolution and enhancements. Coordinate with business users, developers, and IT teams to ensure a smooth Tableau experience.

If you are interested, please share your updated profile with the details below: current CTC, expected CTC, notice period, total experience, relevant experience.

Posted 3 months ago

Apply

6.0 - 11.0 years

25 - 30 Lacs

Bengaluru

Hybrid

Mandatory skills: Data Engineering, AWS Athena, AWS Glue, Redshift, Data Lake, Lakehouse, Python, SQL Server.

Must-have experience: 6+ years of hands-on data engineering experience. Expertise with AWS services: S3, Redshift, EMR, Glue, Kinesis, DynamoDB. Building batch and real-time data pipelines. Python and SQL coding for data processing and analysis. Data modeling experience using cloud-based data platforms like Redshift, Snowflake, or Databricks. Designing and developing ETL frameworks.

Nice-to-have experience: ETL development using tools like Informatica, Talend, or Fivetran. Creating reusable data sources and dashboards for self-service analytics. Experience using Databricks for Spark workloads, or Snowflake. Working knowledge of big data processing. CI/CD setup. Infrastructure-as-code implementation. Any one of the AWS Professional Certifications.
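As a minimal sketch of the batch extract-transform-load pattern this listing centers on, with hypothetical field names; a real pipeline would read from S3 and load into Redshift via Glue or similar rather than in-memory lists:

```python
# Toy batch ETL: parse raw CSV-like lines, clean and derive a field,
# then load into a sink. Field names ("name", "amount") are illustrative.

def extract(raw_lines):
    """Split non-empty lines into fields."""
    return [line.strip().split(",") for line in raw_lines if line.strip()]

def transform(rows):
    """Normalize names and convert currency to integer cents."""
    out = []
    for name, amount in rows:
        out.append({"name": name.title(), "amount_cents": int(float(amount) * 100)})
    return out

def load(records, sink):
    """Append records to the target; return the row count loaded."""
    sink.extend(records)
    return len(records)

sink = []
raw = ["alice,10.50", "", "bob,3.20"]
loaded = load(transform(extract(raw)), sink)
print(loaded, sink)
```

The same three stages map onto the AWS services named above: S3 objects in, a Glue or Spark job for the transform, and a Redshift `COPY` for the load.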

Posted 3 months ago

Apply

10.0 - 16.0 years

60 - 75 Lacs

Pune

Hybrid

Position Summary: As a Software Architect, you will be responsible for providing technical leadership and architectural guidance to development teams, ensuring the design and implementation of scalable, robust, and maintainable software solutions. You will collaborate with stakeholders, including business leaders, project managers, and developers, to understand requirements, define architectural goals, and make informed decisions on technology selection, system design, and implementation strategies. Additionally, you will mentor and coach team members, promote best practices, and foster a culture of innovation and excellence within the organization. This role is based in Redaptive's Pune, India office.

Responsibilities and Duties (time spent performing duty):

System Design and Architecture: 40%. Identify and propose technical solutions for complex problem statements. Provide an application-level perspective during design and implementation that accounts for cost constraints, testability, complexity, scalability, performance, migrations, etc. Provide technical leadership and guidance to development teams, mentoring engineers and fostering a culture of excellence and innovation. Review code and architectural designs to ensure adherence to coding standards, best practices, and architectural principles. Create and maintain architectural documentation, including architectural diagrams, design documents, and technical specifications, to ensure clarity and facilitate collaboration.

Software Design and Development: 50%. Gather and analyze requirements from stakeholders, understanding business needs and translating them into technical specifications. Work alongside teams at all stages of design and development, augmenting and supporting teams as needed. Collaborate with product managers, stakeholders, and cross-functional teams to define project scope, requirements, and timelines, and ensure successful project execution.
Knowledge Sharing and Continuous Improvement: 10%. Conduct presentations, workshops, and training sessions to educate stakeholders and development teams on architectural concepts, best practices, and technologies. Stay updated with emerging technologies, industry trends, and best practices in software architecture and development. Identify opportunities for process improvement, automation, and optimization in software development processes and methodologies. Share knowledge and expertise with team members through mentorship, training sessions, and community involvement.

Required Abilities and Skills: Strong analytical and troubleshooting skills. Excellent verbal and written communication skills. Ability to effectively communicate with stakeholders, including business leaders and project managers, to understand requirements and constraints. Works effectively with cross-functional teams, including developers, QA, product managers, and operations. Capability to understand the bigger picture and design systems that align with business goals, scalability requirements, and future growth. Ability to make tough decisions and take ownership of architectural choices, considering both short-term and long-term implications. Mastery of one or more programming languages commonly used in software development, such as Java, Python, or JavaScript. Expertise in SQL and NoSQL databases, including database design and optimization. Ability to quickly learn new technologies and adapt to changing requirements. Knowledge of techniques for designing scalable and high-performance web services, including load balancing, caching, and horizontal scaling. Knowledge of software design principles (e.g., object-oriented principles, data structures, and algorithms). Possesses a security mindset and drives adoption of best practices to design systems that are secure and resilient to security threats. Continuously learning and staying up to date with emerging technologies and best practices.
Domain knowledge in energy efficiency, solar/storage, or electric utilities is a plus.

Education and Experience: 10+ years of software development experience. Proven track record of delivering high-quality software solutions within deadlines. Demonstrated technical leadership experience. Experience with data-heavy systems like Databricks and DataOps. Experience with cloud (AWS) application development. Experience with Java and the Spring Framework strongly preferred. Experience with distributed architectures, SOA, microservices, and containerization technologies (e.g., Docker, Kubernetes). Experience designing and developing web-based applications and backend services.

Travel: This role may require 1-2 annual international work visits to the US.

The Perks! Equity plan participation. Medical and personal accident insurance. Support for hybrid working and relocation. Flexible time off. Continuous learning. Annual bonus, subject to company and individual performance.

The company is an Equal Opportunity Employer and drug-free workplace, and complies with labor laws as applicable. All duties and responsibilities are essential functions and requirements and are subject to possible modification to reasonably accommodate individuals with disabilities. The requirements listed in this document are the minimum levels of knowledge, skills, or abilities.

Posted 3 months ago

Apply

8 - 13 years

12 - 22 Lacs

Gurugram

Work from Office

Data & Information Architecture Lead, 8 to 15 years - Gurgaon.

Summary: An excellent opportunity for Data Architect professionals with expertise in data engineering, analytics, AWS, and databases. Location: Gurgaon.

Your Future Employer: A leading financial services provider specializing in delivering innovative and tailored solutions to meet the diverse needs of its clients, offering a wide range of services including investment management, risk analysis, and financial consulting.

Responsibilities: Design and optimize the architecture of the end-to-end data fabric, inclusive of the data lake, data stores, and EDW, in alignment with EA guidelines and standards for cataloging and maintaining data repositories. Undertake detailed analysis of the information management requirements across all systems, platforms, and applications to guide the development of information management standards. Lead the design of the information architecture across multiple data types, working closely with various business partners/consumers, the MIS team, the AI/ML team, and other departments to design, deliver, and govern future-proof data assets and solutions. Design and ensure delivery excellence for: a) large and complex data transformation programs, b) small and nimble data initiatives to realize quick gains, and c) work with OEMs and partners to bring the best tools and delivery methods. Drive data domain modeling, data engineering, and data resiliency design standards across the microservices and analytics application fabric for autonomy, agility, and scale.

Requirements: Deep understanding of the data and information architecture discipline, processes, concepts, and best practices. Hands-on expertise in building and implementing data architecture for large enterprises. Proven architecture modeling skills, and strong analytics and reporting experience. Strong data design, management, and maintenance experience. Strong experience with data modeling tools. Extensive experience in areas of cloud-native lake technologies, e.g., AWS native lake solutions.

Posted 4 months ago

Apply

0.0 - 2.0 years

3 - 5 Lacs

gurugram

Work from Office

Role Overview: The Business Analyst / Data Analyst will work closely with Sales, Marketing, Operations, Product, and Customer Support teams to deliver insights that improve performance and efficiency. You'll be responsible for collecting, analyzing, and reporting business data, building dashboards, and helping teams make data-backed decisions. This is a hands-on, execution-driven role where you'll learn to connect business problems with analytical solutions, contribute to company growth, and develop your skills in SQL, BI tools, and analytics frameworks.

Key Responsibilities: Build and maintain dashboards, reports, and data models to track business performance and enable decision-making. Collect, clean, and analyze data to identify trends, anomalies, and opportunities for growth or efficiency. Conduct customer and revenue analyses (e.g., churn, retention) and provide actionable insights. Support operational and financial analysis by tracking shipment performance, cost structures, and margins. Write and optimize SQL queries, ensure data accuracy, and contribute to automation and BI improvements. Collaborate with cross-functional teams (Sales, Marketing, Operations, Finance) to translate business needs into data-driven insights.

Who You Are: 0-3 years of experience in business analytics, data analysis, or BI roles. Strong skills in SQL, Python, AWS Athena, and S3 (must-have), with familiarity with R for analysis as a plus. Hands-on experience with BI tools like Amazon QuickSight, Power BI, or Google Analytics is a plus. Good understanding of metrics such as revenue KPIs, customer behavior, and operational performance. Analytical mindset with strong attention to detail and problem-solving ability. Eager to learn, highly collaborative, and comfortable working in a fast-paced environment. Strong communication skills, able to present data simply and clearly.

Why Join Us? Impact: Work on real business problems in a high-growth logistics-tech company.
Learning: Gain end-to-end exposure to analytics across Sales, Marketing, Operations, and Finance. Growth: Fast career progression, mentorship from experienced leaders, and hands-on training. Culture: A high-energy, performance-driven, and data-first workplace. Perks: Competitive salary, learning opportunities, and growth incentives.
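The churn and retention analyses mentioned in this role reduce, at their simplest, to comparing customer sets across periods. A minimal sketch with illustrative customer IDs:

```python
# Churn rate: fraction of period-start customers who are gone at period end.

def churn_rate(start_customers, end_customers):
    start, end = set(start_customers), set(end_customers)
    if not start:
        return 0.0                    # no base population, no churn
    churned = start - end             # present at start, absent at end
    return len(churned) / len(start)

jan = {"c1", "c2", "c3", "c4"}
feb = {"c1", "c3", "c5"}              # c5 is new; c2 and c4 churned
print(churn_rate(jan, feb))           # 0.5
```

Retention is simply the complement (1 minus churn), and in production the two customer sets would typically come from an Athena or SQL query grouping active customers by month.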

Posted Date not available

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot


Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies