12.0 years
0 Lacs
India
On-site
We are seeking a highly skilled and experienced AWS Architect with a strong background in Data Engineering and expertise in Generative AI. In this pivotal role, you will be responsible for designing, building, and optimizing scalable, secure, and cost-effective data solutions that leverage the power of AWS services, with a particular focus on integrating and managing Generative AI capabilities. The ideal candidate will possess a deep understanding of data architecture principles, big data technologies, and the latest advancements in Generative AI, including Large Language Models (LLMs) and Retrieval Augmented Generation (RAG). You will work closely with data scientists, machine learning engineers, and business stakeholders to translate complex requirements into robust and innovative solutions on the AWS platform.

Responsibilities:
• Architect and Design: Lead the design and architecture of end-to-end data platforms and pipelines on AWS, incorporating best practices for scalability, reliability, security, and cost optimization.
• Generative AI Integration: Architect and implement Generative AI solutions using AWS services like Amazon Bedrock, Amazon SageMaker, Amazon Q, and other relevant technologies. This includes designing RAG architectures, prompt engineering strategies, and fine-tuning models with proprietary data (knowledge base).
• Data Engineering Expertise: Design, build, and optimize ETL/ELT processes for large-scale data ingestion, transformation, and storage using AWS services such as AWS Glue, Amazon S3, Amazon Redshift, Amazon Athena, Amazon EKS, and Amazon EMR.
• Data Analytics: Design, build, and optimize analytical solutions for large-scale data ingestion, analytics, and insights using AWS services such as Amazon QuickSight.
• Data Governance and Security: Implement robust data governance, data quality, and security measures, ensuring compliance with relevant regulations and industry best practices for both traditional data and Generative AI applications.
• Performance Optimization: Identify and resolve performance bottlenecks in data pipelines and Generative AI workloads, ensuring efficient resource utilization and optimal response times.
• Technical Leadership: Act as a subject matter expert and provide technical guidance to data engineers, data scientists, and other team members. Mentor and educate on AWS data and Generative AI best practices.
• Collaboration: Work closely with cross-functional teams, including product owners, data scientists, and business analysts, to understand requirements and deliver impactful solutions.
• Innovation and Research: Stay up to date with the latest AWS services, data engineering trends, and advancements in Generative AI, evaluating and recommending new technologies to enhance our capabilities.
• Documentation: Create comprehensive technical documentation, including architectural diagrams, design specifications, and operational procedures.
• Cost Management: Monitor and optimize AWS infrastructure costs related to data and Generative AI workloads.

Required Skills and Qualifications:
• 12+ years of experience in data engineering, data warehousing, or big data architecture.
• 5+ years of experience in an AWS Architect role, specifically with a focus on data.
• Proven experience designing and implementing scalable data solutions on AWS.
• Strong hands-on experience with core AWS data services, including:
  o Data Storage: Amazon S3, Amazon Redshift, Amazon DynamoDB, Amazon RDS
  o Data Processing: AWS Glue, Amazon EMR, Amazon EKS, AWS Lambda, Informatica
  o Data Analytics: Amazon QuickSight, Amazon Athena, Tableau
  o Data Streaming: Amazon Kinesis, Amazon MSK
  o Data Lake: AWS Lake Formation
• Strong competencies in Generative AI, including:
  o Experience with Large Language Models (LLMs) and Foundation Models (FMs).
  o Hands-on experience with Amazon Bedrock (including model customization, agents, and orchestration).
  o Understanding and experience with Retrieval Augmented Generation (RAG) architectures and vector databases (e.g., Amazon OpenSearch Service for vector indexing).
  o Experience with prompt engineering and optimizing model responses.
  o Familiarity with Amazon SageMaker for building, training, and deploying custom ML/Generative AI models.
  o Knowledge of Amazon Q for business-specific Generative AI applications.
• Proficiency in programming languages such as Python (essential), SQL, and potentially Scala or Java.
• Experience with MLOps/GenAIOps principles and tools for deploying and managing Generative AI models in production.
• Solid understanding of data modeling, data warehousing concepts, and data lake architectures.
• Experience with CI/CD pipelines and DevOps practices on AWS.
• Excellent communication, interpersonal, and presentation skills, with the ability to articulate complex technical concepts to both technical and non-technical audiences.
• Strong problem-solving and analytical abilities.

Preferred Qualifications:
• AWS Certified Solutions Architect – Professional or AWS Certified Data Engineer – Associate/Specialty.
• Experience with other Generative AI frameworks (e.g., LangChain) or open-source LLMs.
• Familiarity with containerization technologies like Docker and Kubernetes (Amazon EKS).
• Experience with data transformation tools like Informatica and Matillion.
• Experience with data visualization tools (e.g., Amazon QuickSight, Tableau, Power BI).
• Knowledge of data governance tools like Amazon DataZone.
• Experience in a highly regulated industry (e.g., Financial Services, Healthcare).
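The RAG architecture this role calls for can be illustrated with a minimal, dependency-free sketch. The retriever below is a toy word-overlap ranker standing in for a real vector search (e.g., against Amazon OpenSearch), and the assembled prompt is what would then be sent to a foundation model via Amazon Bedrock; all document contents and function names here are invented for illustration.

```python
# Minimal RAG sketch: retrieve relevant context, then build a grounded prompt.
# The retriever is a stub; a real system would embed the query, search a
# vector index, and send the resulting prompt to an LLM.

def retrieve(query: str, documents: list[str], k: int = 2) -> list[str]:
    """Toy retriever: rank documents by word overlap with the query."""
    q_words = set(query.lower().split())
    scored = sorted(
        documents,
        key=lambda d: len(q_words & set(d.lower().split())),
        reverse=True,
    )
    return scored[:k]

def build_prompt(query: str, context: list[str]) -> str:
    """Assemble a grounded prompt so the model answers from the context."""
    ctx = "\n".join(f"- {c}" for c in context)
    return (
        "Answer using only the context below.\n"
        f"Context:\n{ctx}\n"
        f"Question: {query}\n"
    )

docs = [
    "Amazon Redshift is a cloud data warehouse.",
    "AWS Glue is a serverless ETL service.",
    "Amazon S3 stores objects durably.",
]
query = "What is AWS Glue?"
prompt = build_prompt(query, retrieve(query, docs))
```

The same retrieve-then-prompt shape holds whether the knowledge base is three strings or millions of embedded chunks; only the retriever implementation changes.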
Posted 2 days ago
10.0 years
0 Lacs
India
Remote
Job description

#hiring #SeniorBackendDeveloper
Min Experience: 10+ Years
Location: Remote

We are seeking a highly experienced Technical Lead with over 10 years of experience, including at least 2 years in a leadership role, to guide and mentor a dynamic engineering team. This role is critical to designing, developing, and optimizing high-performance, scalable, and reliable backend systems. The ideal candidate will have deep expertise in Python (Flask), AWS (Lambda, Redshift, Glue, S3), microservices, and database optimization (SQL, RDBMS). We operate in a high-performance environment, comparable to leading product companies, where uptime, defect reduction, and data clarity are paramount. As a Technical Lead, you will ensure engineering excellence, maintain high-quality standards, and drive innovation in software architecture and development.

Key Responsibilities:
· Own backend architecture and lead the development of scalable, efficient web applications and microservices.
· Ensure production-grade AWS deployment and maintenance with high availability, cost optimization, and security best practices.
· Design and optimize databases (RDBMS, SQL) for performance, scalability, and reliability.
· Lead API and microservices development, ensuring seamless integration, scalability, and maintainability.
· Implement high-performance solutions, emphasizing low latency, uptime, and data accuracy.
· Mentor and guide developers, fostering a culture of collaboration, disciplined coding, and technical excellence.
· Conduct technical reviews, enforce best coding practices, and ensure adherence to security and compliance standards.
· Drive automation and CI/CD pipelines to enhance deployment efficiency and reduce operational overhead.
· Communicate technical concepts effectively to technical and non-technical stakeholders.
· Provide accurate work estimations and align development efforts with broader business objectives.

Key Skills:
· Programming: Strong expertise in Python (Flask) and Celery.
· AWS: Core experience with Lambda, Redshift, Glue, S3, and production-level deployment strategies.
· Microservices & API Development: Deep understanding of architecture, service discovery, API gateway design, observability, and distributed systems best practices.
· Database Optimization: Expertise in SQL, PostgreSQL, Amazon Aurora (RDS), and performance tuning.
· CI/CD & Infrastructure: Experience with GitHub Actions, GitLab CI/CD, Docker, Kubernetes, Terraform, and CloudFormation.
· Monitoring & Logging: Familiarity with AWS CloudWatch, the ELK Stack, and Prometheus.
· Security & Compliance: Knowledge of backend security best practices and performance optimization.
· Collaboration & Communication: Ability to articulate complex technical concepts to international stakeholders and work seamlessly in Agile/Scrum environments.

📩 Apply now or refer someone great. Please send your updated resume to hr.team@kpitechservices.com

#PythonJob #jobs #BackendDeveloper
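The emphasis on low latency and uptime in backend roles like this usually implies defensive patterns such as retries with exponential backoff and jitter when calling downstream services. A minimal sketch of that pattern, assuming a hypothetical flaky dependency and an injectable `sleep` so the demo runs instantly:

```python
# Retry with exponential backoff + jitter: wait roughly base * 2^attempt
# (capped) between attempts, randomized so many clients don't retry in sync.
import random
import time

def call_with_retry(func, retries=4, base=0.1, cap=2.0, sleep=time.sleep):
    """Call func(); on exception, back off and retry; re-raise if all attempts fail."""
    for attempt in range(retries):
        try:
            return func()
        except Exception:
            if attempt == retries - 1:
                raise
            delay = min(cap, base * (2 ** attempt)) * random.uniform(0.5, 1.0)
            sleep(delay)

# Hypothetical downstream service that fails twice, then succeeds.
calls = {"n": 0}
def flaky():
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("transient")
    return "ok"

result = call_with_retry(flaky, sleep=lambda _: None)  # no real waiting in the demo
```

In production the injected `sleep` would be the real `time.sleep` (or an async equivalent), and the exception filter would be narrowed to genuinely transient errors.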
Posted 2 days ago
10.0 - 14.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
What Gramener offers you

Gramener offers an inviting workplace, talented colleagues from diverse backgrounds, a clear career path, and steady growth prospects with great scope to innovate. Our goal is to create an ecosystem of easily configurable data applications focused on storytelling for public and private use.

Cloud Lead – Analytics & Data Products

We’re looking for a Cloud Architect/Lead to design, build, and manage scalable AWS infrastructure that powers our analytics and data product initiatives. This role focuses on automating infrastructure provisioning, application/API hosting, and enabling data and GenAI workloads through a modern, secure cloud environment.

Roles and Responsibilities
Design and provision AWS infrastructure using Terraform or AWS CloudFormation to support evolving data product needs.
Develop and manage CI/CD pipelines using Jenkins, AWS CodePipeline, CodeBuild, or GitHub Actions.
Deploy and host internal tools, APIs, and applications using ECS, EKS, Lambda, API Gateway, and ELB.
Provision and support analytics and data platforms using S3, Glue, Redshift, Athena, Lake Formation, and orchestration tools like Step Functions or Apache Airflow (MWAA).
Implement cloud security, networking, and compliance using IAM, VPC, KMS, CloudWatch, CloudTrail, and AWS Config.
Collaborate with data engineers, ML engineers, and analytics teams to align infrastructure with application and data product requirements.
Support GenAI infrastructure, including Amazon Bedrock, SageMaker, or integrations with APIs like OpenAI.

Skills and Qualifications:
10-14 years of experience in cloud engineering, DevOps, or cloud architecture roles.
Hands-on expertise with the AWS ecosystem and tools listed above.
Proficiency in scripting (e.g., Python, Bash) and infrastructure automation.
Experience deploying containerized workloads using Docker, ECS, EKS, or Fargate.
Familiarity with data engineering and GenAI workflows is a plus.
AWS certifications (e.g., Solutions Architect, DevOps Engineer) are preferred.
Posted 2 days ago
10.0 years
0 Lacs
Delhi, India
On-site
YOE: 10 to 15 years
SKILLS REQUIRED: Java, Python, HDFS, YARN, MapReduce, Hive, Kafka, Spark, Airflow, Presto, HLD, LLD, SQL, NoSQL, MongoDB, etc.
PREFERENCE: Tier 1 colleges/universities

Role & Responsibilities
Lead and mentor a team of data engineers, ensuring high performance and career growth.
Architect and optimize scalable data infrastructure, ensuring high availability and reliability.
Drive the development and implementation of data governance frameworks and best practices.
Work closely with cross-functional teams to define and execute a data roadmap.
Optimize data processing workflows for performance and cost efficiency.
Ensure data security, compliance, and quality across all data platforms.
Foster a culture of innovation and technical excellence within the data team.

Ideal Candidate
Candidates from Tier 1 colleges preferred.
MUST have experience in product startups, and should have implemented data engineering systems from an early stage in the company.
MUST have 10+ years of experience in software/data engineering, with at least 3+ years in a leadership role.
MUST have expertise in backend development with programming languages such as Java, PHP, Python, Node.js, GoLang, JavaScript, HTML, and CSS.
MUST have proficiency in SQL, Python, and Scala for data processing and analytics.
Strong understanding of cloud platforms (AWS, GCP, or Azure) and their data services.
MUST have a strong foundation and expertise in HLD and LLD, as well as design patterns, preferably using Spring Boot or Google Guice.
MUST have experience in big data technologies such as Spark, Hadoop, Kafka, and distributed computing frameworks.
Hands-on experience with data warehousing solutions such as Snowflake, Redshift, or BigQuery.
Deep knowledge of data governance, security, and compliance (GDPR, SOC 2, etc.).
Experience in NoSQL databases like Redis, Cassandra, MongoDB, and TiDB.
Familiarity with automation and DevOps tools like Jenkins, Ansible, Docker, Kubernetes, Chef, Grafana, and ELK.
Proven ability to drive technical strategy and align it with business objectives.
Strong leadership, communication, and stakeholder management skills.

Preferred Qualifications:
Experience in machine learning infrastructure or MLOps is a plus.
Exposure to real-time data processing and analytics.
Interest in data structures, algorithm analysis and design, multicore programming, and scalable architecture.
Prior experience in a SaaS or high-growth tech company.
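The MapReduce model named in the skills list can be sketched in plain Python as three phases: map emits key-value pairs, shuffle groups them by key, and reduce aggregates each group. Real workloads would run this on Hadoop or Spark; the input lines below are made up.

```python
# Toy word count in the map/shuffle/reduce style of Hadoop MapReduce.
from collections import defaultdict

def map_phase(lines):
    """Map: emit (word, 1) for every word in every input line."""
    for line in lines:
        for word in line.split():
            yield word.lower(), 1

def shuffle(pairs):
    """Shuffle: group values by key, as the framework does between phases."""
    grouped = defaultdict(list)
    for key, value in pairs:
        grouped[key].append(value)
    return grouped

def reduce_phase(grouped):
    """Reduce: sum the counts for each word."""
    return {word: sum(vals) for word, vals in grouped.items()}

counts = reduce_phase(shuffle(map_phase(["spark and kafka", "spark and hive"])))
```

The distributed versions differ mainly in that the map and reduce phases run in parallel across workers and the shuffle moves data over the network.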
Posted 2 days ago
6.0 - 8.0 years
0 Lacs
Greater Kolkata Area
On-site
At Cision, we believe in empowering every individual to make an impact. Here, your voice is heard, your ideas are valued, and your unique perspective fuels our collective success. As part of our global team, you'll thrive in an environment that champions curiosity, collaboration, and innovation, all while making meaningful contributions to the brands we accelerate. Join us in shaping the future of communication and building authentic connections that matter. Whether you're solving complex problems or driving bold innovations, your growth is our success, and together, we’ll create the conversations of tomorrow. Empower your impact at Cision. Be seen, be understood, be you.

Job Summary
The Manager, Business Intelligence participates in the development of a data strategy to quickly cultivate a data-driven culture across the organization and to optimize our business performance by identifying growth opportunities and highlighting areas for improvement. The role will proactively communicate with stakeholders, team members, and partners to support a high-performing team responsible for providing sales intelligence and data visualizations by leveraging business intelligence tools.

Essential Duties and Responsibilities
Coordinate and align priorities with the organization's strategic goals, partnering with business leadership to identify data and analytical needs via Value Approval.
Deliver a business intelligence strategy that combines data visualization to enable profitable, data-driven decisions.
As the backbone of all things BI, establish and maintain high data integrity, quality, and governance standards. Ensure data management practices are in place to support accurate and reliable data analysis.
Develop dashboards that provide up-to-date information to sales leaders and sales associates on KPIs and other business objectives and goals.
Translate intricate datasets into intuitive and insightful visualizations that drive data-based decision-making across the organization.
Distill insights from data and communicate recommendations to business customers.
Oversee the selection, implementation, and management of BI tools and technologies.
Lead the creation and maintenance of reports, dashboards, and other data visualizations.
Translate raw data into visual contexts that are easy for business customers to interpret.
Oversee BI projects and enhancements from inception to completion, ensuring they are delivered on time and within budget.
Present data insights to stakeholders and business leaders clearly and in a relatable way.
Influence key decisions that affect business outcomes.
Maintain an accurate data portfolio that includes high-quality dashboards and data models.
Mentor and upskill team members, including data analysts, system admins, and BI developers.
Participate in the exploration and evaluation of emerging reporting tools, technologies, and methodologies to drive innovation and leverage best practices to advance the organization's BI capabilities.

Minimum Required Qualifications
Bachelor’s degree in Computer Science, Information Systems, Business Administration, or a related field. A master’s degree or an MBA can be advantageous.
6-8 years of experience in data management or visualization.
6-8 years of experience in a high-functioning, fast-paced work environment with strong business acumen.
3-5 years of people leadership experience with high social intelligence.
Ability to manage and mentor junior data analysts and BI developers.
Proven expertise in executing data management, reporting, and visualization in Domo; secondarily, Power BI and Tableau.
Experience with Amazon Redshift and dbt desired.
Proficient in the Microsoft Office Suite.
Knowledge of complex data integration from multiple data sources.
Experience with statistics and probability.
Excellent verbal and written communication skills to translate complex data into easy-to-understand, practical terms.
Deep understanding of data governance, compliance, and privacy best practices.
Agility in responding to changing priorities and situations.
High attention to detail and accuracy.

To be successful in this role, candidates must have demonstrated experience in organizing data in a way that allows business leaders to make informed decisions and reach their full potential by leveraging timely and accurate data.

As a global leader in PR, marketing, and social media management technology and intelligence, Cision helps brands and organizations to identify, connect, and engage with customers and stakeholders to drive business results. PR Newswire, a network of over 1.1 billion influencers, in-depth monitoring, analytics, and its Brandwatch and Falcon.io social media platforms headline a premier suite of solutions. Cision has offices in 24 countries throughout the Americas, EMEA, and APAC. For more information about Cision's award-winning solutions, including its next-gen Cision Communications Cloud®, visit www.cision.com and follow @Cision on Twitter.

Cision is committed to fostering an inclusive environment where all employees can be their authentic selves and perform at their best. We believe diversity, equity, and inclusion are vital to driving our culture, sparking innovation, and achieving long-term success. Cision is proud to have joined more than 600 companies in signing the CEO Action for Diversity & Inclusion™ pledge and to have been named a “Top Diversity Employer” for 2021 by DiversityJobs.com.

Cision is proud to be an equal opportunity employer, seeking to create a welcoming and diverse environment. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, gender identity or expression, sexual orientation, national origin, genetics, disability, age, veteran status, or other protected statuses.

Cision is committed to the full inclusion of all qualified individuals. In keeping with our commitment, Cision will take steps to ensure that people with disabilities are provided reasonable accommodations. Accordingly, if a reasonable accommodation is required to fully participate in the job application or interview process, to perform the essential functions of the position, and/or to receive all other benefits and privileges of employment, please contact hr.support@cision.com.

Please review our Global Candidate Data Privacy Statement to learn about Cision’s commitment to protecting personal data collected during the hiring process.
Posted 2 days ago
10.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
MicroStrategy Senior Developer

Experience: 7-10 years

Description: 7-10 years of experience in designing and building BI solutions. Must have expertise in MicroStrategy Desktop/Web/Server, strong SQL and data modeling skills, and a working knowledge of AWS Redshift functions. Experience in dashboard/report development, data integration, and performance tuning is essential.

Key Skills:
MicroStrategy (Desktop, Web, Intelligence Server, Mobile)
SQL, dimensional data modeling, data integration
Report and dashboard development, performance optimization
AWS Redshift (functions, integration)
Strong analytical and communication skills

Preferred:
Experience with Power BI and the ability to switch between tools
MicroStrategy certifications
Posted 2 days ago
7.0 - 10.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
We’re looking for a Cloud Architect / Lead to design, build, and manage scalable AWS infrastructure that powers our analytics and data product initiatives. This role focuses on automating infrastructure provisioning, application/API hosting, and enabling data and GenAI workloads through a modern, secure cloud environment.

Key Responsibilities
Design and provision AWS infrastructure using Terraform or AWS CloudFormation to support evolving data product needs.
Develop and manage CI/CD pipelines using Jenkins, AWS CodePipeline, CodeBuild, or GitHub Actions.
Deploy and host internal tools, APIs, and applications using ECS, EKS, Lambda, API Gateway, and ELB.
Provision and support analytics and data platforms using S3, Glue, Redshift, Athena, Lake Formation, and orchestration tools like Step Functions or Apache Airflow (MWAA).
Implement cloud security, networking, and compliance using IAM, VPC, KMS, CloudWatch, CloudTrail, and AWS Config.
Collaborate with data engineers, ML engineers, and analytics teams to align infrastructure with application and data product requirements.
Support GenAI infrastructure, including Amazon Bedrock, SageMaker, or integrations with APIs like OpenAI.

Requirements
7-10 years of experience in cloud engineering, DevOps, or cloud architecture roles.
Strong hands-on expertise with the AWS ecosystem and tools listed above.
Proficiency in scripting (e.g., Python, Bash) and infrastructure automation.
Experience deploying containerized workloads using Docker, ECS, EKS, or Fargate.
Familiarity with data engineering and GenAI workflows is a plus.
AWS certifications (e.g., Solutions Architect, DevOps Engineer) are preferred.
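As a rough illustration of the infrastructure-as-code responsibility above, the sketch below assembles a minimal CloudFormation template for a versioned S3 bucket as a Python dict. The logical ID is invented, and a production template would add encryption, access policies, and tags; in practice Terraform or raw CloudFormation YAML would more often be used directly.

```python
# Build a minimal CloudFormation template programmatically. The JSON string
# produced is the kind of template body you would hand to a CreateStack call.
import json

def s3_bucket_template(bucket_logical_id: str) -> dict:
    return {
        "AWSTemplateFormatVersion": "2010-09-09",
        "Resources": {
            bucket_logical_id: {
                "Type": "AWS::S3::Bucket",
                "Properties": {
                    # Versioning protects analytics data against overwrites.
                    "VersioningConfiguration": {"Status": "Enabled"}
                },
            }
        },
    }

template = s3_bucket_template("AnalyticsBucket")  # logical ID is illustrative
body = json.dumps(template, indent=2)
```

Generating templates from code like this keeps them reviewable in version control, which is the core of the CI/CD-driven provisioning workflow described in the posting.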
Posted 2 days ago
6.0 years
0 Lacs
Udaipur, Rajasthan, India
On-site
Role: Senior Data Engineer
Experience: 4-6 years
Location: Udaipur, Jaipur

Job Description:
We are looking for a highly skilled and experienced Data Engineer with 4-6 years of hands-on experience in designing and implementing robust, scalable data pipelines and infrastructure. The ideal candidate will be proficient in SQL and Python and have a strong understanding of modern data engineering practices. You will play a key role in building and optimizing data systems, enabling data accessibility and analytics across the organization, and collaborating closely with cross-functional teams including Data Science, Product, and Engineering.

Key Responsibilities:
Design, develop, and maintain scalable ETL/ELT data pipelines using SQL and Python.
Collaborate with data analysts, data scientists, and product teams to understand data needs.
Optimize queries and data models for performance and reliability.
Integrate data from various sources, including APIs, internal databases, and third-party systems.
Monitor and troubleshoot data pipelines to ensure data quality and integrity.
Document processes, data flows, and system architecture.
Participate in code reviews and contribute to a culture of continuous improvement.

Required Skills:
4-6 years of experience in data engineering, data architecture, or backend development with a focus on data.
Strong command of SQL for data transformation and performance tuning.
Experience with Python (e.g., pandas, Spark, ADF).
Solid understanding of ETL/ELT processes and data pipeline orchestration.
Proficiency with RDBMS (e.g., PostgreSQL, MySQL, SQL Server).
Experience with data warehousing solutions (e.g., Snowflake, Redshift, BigQuery).
Familiarity with version control (Git), CI/CD workflows, and containerized environments (Docker, Kubernetes).
Basic programming skills.
Excellent problem-solving skills and a passion for clean, efficient data systems.

Preferred Skills:
Experience with cloud platforms (AWS, Azure, GCP) and services like S3, Glue, Dataflow, etc.
Exposure to enterprise solutions (e.g., Databricks, Synapse).
Knowledge of big data technologies (e.g., Spark, Kafka, Hadoop).
Background in real-time data streaming and event-driven architectures.
Understanding of data governance, security, and compliance best practices.
Prior experience working in an agile development environment.

Educational Qualifications:
Bachelor's degree in Computer Science, Information Technology, or a related field.

Visit us:
https://kadellabs.com/
https://in.linkedin.com/company/kadel-labs
https://www.glassdoor.co.in/Overview/Working-at-Kadel-Labs-EI_IE4991279.11,21.htm
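The ETL/ELT pipelines described above follow an extract-transform-load shape that can be sketched in plain Python. The sources, field names, and cleaning rules here are invented for illustration; a real pipeline would read from APIs or databases and load into a warehouse such as Redshift or Snowflake.

```python
# Minimal ETL sketch: extract raw rows, transform (validate + normalize),
# and load the clean rows into an in-memory stand-in for a warehouse table.

def extract():
    # In practice: API calls, database reads, or S3 files.
    return [
        {"user_id": "1", "amount": "10.50", "country": "in"},
        {"user_id": "2", "amount": "n/a", "country": "IN"},  # bad record
    ]

def transform(rows):
    clean = []
    for row in rows:
        try:
            amount = float(row["amount"])
        except ValueError:
            continue  # a real pipeline would route this to a dead-letter store
        clean.append({
            "user_id": int(row["user_id"]),
            "amount": amount,
            "country": row["country"].upper(),  # normalize country codes
        })
    return clean

def load(rows, warehouse):
    warehouse.extend(rows)

warehouse = []
load(transform(extract()), warehouse)
```

Keeping the three phases as separate functions is what makes pipelines like this easy to unit test and to monitor for data quality, as the responsibilities above require.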
Posted 2 days ago
6.0 years
0 Lacs
Jaipur, Rajasthan, India
On-site
JOB DESCRIPTION: DATA ENGINEER (Databricks & AWS)

Overview:
As a Data Engineer, you will work with multiple teams to deliver solutions on the AWS Cloud using core cloud data engineering tools such as Databricks on AWS, AWS Glue, Amazon Redshift, Athena, and other Big Data-related technologies. This role focuses on building the next generation of application-level data platforms and improving recent implementations. Hands-on experience with Apache Spark (PySpark, SparkSQL), Delta Lake, Iceberg, and Databricks is essential.

Locations: Jaipur, Pune, Hyderabad, Bangalore, Noida.

Responsibilities:
• Define, design, develop, and test software components/applications using AWS-native data services: Databricks on AWS, AWS Glue, Amazon S3, Amazon Redshift, Athena, AWS Lambda, and Secrets Manager.
• Build and maintain ETL/ELT pipelines for both batch and streaming data.
• Work with structured and unstructured datasets at scale.
• Apply data modeling principles and advanced SQL techniques.
• Implement and manage pipelines using Apache Spark (PySpark, SparkSQL) and Delta Lake/Iceberg formats.
• Collaborate with product teams to understand requirements and deliver optimized data solutions.
• Utilize CI/CD pipelines with DBX and AWS for continuous delivery and deployment of Databricks code.
• Work independently with minimal supervision and strong ownership of deliverables.

Must Have:
• 6+ years of experience in Data Engineering on AWS Cloud.
• Hands-on expertise in:
  o Apache Spark (PySpark, SparkSQL)
  o Delta Lake / Iceberg formats
  o Databricks on AWS
  o AWS Glue, Amazon Athena, Amazon Redshift
• Strong SQL skills and performance tuning experience on large datasets.
• Good understanding of CI/CD pipelines, especially using DBX and AWS tools.
• Experience with environment setup, cluster management, user roles, and authentication in Databricks.
• Databricks Certified Data Engineer – Professional certification (mandatory).

Good To Have:
• Experience migrating ETL pipelines from on-premise or other clouds to AWS Databricks.
• Experience with Databricks ML or Spark 3.x upgrades.
• Familiarity with Airflow, Step Functions, or other orchestration tools.
• Experience integrating Databricks with AWS services in a secured, production-ready environment.
• Experience with monitoring and cost optimization in AWS.

Key Skills:
• Languages: Python, SQL, PySpark
• Big Data Tools: Apache Spark, Delta Lake, Iceberg
• Databricks on AWS
• AWS Services: AWS Glue, Athena, Redshift, Lambda, S3, Secrets Manager
• Version Control & CI/CD: Git, DBX, AWS CodePipeline/CodeBuild
• Other: Data Modeling, ETL Methodology, Performance Optimization
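Pipelines on Delta Lake or Iceberg, as described above, commonly rely on MERGE (upsert) semantics: when a key matches an existing row, update it; when it does not, insert it. A pure-Python sketch of that behavior, with invented keys and fields (in Spark SQL this would be a `MERGE INTO target USING updates ON target.id = updates.id` statement):

```python
# Upsert ("MERGE") semantics over an in-memory table keyed by id.

def merge_upsert(target: dict, updates: list, key: str = "id") -> dict:
    """When matched on `key`, replace the row; when not matched, insert it."""
    merged = dict(target)
    for row in updates:
        merged[row[key]] = row
    return merged

target = {
    1: {"id": 1, "status": "old"},
    2: {"id": 2, "status": "old"},
}
updates = [
    {"id": 2, "status": "new"},  # matches -> update
    {"id": 3, "status": "new"},  # no match -> insert
]
result = merge_upsert(target, updates)
```

Delta Lake and Iceberg implement the same logic transactionally over files in object storage, which is what makes incremental batch and streaming loads safe without full rewrites.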
Posted 2 days ago
6.0 years
0 Lacs
India
On-site
A Senior Software Engineer (Python) is a professional with extensive experience and expertise in the Python programming language, responsible for designing, developing, and maintaining high-quality software applications using Python and related frameworks.

Role and Responsibilities:
· R&D on new technologies to solve problems in an innovative product.
· Build new, efficient Python libraries rather than relying on existing ones.
· Comfortable working in start-up mode.
· Help build a SaaS-based Data Fabric and Data Mesh platform that allows companies to seamlessly access, integrate, model, analyze, provision, and monetize data.
· Build the product from scratch.
· Design, develop, and maintain robust and scalable software applications using Python and related technologies.
· Collaborate with cross-functional teams, including product managers, designers, and other developers, to define project requirements and deliver high-quality solutions.
· Conduct code reviews and ensure adherence to coding best practices and established development standards.
· Analyze and optimize application performance, identifying and resolving bottlenecks and inefficiencies.
· Stay up to date with industry trends and advancements in Python and related technologies, recommending and implementing new tools and frameworks as appropriate.
· Collaborate with the DevOps team to automate build, deployment, and testing processes.

Technical Skills
· 6+ years of extensive experience in the Python programming language.
· Good knowledge of Python libraries such as Flask, pandas, and NumPy.
· Hands-on experience building microservices applications on AWS/GCP/Azure.
· Strong knowledge of software development principles, best practices, and design patterns.
· Good understanding of platforms (Kubernetes, Docker, AWS).
· Must have solved problems using complex algorithms and data structures.
· Good understanding of data structures, algorithms, and databases like PostgreSQL.
· Knowledge of Redshift or Snowflake is an added advantage.
· Experience writing APIs and related technologies like REST and JSON.
· Design, implement, and deploy scalable backend applications on the cloud.
· Experience with version control systems, such as Git, and collaborative development workflows.

Soft Skills
· Solid understanding of software testing methodologies and experience with unit testing and integration testing.
· Excellent problem-solving and analytical skills, with the ability to quickly identify and resolve issues.
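As a small illustration of the REST/JSON skills listed above, the sketch below serializes a typed record to a JSON payload and parses it back using only the standard library. The `Dataset` type and its fields are invented for this example; a real API would wrap this in a Flask route or similar handler.

```python
# Round-trip a typed record through a JSON API payload.
import json
from dataclasses import dataclass, asdict

@dataclass
class Dataset:
    name: str
    rows: int
    tags: list

def to_payload(ds: Dataset) -> str:
    """Serialize the record to a JSON string for an API response."""
    return json.dumps(asdict(ds))

def from_payload(raw: str) -> Dataset:
    """Parse an incoming JSON body back into a typed record."""
    return Dataset(**json.loads(raw))

payload = to_payload(Dataset(name="sales", rows=42, tags=["gold"]))
restored = from_payload(payload)
```

Keeping the serialization boundary explicit like this makes request/response schemas easy to unit test, which matches the testing emphasis in the soft-skills section.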
Posted 2 days ago
12.0 - 20.0 years
0 Lacs
karnataka
On-site
You are invited to apply for the position of Executive Manager - Data Engineering, which requires 12-20 years of experience. The role is based in Bangalore, with a work mode of 3 days per week in the office.

Your primary responsibilities as a Chapter Area Lead will involve collaborating with Crew Leaders to motivate and empower the chapter to align with the squad strategy and enhance customer outcomes. Your focus will be on fostering agility within the Chapter, motivating the squad to embody future ways of working, and ensuring consistent delivery in accordance with the strategic outcomes of the practice area.

We are looking for individuals who possess a natural ability to harmonize cultural, people, and technology needs. You should excel in communication and be adept at working across boundaries, sharing challenges, and fostering a unified team spirit across different time zones, cultures, and work methodologies.

Your role will also involve supporting Crew Leads with delivery, defining Chapter targets, formulating workforce strategies, and establishing measurable goals. You will be expected to drive a tech culture of growth and continuous improvement based on data metrics. Proficiency in executive-level stakeholder management, to advocate for team ideas and inspire squads to operate with a mindset of constructive challenge towards achieving collective goals, is crucial. Identifying technology limitations and deficiencies in existing systems to develop scalable and sustainable long-term solutions will be a key aspect of your role.

In terms of skills and experience, we are actively building a best-in-class AWS-based cloud data platform to serve the data processing and access needs of the CBA group. You should have experience managing large teams (30+) and holding leadership positions within a data engineering chapter. Expertise in leading platform engineers using AWS services such as EMR, Redshift, and Glue is highly desirable. A background in financial services or a similarly regulated industry with a strong risk-oriented mindset is preferred, and experience implementing large-scale self-service enterprise data platforms is also highly desirable.
Posted 2 days ago
2.0 years
0 Lacs
Bengaluru, Karnataka
On-site
Category: Engineering | Experience: Manager | Primary Address: Bangalore, Karnataka

Overview: Voyager (94001), India, Bangalore, Karnataka. Manager, Data Engineering.

Do you love building and pioneering in the technology space? Do you enjoy solving complex business problems in a fast-paced, collaborative, inclusive, and iterative delivery environment? At Capital One, you'll be part of a big group of makers, breakers, doers and disruptors, who solve real problems and meet real customer needs. We are seeking Data Engineers who are passionate about marrying data with emerging technologies. As a Capital One Data Engineer, you’ll have the opportunity to be at the forefront of driving a major transformation within Capital One.

What You’ll Do:
- Collaborate with and across Agile teams to design, develop, test, implement, and support technical solutions in full-stack development tools and technologies
- Work with a team of developers with deep experience in machine learning, distributed microservices, and full-stack systems
- Utilize programming languages like Java, Scala, and Python, open-source RDBMS and NoSQL databases, and cloud-based data warehousing services such as Redshift and Snowflake
- Share your passion for staying on top of tech trends, experimenting with and learning new technologies, participating in internal and external technology communities, and mentoring other members of the engineering community
- Collaborate with digital product managers and deliver robust cloud-based solutions that drive powerful experiences to help millions of Americans achieve financial empowerment
- Perform unit tests and conduct reviews with other team members to make sure your code is rigorously designed, elegantly coded, and effectively tuned for performance

Basic Qualifications:
- Bachelor’s degree
- At least 4 years of experience in application development (internship experience does not apply)
- At least 2 years of experience in big data technologies
- At least 1 year of experience with cloud computing (AWS, Microsoft Azure, Google Cloud)
- At least 2 years of people management experience

Preferred Qualifications:
- 7+ years of experience in application development including Python, SQL, Scala, or Java
- 4+ years of experience with a public cloud (AWS, Microsoft Azure, Google Cloud)
- 4+ years of experience with distributed data/computing tools (MapReduce, Hadoop, Hive, EMR, Kafka, Spark, Gurobi, or MySQL)
- 4+ years of experience working on real-time data and streaming applications
- 4+ years of experience with NoSQL implementations (Mongo, Cassandra)
- 4+ years of data warehousing experience (Redshift or Snowflake)
- 4+ years of experience with UNIX/Linux, including basic commands and shell scripting
- 2+ years of experience with Agile engineering practices

At this time, Capital One will not sponsor a new applicant for employment authorization for this position. No agencies please. Capital One is an equal opportunity employer (EOE, including disability/vet) committed to non-discrimination in compliance with applicable federal, state, and local laws. Capital One promotes a drug-free workplace. Capital One will consider for employment qualified applicants with a criminal history in a manner consistent with the requirements of applicable laws regarding criminal background inquiries, including, to the extent applicable, Article 23-A of the New York Correction Law; San Francisco, California Police Code Article 49, Sections 4901-4920; New York City’s Fair Chance Act; Philadelphia’s Fair Criminal Records Screening Act; and other applicable federal, state, and local laws and regulations regarding criminal background inquiries. If you have visited our website in search of information on employment opportunities or to apply for a position, and you require an accommodation, please contact Capital One Recruiting at 1-800-304-9102 or via email at RecruitingAccommodation@capitalone.com.
All information you provide will be kept confidential and will be used only to the extent required to provide needed reasonable accommodations. For technical support or questions about Capital One's recruiting process, please send an email to Careers@capitalone.com. Capital One does not provide, endorse nor guarantee and is not liable for third-party products, services, educational tools or other information available through this site. Capital One Financial is made up of several different entities. Please note that any position posted in Canada is for Capital One Canada, any position posted in the United Kingdom is for Capital One Europe and any position posted in the Philippines is for Capital One Philippines Service Corp. (COPSSC).

How We Hire
We take finding great coworkers pretty seriously.
Step 1: Apply. It only takes a few minutes to complete our application and assessment.
Step 2: Screen and Schedule. If your application is a good match you’ll hear from one of our recruiters to set up a screening interview.
Step 3: Interview(s). Now’s your chance to learn about the job, show us who you are, share why you would be a great addition to the team and determine if Capital One is the place for you.
Step 4: Decision. The team will discuss — if it’s a good fit for us and you, we’ll make it official!

How to Pick the Perfect Career Opportunity
Overwhelmed by a tough career choice? Read these tips from Devon Rollins, Senior Director of Cyber Intelligence, to help you accept the right offer with confidence.

Your wellbeing is our priority
Our benefits and total compensation package is designed for the whole person.
Caring for both you and your family.
Healthy Body, Healthy Mind: You have options and we have the tools to help you decide which health plans best fit your needs.
Save Money, Make Money: Secure your present, plan for your future and reduce expenses along the way.
Time, Family and Advice: Options for your time, opportunities for your family, and advice along the way. It’s time to BeWell.
Career Journey: Here’s how the team fits together. We’re big on growth and knowing who and how coworkers can best support you.
Posted 2 days ago
1.0 - 5.0 years
0 Lacs
Bengaluru, Karnataka
On-site
Category: Engineering | Experience: Associate | Primary Address: Bangalore, Karnataka

Overview: Voyager (94001), India, Bangalore, Karnataka. Associate Data Engineer.

Do you love building and pioneering in the technology space? Do you enjoy solving complex business problems in a fast-paced, collaborative, inclusive, and iterative delivery environment? At Capital One, you'll be part of a big group of makers, breakers, doers and disruptors, who solve real problems and meet real customer needs. We are seeking Data Engineers who are passionate about marrying data with emerging technologies. As a Capital One Data Engineer, you’ll have the opportunity to be at the forefront of driving a major transformation within Capital One.

What You’ll Do:
- Collaborate with and across Agile teams to design, develop, test, implement, and support technical solutions in full-stack development tools and technologies
- Work with a team of developers with deep experience in machine learning, distributed microservices, and full-stack systems
- Utilize programming languages like Java, Scala, and Python, open-source RDBMS and NoSQL databases, and cloud-based data warehousing services such as Redshift and Snowflake
- Share your passion for staying on top of tech trends, experimenting with and learning new technologies, participating in internal and external technology communities, and mentoring other members of the engineering community
- Collaborate with digital product managers and deliver robust cloud-based solutions that drive powerful experiences to help millions of Americans achieve financial empowerment
- Perform unit tests and conduct reviews with other team members to make sure your code is rigorously designed, elegantly coded, and effectively tuned for performance

Basic Qualifications:
- Bachelor’s degree
- At least 1.5 years of experience in application development (internship experience does not apply)
- At least 1 year of experience in big data technologies

Preferred Qualifications:
- 3+ years of experience in application development including Python, SQL, Scala, or Java
- 1+ years of experience with a public cloud (AWS, Microsoft Azure, Google Cloud)
- 2+ years of experience with distributed data/computing tools (MapReduce, Hadoop, Hive, EMR, Kafka, Spark, Gurobi, or MySQL)
- 1+ years of experience working on real-time data and streaming applications
- 1+ years of experience with NoSQL implementations (Mongo, Cassandra)
- 1+ years of data warehousing experience (Redshift or Snowflake)
- 2+ years of experience with UNIX/Linux, including basic commands and shell scripting
- 1+ years of experience with Agile engineering practices

At this time, Capital One will not sponsor a new applicant for employment authorization for this position. No agencies please. Capital One is an equal opportunity employer (EOE, including disability/vet) committed to non-discrimination in compliance with applicable federal, state, and local laws. Capital One promotes a drug-free workplace. Capital One will consider for employment qualified applicants with a criminal history in a manner consistent with the requirements of applicable laws regarding criminal background inquiries, including, to the extent applicable, Article 23-A of the New York Correction Law; San Francisco, California Police Code Article 49, Sections 4901-4920; New York City’s Fair Chance Act; Philadelphia’s Fair Criminal Records Screening Act; and other applicable federal, state, and local laws and regulations regarding criminal background inquiries. If you have visited our website in search of information on employment opportunities or to apply for a position, and you require an accommodation, please contact Capital One Recruiting at 1-800-304-9102 or via email at RecruitingAccommodation@capitalone.com. All information you provide will be kept confidential and will be used only to the extent required to provide needed reasonable accommodations.
For technical support or questions about Capital One's recruiting process, please send an email to Careers@capitalone.com. Capital One does not provide, endorse nor guarantee and is not liable for third-party products, services, educational tools or other information available through this site. Capital One Financial is made up of several different entities. Please note that any position posted in Canada is for Capital One Canada, any position posted in the United Kingdom is for Capital One Europe and any position posted in the Philippines is for Capital One Philippines Service Corp. (COPSSC).
Posted 2 days ago
5.0 - 9.0 years
0 Lacs
chennai, tamil nadu
On-site
As a Consultant - Data Engineer at AstraZeneca, you will have the opportunity to contribute to the discovery, development, and commercialization of life-changing medicines by enhancing data platforms built on AWS services. Located at Chennai GITC, you will collaborate with experienced engineers to design and implement efficient data products, supporting data platform initiatives with a focus on impacting patients and saving lives.

Your key accountabilities as a Data Engineer will include:

Technical Expertise:
- Designing, developing, and implementing scalable processes to extract, transform, and load data from various sources into data warehouses.
- Demonstrating expert understanding of AstraZeneca's implementation of data products, managing SQL queries and procedures for optimal performance.
- Providing support on production issues and enhancements through JIRA.

Quality Engineering Standards:
- Monitoring and optimizing data pipelines, troubleshooting issues, and maintaining quality standards in design, code, and data models.
- Offering detailed analysis and documentation of processes and flows as needed.

Collaboration:
- Working closely with data engineers to understand data sources, transformations, and dependencies thoroughly.
- Collaborating with cross-functional teams to ensure seamless data integration and reliability.

Innovation and Process Improvement:
- Driving the adoption of new technologies and tools to enhance data engineering processes and efficiency.
- Recommending and implementing enhancements to improve the reliability, efficiency, and quality of data processing pipelines.

To be successful in this role, you should have:
- A Bachelor's degree in Computer Science, Information Technology, or a related field.
- Strong experience with SQL, data warehousing, and building ETL pipelines.
- Proficiency in working with column-oriented databases such as Redshift, Cassandra, and BigQuery.
- Deep SQL knowledge for data extraction, transformation, and reporting.
- Excellent communication skills for effective collaboration with technical and non-technical stakeholders.
- Strong analytical skills to troubleshoot and deliver solutions in complex data environments.
- Experience with Agile development techniques and methodologies.

Desirable skills and experience include knowledge of Databricks or Snowflake, proficiency in scripting and programming languages like Python, experience with reporting tools such as Power BI, and prior experience in pharmaceutical or healthcare industry IT environments.

Join AstraZeneca's dynamic team to drive cross-company change and disrupt the industry while making a direct impact on patients through innovative data solutions and technologies. Apply now to be part of our ambitious journey towards becoming a digital and data-led enterprise.
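The ETL accountabilities above follow the standard extract-transform-load shape. A minimal sketch in Python, using the standard library's sqlite3 in place of a real warehouse such as Redshift (the table and column names here are hypothetical illustrations, not AstraZeneca's):

```python
import sqlite3

def run_etl(conn):
    """Minimal ETL: extract raw orders, transform them, load a reporting table."""
    cur = conn.cursor()
    # Extract: pull raw rows from the source table.
    rows = cur.execute(
        "SELECT order_id, customer, amount FROM raw_orders"
    ).fetchall()
    # Transform: drop invalid amounts and normalise customer names.
    cleaned = [
        (oid, customer.strip().upper(), round(amount, 2))
        for oid, customer, amount in rows
        if amount is not None and amount > 0
    ]
    # Load: write the transformed rows into the target table.
    cur.executemany(
        "INSERT INTO orders_clean (order_id, customer, amount) VALUES (?, ?, ?)",
        cleaned,
    )
    conn.commit()
    return len(cleaned)

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE raw_orders (order_id INTEGER, customer TEXT, amount REAL)")
conn.execute("CREATE TABLE orders_clean (order_id INTEGER, customer TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO raw_orders VALUES (?, ?, ?)",
    [(1, " acme ", 10.0), (2, "globex", -5.0), (3, "initech", 7.509)],
)
loaded = run_etl(conn)
```

In a production pipeline the same three stages would be expressed against the warehouse's own SQL engine and an orchestrator, but the validate-then-load structure is the part that carries over.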
Posted 2 days ago
6.0 - 10.0 years
0 Lacs
pune, maharashtra
On-site
As a Senior Data Engineer at our Pune location, you will play a critical role in designing, developing, and maintaining scalable data pipelines and architectures using Databricks on Azure or AWS cloud platforms. With 6 to 9 years of experience in the field, you will collaborate with stakeholders to integrate large datasets, optimize performance, implement ETL/ELT processes, ensure data governance, and work closely with cross-functional teams to deliver accurate solutions.

Your responsibilities will include building, maintaining, and optimizing data workflows, integrating datasets from various sources, tuning pipelines for performance and scalability, implementing ETL/ELT processes using Spark and Databricks, ensuring data governance, collaborating with different teams, documenting data pipelines, and developing automated processes for continuous integration and deployment of data solutions.

To excel in this role, you should have 6 to 9 years of hands-on experience as a Data Engineer; expertise in Apache Spark, Delta Lake, and Azure/AWS Databricks; proficiency in Python, Scala, or Java; advanced SQL skills; and experience with cloud data platforms, data warehousing solutions, data modeling, ETL tools, version control systems, and automation tools. Additionally, soft skills such as problem-solving, attention to detail, and the ability to work in a fast-paced environment are essential. Nice-to-have skills include experience with Databricks SQL and Databricks Delta, knowledge of machine learning concepts, and experience with CI/CD pipelines for data engineering solutions.

Joining our team offers challenging work with international clients, growth opportunities, a collaborative culture, and global project involvement. We provide competitive salaries, flexible work schedules, health insurance, performance-based bonuses, and other standard benefits. If you are passionate about data engineering, possess the required skills and qualifications, and thrive in a dynamic and innovative environment, we welcome you to apply for this exciting opportunity.
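Delta Lake pipelines of the kind described typically revolve around an incremental merge (upsert) step: update rows that already exist, insert the ones that don't. Stripped of Spark and Databricks specifics, the core merge logic can be sketched in plain Python (the record shapes and field names are illustrative assumptions, not any particular schema):

```python
def merge_upsert(target, updates, key="id"):
    """Merge `updates` into `target` by key: update matching rows, insert the rest."""
    # Index the existing rows by their merge key.
    by_key = {row[key]: dict(row) for row in target}
    for row in updates:
        # Overlay the incoming fields onto the existing row (or an empty one).
        by_key[row[key]] = {**by_key.get(row[key], {}), **row}
    return sorted(by_key.values(), key=lambda r: r[key])

target = [{"id": 1, "qty": 5}, {"id": 2, "qty": 3}]
updates = [{"id": 2, "qty": 9}, {"id": 3, "qty": 1}]
merged = merge_upsert(target, updates)
```

In Delta Lake the same operation would be a `MERGE INTO` over versioned Parquet files rather than an in-memory dictionary, but the match-on-key, update-or-insert semantics are identical.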
Posted 2 days ago
4.0 - 8.0 years
0 Lacs
pune, maharashtra
On-site
As a Data Quality Engineer, your primary responsibility will be to analyze business and technical requirements to design, develop, and execute comprehensive test plans for ETL pipelines and data transformations. You will perform data validation, reconciliation, and integrity checks across various data sources and target systems. Additionally, you will be expected to build and automate data quality checks using SQL and/or Python scripting. It will be your duty to identify, document, and track data quality issues, anomalies, and defects. Collaboration is key in this role, as you will work closely with data engineers, developers, QA, and business stakeholders to understand data requirements and ensure that data quality standards are met. You will define data quality KPIs and implement continuous monitoring frameworks. Participation in data model reviews and providing input on data quality considerations will also be part of your responsibilities. In case of data discrepancies, you will be expected to perform root cause analysis and work with teams to drive resolution. Ensuring alignment to data governance policies, standards, and best practices will also fall under your purview. To qualify for this position, you should hold a Bachelor's degree in Computer Science, Information Technology, or a related field. Additionally, you should have 4 to 7 years of experience as a Data Quality Engineer, ETL Tester, or a similar role. A strong understanding of ETL concepts, data warehousing principles, and relational database design is essential. Proficiency in SQL for complex querying, data profiling, and validation tasks is required. Familiarity with data quality tools, testing methodologies, and modern cloud data ecosystems (AWS, Snowflake, Apache Spark, Redshift) will be advantageous. 
Moreover, advanced knowledge of SQL, data pipeline tools like Airflow, DBT, or Informatica, and experience integrating data validation processes into CI/CD pipelines using tools like GitHub Actions, Jenkins, or similar are desired qualifications. An understanding of big data platforms, data lakes, non-relational databases, data lineage, and master data management (MDM) concepts, along with experience with Agile/Scrum development methodologies, will be beneficial for excelling in this role. Excellent analytical and problem-solving skills, together with strong attention to detail, will be valuable assets in fulfilling the responsibilities of a Data Quality Engineer.
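Automated data quality checks of the kind this role describes (null checks, uniqueness, reconciliation) are often built as small composable validators that a pipeline or CI job can run after each load. A pure-Python sketch, with a hypothetical `records` list standing in for a pipeline's output:

```python
def check_not_null(rows, field):
    """Return indices of rows where a required field is missing."""
    return [i for i, r in enumerate(rows) if r.get(field) is None]

def check_unique(rows, field):
    """Return values of `field` that appear more than once."""
    seen, dupes = set(), set()
    for r in rows:
        v = r.get(field)
        if v in seen:
            dupes.add(v)
        seen.add(v)
    return sorted(dupes)

def check_row_count(source_count, target_count):
    """Reconciliation: source and target row counts must match."""
    return source_count == target_count

records = [
    {"id": 1, "email": "a@example.com"},
    {"id": 2, "email": None},
    {"id": 2, "email": "c@example.com"},
]

null_violations = check_not_null(records, "email")  # rows missing a required email
duplicate_ids = check_unique(records, "id")         # ids seen more than once
counts_match = check_row_count(3, len(records))     # trivial source-vs-target check
```

Each check returns plain data rather than raising, so a harness can aggregate all violations into one report before failing the pipeline run.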
Posted 2 days ago
3.0 - 7.0 years
0 Lacs
karnataka
On-site
YASH Technologies is a leading technology integrator specializing in helping clients reimagine operating models, enhance competitiveness, optimize costs, foster exceptional stakeholder experiences, and drive business transformation. You will be part of a team of highly skilled professionals working with cutting-edge technologies. Our purpose is to bring real positive changes in an increasingly virtual world, transcending generational gaps and disruptions of the future.

We are seeking AWS Glue professionals with the following qualifications:
- 3 or more years of experience in AWS Glue, Redshift, and Python
- 3+ years of experience in engineering, with expertise in ETL work with cloud databases
- Proficiency in data management and data structures, including writing code for data reading, transformation, and storage
- Experience launching Spark jobs in client mode and cluster mode, with knowledge of Spark job property settings and their impact on performance
- Proficiency with source code control systems like Git
- Experience developing ELT/ETL processes for loading data from enterprise-sized RDBMS systems such as Oracle, DB2, and MySQL
- Coding proficiency in Python, or expertise in high-level languages like Java, C, or Scala
- Experience using REST APIs
- Expertise in SQL for manipulating database data, with familiarity with views, functions, stored procedures, and exception handling
- General knowledge of the AWS stack (EC2, S3, EBS), IT process compliance, SDLC experience, and formalized change controls
- Experience working in DevOps teams based on Agile principles (e.g., Scrum)
- ITIL knowledge, especially in incident, problem, and change management
- Proficiency in PySpark for distributed computation
- Familiarity with Postgres and ElasticSearch

At YASH, you will have the opportunity to build a career in an inclusive team environment. We offer career-oriented skilling models and leverage technology for continuous learning, unlearning, and relearning at a rapid pace and scale.
Our workplace is grounded in four principles:
- Flexible work arrangements, free spirit, and emotional positivity
- Agile self-determination, trust, transparency, and open collaboration
- Support for the realization of business goals
- Stable employment with a great atmosphere and an ethical corporate culture
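The client-versus-cluster distinction in the qualifications above determines where the Spark driver runs: on the submitting machine (client mode) or inside the cluster (cluster mode), which changes where driver memory is consumed and where logs land. One way to make the option concrete is a small helper that assembles a spark-submit command line (a sketch: `--deploy-mode` and the `spark.*` keys shown are standard Spark options, but the resource values and job name are arbitrary):

```python
def build_spark_submit(app, deploy_mode="client", conf=None):
    """Assemble a spark-submit command line as a list of arguments.

    In client mode the driver runs on the submitting machine; in cluster
    mode the driver runs inside the cluster.
    """
    if deploy_mode not in ("client", "cluster"):
        raise ValueError("deploy_mode must be 'client' or 'cluster'")
    cmd = ["spark-submit", "--deploy-mode", deploy_mode]
    # Each --conf pair becomes a key=value Spark property.
    for key, value in sorted((conf or {}).items()):
        cmd += ["--conf", f"{key}={value}"]
    cmd.append(app)
    return cmd

cmd = build_spark_submit(
    "etl_job.py",
    deploy_mode="cluster",
    conf={"spark.executor.memory": "4g", "spark.sql.shuffle.partitions": "200"},
)
```

Property settings such as executor memory and shuffle partition count are exactly the "job property settings" whose performance impact the posting asks about; passing them per-job via `--conf` keeps them visible in the submission rather than buried in cluster defaults.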
Posted 2 days ago
10.0 years
0 Lacs
Greater Kolkata Area
Remote
Java Back End Engineer with AWS
Location: Remote
Experience: 10+ Years
Employment Type: Full-Time

Job Overview
We are looking for a highly skilled Java Back End Engineer with strong AWS cloud experience to design and implement scalable backend systems and APIs. You will work closely with cross-functional teams to develop robust microservices, optimize database performance, and contribute across the tech stack, including infrastructure automation.

Core Responsibilities
- Design, develop, and deploy scalable microservices using Java, J2EE, Spring, and Spring Boot.
- Build and maintain secure, high-performance APIs and backend services on AWS or GCP.
- Use JUnit and Mockito to ensure test-driven development and maintain code quality.
- Develop and manage ETL workflows using tools like Pentaho, Talend, or Apache NiFi.
- Create High-Level Design (HLD) and architecture documentation for system components.
- Collaborate with cross-functional teams (DevOps, Frontend, QA) as a full-stack contributor when needed.
- Tune SQL queries and manage performance on MySQL and Amazon Redshift.
- Troubleshoot and optimize microservices for performance and scalability.
- Use Git for source control and participate in code reviews and architectural discussions.
- Automate infrastructure provisioning and CI/CD processes using Terraform, Bash, and pipelines.
Primary Skills
- Languages & Frameworks: Java (v8/17/21), Spring Boot, J2EE, Servlets, JSP, JDBC, Struts
- Architecture: Microservices, REST APIs
- Cloud Platforms: AWS (EC2, S3, Lambda, RDS, CloudFormation, SQS, SNS) or GCP
- Databases: MySQL, Redshift

Secondary Skills (Good To Have)
- Infrastructure as Code (IaC): Terraform
- Additional Languages: Python, Node.js
- Frontend Frameworks: React, Angular, JavaScript
- ETL Tools: Pentaho, Talend, Apache NiFi (or equivalent)
- CI/CD & Containers: Jenkins, GitHub Actions, Docker, Kubernetes
- Monitoring/Logging: AWS CloudWatch, DataDog
- Scripting: Bash, Shell scripting

Nice To Have
- Familiarity with agile software development practices
- Experience in a cross-functional engineering environment
- Exposure to DevOps culture and tools
Posted 2 days ago
3.0 years
0 Lacs
Ahmedabad, Gujarat, India
Remote
Job Title: BI Engineer – Amazon QuickSight Developer
Location: Onsite | Remote

Job Summary
We are seeking an experienced Amazon QuickSight Developer to join our BI team. This role requires deep expertise in designing and deploying intuitive, high-impact dashboards and managing all aspects of QuickSight administration. You'll collaborate closely with data engineers and business stakeholders to create scalable BI solutions that empower data-driven decisions across the organization.

Key Responsibilities

Dashboard Development & Visualization
- Design, develop, and maintain interactive QuickSight dashboards using advanced visuals, parameters, and controls.
- Create reusable datasets and calculated fields using both SPICE and Direct Query modes.
- Implement advanced analytics such as level-aware calculations, ranking, period-over-period comparisons, and custom KPIs.
- Build dynamic, user-driven dashboards with multi-select filters, dropdowns, and custom date ranges.
- Optimize performance and usability to maximize business value and user engagement.

QuickSight Administration
- Manage users, groups, and permissions through QuickSight and AWS IAM roles.
- Implement and maintain row-level security (RLS) to ensure appropriate data access.
- Monitor usage, SPICE capacity, and subscription resources to maintain system performance.
- Configure and maintain themes, namespaces, and user interfaces for consistent experiences.
- Work with IT/cloud teams on account-level settings and AWS integrations.

Collaboration & Data Integration
- Partner with data engineers and analysts to understand data structures and business needs.
- Integrate QuickSight with AWS services such as Redshift, Athena, S3, and Glue.
- Ensure data quality and accuracy through robust data modeling and SQL optimization.

Required Skills & Qualifications
- 3+ years of hands-on experience with Amazon QuickSight (development and administration).
- Strong SQL skills and experience working with large, complex datasets.
- Expert-level understanding of QuickSight security, RLS, SPICE management, and user/group administration.
- Strong sense of data visualization best practices and UX design principles.
- Proficiency with AWS data services including Redshift, Athena, S3, Glue, and IAM.
- Solid understanding of data modeling and business reporting frameworks.

Nice To Have
- Experience with Python, AWS Lambda, or automating QuickSight administration via the SDK or CLI.
- Familiarity with modern data stack tools (e.g., dbt, Snowflake, Tableau, Power BI).

Apply Now
If you're passionate about building scalable BI solutions and making data come alive through visualization, we'd love to hear from you!
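As context for the RLS responsibility above: QuickSight row-level security is driven by a rules dataset that maps users or groups to the column values they may see. A minimal sketch of generating such a rules file in Python follows; the group names, user, and `Region` column are illustrative assumptions, not from the posting.

```python
# Sketch: building a QuickSight-style RLS rules CSV. Each row grants one
# user or group access to rows matching the listed field values; an empty
# cell conventionally means "no restriction on that field".
import csv
import io

def build_rls_rules(rules: list[dict]) -> str:
    """Serialize RLS rules to CSV with UserName/GroupName/Region columns."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=["UserName", "GroupName", "Region"])
    writer.writeheader()
    for rule in rules:
        writer.writerow({f: rule.get(f, "") for f in writer.fieldnames})
    return buf.getvalue()

rules_csv = build_rls_rules([
    {"GroupName": "sales-emea", "Region": "EMEA"},          # group-scoped rule
    {"UserName": "analyst@example.com", "Region": "APAC"},  # user-scoped rule
])
print(rules_csv)
```

A file like this would be uploaded (or published via the SDK) as its own dataset and attached to the target dataset as its RLS permissions source.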
Posted 2 days ago
5.0 - 9.0 years
0 Lacs
karnataka
On-site
As an AWS Senior Data Engineer (SDE) at Infosys in India, you will work across a range of cloud data engineering technologies and tools. The role calls for expertise in SQL, PySpark, API endpoint ingestion, Glue, S3, Redshift, Step Functions, Lambda, CloudWatch, AppFlow, and CloudFormation, along with administrative tasks related to cloud services. You will also be expected to have knowledge of SDLF & OF frameworks and S3 ingestion patterns, plus exposure to Git, JFrog, ADO, SNOW, Visual Studio, DBeaver, and SF inspector. Your primary focus will be on leveraging these technologies to design, develop, and maintain data pipelines, ensuring efficient data processing and storage on the cloud platform. The ideal candidate should have a strong background in cloud data engineering, familiarity with AWS services, and a proactive attitude toward learning and implementing new technologies. Excellent communication skills and the ability to work effectively within a team are essential for success in this role.
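To make the "API endpoint ingestion" and "S3 ingestion patterns" items concrete: a common pattern is to flatten nested JSON from an API into newline-delimited JSON before landing it in S3, since flat NDJSON is the easiest shape for Glue and Athena to crawl. The sketch below is illustrative only (the record fields are invented), not Infosys's actual pipeline.

```python
# Sketch: flattening nested API records into NDJSON for an S3 landing zone.
import json

def flatten(record: dict, parent: str = "", sep: str = ".") -> dict:
    """Recursively flatten nested dicts into dotted keys, e.g. user.geo.country."""
    out = {}
    for key, value in record.items():
        path = f"{parent}{sep}{key}" if parent else key
        if isinstance(value, dict):
            out.update(flatten(value, path, sep))
        else:
            out[path] = value
    return out

def to_ndjson(records: list[dict]) -> str:
    """One JSON object per line -- the usual Glue/Athena-friendly layout."""
    return "\n".join(json.dumps(flatten(r), sort_keys=True) for r in records)

print(to_ndjson([{"id": 1, "user": {"name": "a", "geo": {"country": "IN"}}}]))
```

In practice the resulting string would be written to a date-partitioned S3 key (for example `raw/source=api/dt=2024-01-01/part-0.json`, a hypothetical layout) by a Lambda or Glue job.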
Posted 2 days ago
0 years
0 Lacs
Ahmedabad, Gujarat, India
On-site
About The Role
We're looking for a skilled and motivated Database Administrator (DBA) to join our team and support both on-premises SQL Server databases and cloud-based environments in AWS (RDS, Redshift). You'll be responsible for ensuring database performance, stability, and scalability. This role is ideal for someone who thrives in both traditional database environments and cloud-native architectures, enjoys problem-solving, and takes ownership of incident response and continuous improvement.

Responsibilities
- Administer, configure, and maintain Microsoft SQL Server (on-premises) and AWS RDS/Redshift environments.
- Perform regular maintenance tasks such as backups, restores, patching, and capacity planning.
- Manage database security, user access, and roles across environments.
- Provision, configure, and manage AWS RDS (SQL Server, PostgreSQL) and Redshift instances.
- Implement backup strategies, monitoring, and disaster recovery solutions in the cloud.
- Automate routine database tasks and processes using AWS tools and scripting.
- Deploy and monitor AWS Glue and AWS Lambda.
- Troubleshoot ETL job failures, ensure data quality, and support the timely delivery of data.
- Use tools like AWS CloudWatch, SolarWinds, and Redgate SQL Monitor for real-time performance tracking and alerting.
- Identify and resolve performance bottlenecks in SQL queries, indexes, and server configurations.
- Act as a point of contact for database-related incidents and outages.
- Perform root cause analysis, document findings, and work with engineering teams to implement long-term fixes.
- Maintain comprehensive and up-to-date documentation on database systems, configurations, and procedures.
- Collaborate with development and DevOps teams to support database and data platform needs.
- Contribute to automation and infrastructure improvements in cloud and hybrid environments.
- Maintain detailed documentation and knowledge base articles for internal teams.
Qualifications
- Experience as a database administrator, with a strong foundation in SQL Server administration, backup/restore strategies, and high availability solutions (e.g., Always On, clustering).
- Hands-on experience managing AWS RDS (SQL Server/PostgreSQL) and Amazon Redshift, including provisioning, scaling, backups, snapshots, and security configurations.
- Proficiency with monitoring tools like AWS CloudWatch, SolarWinds, and Redgate SQL Monitor, with the ability to configure alerts, identify trends, and proactively address performance bottlenecks.
- Expertise in performance tuning for:
  - SQL Server: execution plan analysis, indexing strategies, TempDB optimization, query tuning.
  - RDS: parameter group tuning, Performance Insights, instance sizing.
  - Redshift: WLM configuration, vacuum/analyze, distribution/sort keys, and query optimization.
- Strong understanding of database security best practices, user access controls, encryption, and auditing.
- Experience managing incident response, including root cause analysis, mitigation planning, and follow-up documentation.
- Ability to create and maintain detailed runbooks, SOPs, and knowledge base articles for repeatable processes and troubleshooting procedures.
- Comfortable working in hybrid environments, with coordination across on-premises and cloud-based systems.
- Familiarity with automation and scripting using PowerShell, Python, or Bash to streamline database tasks and monitoring.
- Hands-on experience with CI/CD pipelines to support database changes and deployments using tools like AWS CodePipeline or GitLab CI.
- Experience integrating database deployments into DevOps pipelines, including version-controlled DDL/DML scripts, pre-deployment checks, and rollback strategies.
- Ability to perform manual deployments when required (e.g., via SSMS, pgAdmin, or SQL scripts) while adhering to change management processes.
- Ability to work independently, manage priorities, and take ownership of tasks in a distributed team environment.
Strong communication and interpersonal skills, with the ability to explain technical concepts clearly to both technical and non-technical stakeholders. A proactive and detail-oriented mindset, with a focus on continuous improvement and system reliability.
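As a concrete reading of the Redshift tuning items above (vacuum/analyze, distribution/sort keys): VACUUM reclaims space and re-sorts rows, ANALYZE refreshes planner statistics, and DISTKEY/SORTKEY choices control row placement and scan pruning. A hedged sketch, with invented table and column names, of scripting these maintenance statements:

```python
# Sketch: emitting Redshift maintenance SQL for a list of tables. Table and
# column names are hypothetical; a real script would parameterize identifiers
# safely rather than interpolate untrusted input.

def maintenance_sql(tables: list[str]) -> list[str]:
    """VACUUM reclaims space and re-sorts rows; ANALYZE refreshes planner stats."""
    stmts = []
    for table in tables:
        stmts.append(f"VACUUM FULL {table};")
        stmts.append(f"ANALYZE {table};")
    return stmts

# DISTKEY co-locates rows that join on store_id onto the same slice;
# SORTKEY on the timestamp speeds range-filtered scans.
EXAMPLE_DDL = """\
CREATE TABLE sales (
    sale_id   BIGINT,
    store_id  INT,
    sold_at   TIMESTAMP
)
DISTSTYLE KEY
DISTKEY (store_id)
SORTKEY (sold_at);"""

for stmt in maintenance_sql(["sales", "stores"]):
    print(stmt)
```

A DBA would typically run statements like these from a scheduled job (e.g., via psycopg or the Redshift Data API) during low-traffic windows.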
Posted 2 days ago
2.0 - 6.0 years
0 Lacs
delhi
On-site
As a Texturing & Rendering Artist at our animation studio located in Saraswati Vihar, New Delhi, you will be responsible for creating high-quality textures and performing visually appealing rendering for a variety of 3D assets. Your collaboration with modelers, lighting, and compositing teams will be crucial to ensure texture quality and consistency across projects. Additionally, you will optimize texture maps and render settings to achieve fast outputs without compromising on quality, all while ensuring that the final output aligns with the project requirements and aesthetic vision.

To excel in this role, you should possess strong skills in texturing, including both PBR and stylized techniques, using industry-standard tools such as Substance Painter, Photoshop, and Mari. Proficiency in rendering with render engines like Arnold, V-Ray, or Redshift is essential. A basic understanding of UV mapping and shading techniques will also be beneficial in executing your responsibilities effectively. Moreover, your ability to incorporate feedback and implement improvements swiftly will contribute to the overall success of the projects.

While prior experience in animation, film, or gaming projects is considered a plus, we value individuals who demonstrate a passion for their craft and a willingness to learn and grow. In return, we offer a dynamic and collaborative work environment where you will have the opportunity to work on exciting 3D animated projects and contribute to the growth of our rapidly scaling animation studio.

If you are interested in joining our team as a Texturing & Rendering Artist, please send your resume and portfolio to hr@blackdiamonds.co.in with the subject line "Application Texturing & Rendering Artist." We look forward to reviewing your application and potentially welcoming you to our creative team.
Posted 2 days ago
6.0 - 8.0 years
0 Lacs
India
On-site
At Cision, we believe in empowering every individual to make an impact. Here, your voice is heard, your ideas are valued, and your unique perspective fuels our collective success. As part of our global team, you'll thrive in an environment that champions curiosity, collaboration, and innovation, all while making meaningful contributions to the brands we accelerate. Join us in shaping the future of communication and building authentic connections that matter. Whether you're solving complex problems or driving bold innovations, your growth is our success, and together, we'll create the conversations of tomorrow. Empower your impact at Cision. Be seen, be understood, be you.

Job Summary
The Manager, Business Intelligence position participates in the development of a data strategy to quickly cultivate a data-driven culture across the organization and to optimize our business performance by identifying growth opportunities and highlighting areas for improvement. The role will proactively communicate with stakeholders, team members, and partners to support a high-performing team responsible for providing sales intelligence and data visualizations by leveraging business intelligence tools.

Essential Duties And Responsibilities
- Coordinate and align priorities with the organization's strategic goals, partnering with business leadership to identify data and analytical needs via Value Approval
- Deliver the business intelligence strategy that combines data visualization to make profitable, data-driven decisions
- As the backbone of all things BI, establish and maintain high data integrity, quality, and governance standards; ensure data management practices are in place to support accurate and reliable data analysis
- Develop dashboards that provide up-to-date information to sales leaders and sales associates on KPIs and other business objectives and goals
- Translate intricate datasets into intuitive and insightful visualizations that drive data-based decision-making across the organization
- Distill insights from data and communicate recommendations to business customers
- Oversee the selection, implementation, and management of BI tools and technologies
- Lead the creation and maintenance of reports, dashboards, and other data visualizations, translating raw data into visual contexts that are easy for business customers to interpret
- Oversee BI projects/enhancements from inception to completion, ensuring they are delivered on time and within budget
- Present data insights to stakeholders and business leaders clearly and in a relatable way
- Influence key decisions that affect the business
- Maintain an accurate data portfolio that includes high-quality dashboards and data models
- Mentor and upskill team members, including data analysts, system admins, and BI developers
- Participate in the exploration and evaluation of emerging reporting tools, technologies, and methodologies to drive innovation and leverage best practices to advance the organization's BI capabilities

Minimum Required Qualifications
- Bachelor's degree in computer science, information systems, business administration, or a related field; a master's degree or an MBA can be advantageous
- 6-8 years of experience in data management or visualization
- 6-8 years of experience in a high-functioning, fast-paced work environment with strong business acumen
- 3-5 years of people leadership with high social intelligence; able to manage and mentor junior data analysts and BI developers
- Proven expertise in executing data management, reporting, and visualization in Domo, and secondarily in Power BI and Tableau
- Experience with Amazon Redshift and dbt desired
- Proficient in the Microsoft Office Suite
- Knowledge of complex data integration from multiple data sources
- Experience with statistics and probability
- Excellent verbal and written communication skills to translate complex data into easy-to-understand, practical terms
- Deep understanding of data governance, compliance, and privacy best practices
- Agility amid changing priorities and situations
- High attention to detail and accuracy

To be successful in this role, candidates must have demonstrated experience in organizing data in a way that allows business leaders to make informed decisions and reach their full potential by leveraging timely and accurate data.

As a global leader in PR, marketing, and social media management technology and intelligence, Cision helps brands and organizations to identify, connect, and engage with customers and stakeholders to drive business results. PR Newswire, a network of over 1.1 billion influencers, in-depth monitoring, analytics, and its Brandwatch and Falcon.io social media platforms headline a premier suite of solutions. Cision has offices in 24 countries throughout the Americas, EMEA, and APAC. For more information about Cision's award-winning solutions, including its next-gen Cision Communications Cloud®, visit www.cision.com and follow @Cision on Twitter.

Cision is committed to fostering an inclusive environment where all employees can be their authentic selves and perform at their best. We believe diversity, equity, and inclusion are vital to driving our culture, sparking innovation, and achieving long-term success. Cision is proud to have joined more than 600 companies in signing the CEO Action for Diversity & Inclusion™ pledge and was named a "Top Diversity Employer" for 2021 by DiversityJobs.com. Cision is proud to be an equal opportunity employer, seeking to create a welcoming and diverse environment.
All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, gender identity or expression, sexual orientation, national origin, genetics, disability, age, veteran status, or other protected statuses. Cision is committed to the full inclusion of all qualified individuals. In keeping with our commitment, Cision will take the steps to assure that people with disabilities are provided reasonable accommodations. Accordingly, if reasonable accommodation is required to fully participate in the job application or interview process, to perform the essential functions of the position, and/or to receive all other benefits and privileges of employment, please contact hr.support@cision.com Please review our Global Candidate Data Privacy Statement to learn about Cision’s commitment to protecting personal data collected during the hiring process.
Posted 2 days ago