2.0 - 4.0 years
0 Lacs
Ahmedabad, Gujarat, India
On-site
Motion Graphic Artist (2-4 Years)

About the Role
We are looking for a talented Motion Graphic Designer with 2-4 years of experience to join our creative team. The ideal candidate should be passionate about motion graphics and visual storytelling. You will be responsible for creating high-quality motion graphics and visual effects for various digital platforms, advertisements, and branded content.

Key Responsibilities
- Create high-quality motion graphics and visual effects for videos, advertisements, and digital content.
- Develop storyboards, animatics, and style frames to visualize creative concepts.
- Collaborate closely with designers, video editors, and creative teams to bring ideas to life.
- Optimize assets for performance and visual quality across different platforms.
- Stay up to date with the latest animation trends, tools, and technologies to enhance creative output.
- Work to tight deadlines while maintaining high production standards.

Requirements
- 2-4 years of experience as a Motion Graphic Designer working on motion graphics and visual effects.
- Expertise in Adobe After Effects, Premiere Pro, Spine, and Photoshop.
- Knowledge of render engines such as Redshift, Octane, or Arnold is a plus.
- Understanding of video compositing.
- Experience with motion capture, particle effects, and dynamics is a plus.
- Strong artistic and storytelling skills with an eye for detail.
- Ability to work collaboratively in a fast-paced environment and meet deadlines.
- A portfolio or demo reel showcasing your work is mandatory.

Preferred Qualifications
- Bachelor's degree or diploma in Visual Arts, Multimedia, or a related field.
- Experience working in an advertising agency, gaming studio, or creative production house is a plus.
- Knowledge of VR/AR animation and interactive 3D experiences is an added advantage.
Posted 1 week ago
8.0 years
0 Lacs
Tamil Nadu, India
On-site
Job Title: Data Engineer

About VXI
VXI Global Solutions is a BPO leader in customer service, customer experience, and digital solutions. Founded in 1998, the company has 40,000 employees in more than 40 locations in North America, Asia, Europe, and the Caribbean. We deliver omnichannel and multilingual support, software development, quality assurance, CX advisory, and automation & process excellence to the world's most respected brands.

VXI is one of the fastest growing, privately held business services organizations in the United States and the Philippines, and one of the few US-based customer care organizations in China. VXI is also backed by private equity investor Bain Capital. Our initial partnership ran from 2012 to 2016 and was the beginning of prosperous times for the company. During this period, not only did VXI expand our footprint in the US and Philippines, but we also gained ground in the Chinese and Central American markets. We also acquired Symbio, expanding our global technology services offering and enhancing our competitive position. In 2022, Bain Capital re-invested in the organization after completing a buy-out from Carlyle. This is a rare occurrence in the private equity space and shows the level of performance VXI delivers for our clients, employees, and shareholders. With this recent investment, VXI has begun a transformation to radically improve the CX experience through an industry-leading generative AI product portfolio that spans hiring, training, customer contact, and feedback.

Job Description
We are seeking talented and motivated Data Engineers to join our dynamic team and contribute to our mission of harnessing the power of data to drive growth and success. As a Data Engineer at VXI Global Solutions, you will play a critical role in designing, implementing, and maintaining our data infrastructure to support our customer experience and management initiatives. You will collaborate with cross-functional teams to understand business requirements, architect scalable data solutions, and ensure data quality and integrity. This is an exciting opportunity to work with cutting-edge technologies and shape the future of data-driven decision-making at VXI Global Solutions.

Responsibilities:
- Design, develop, and maintain scalable data pipelines and ETL processes to ingest, transform, and store data from various sources.
- Collaborate with business stakeholders to understand data requirements and translate them into technical solutions.
- Implement data models and schemas to support analytics, reporting, and machine learning initiatives.
- Optimize data processing and storage solutions for performance, scalability, and cost-effectiveness.
- Ensure data quality and integrity by implementing data validation, monitoring, and error-handling mechanisms.
- Collaborate with data analysts and data scientists to provide them with clean, reliable, and accessible data for analysis and modeling.
- Stay current with emerging technologies and best practices in data engineering and recommend innovative solutions to enhance our data capabilities.

Requirements:
- Bachelor's degree in Computer Science, Engineering, or a related field.
- Proven 8+ years of experience as a data engineer or in a similar role.
- Proficiency in SQL, Python, and/or other programming languages for data processing and manipulation.
- Experience with relational and NoSQL databases (e.g., SQL Server, MySQL, Postgres, Cassandra, DynamoDB, MongoDB, Oracle), data warehousing (e.g., Vertica, Teradata, Oracle Exadata, SAP HANA), and data modeling concepts.
- Strong understanding of distributed computing frameworks (e.g., Apache Spark, Apache Flink, Apache Storm) and cloud-based data platforms (e.g., AWS Redshift, Azure, Google BigQuery, Snowflake).
- Familiarity with data visualization tools (e.g., Tableau, Power BI, Looker, Apache Superset) and data pipeline tools (e.g., Airflow, Kafka, Dataflow, Cloud Data Fusion, Airbyte, Informatica, Talend) is a plus.
- Understanding of data and query optimization, query profiling, and query performance monitoring tools and techniques.
- Solid understanding of ETL/ELT processes, data validation, and data security best practices.
- Experience with version control systems (Git) and CI/CD pipelines.
- Excellent problem-solving skills and attention to detail.
- Strong communication and collaboration skills to work effectively with cross-functional teams.

Join VXI Global Solutions and be part of a dynamic team dedicated to driving innovation and delivering exceptional customer experiences. Apply now to embark on a rewarding career in data engineering with us!
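For readers unfamiliar with what the pipelines in this posting look like in practice, here is a minimal sketch in PySpark, one of the frameworks the requirements list. It covers the ingest, validate, and load steps the responsibilities describe; all paths, column names, and thresholds are hypothetical placeholders, not VXI's actual stack.

```python
# Illustrative only: a minimal PySpark ETL job with a simple data-quality gate.
# Bucket paths, columns, and the rejection threshold are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("customer_interactions_etl").getOrCreate()

# Ingest: read raw JSON events from object storage
raw = spark.read.json("s3a://example-bucket/raw/interactions/2024-01-01/")

# Transform: basic cleansing and validation before load
clean = (
    raw.dropDuplicates(["interaction_id"])
       .filter(F.col("customer_id").isNotNull())
       .withColumn("event_date", F.to_date("event_ts"))
)

# Data-quality gate: fail fast instead of loading a suspiciously lossy batch
total, kept = raw.count(), clean.count()
if total > 0 and (total - kept) / total > 0.05:
    raise ValueError(f"Rejected {total - kept} of {total} rows; threshold exceeded")

# Load: write partitioned Parquet to the curated zone of the lake
clean.write.mode("append").partitionBy("event_date") \
     .parquet("s3a://example-bucket/curated/interactions/")
```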
Posted 1 week ago
5.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Description and Requirements

Hybrid

"At BMC trust is not just a word - it's a way of life!" We are an award-winning, equal opportunity, culturally diverse, fun place to be. Giving back to the community drives us to be better every single day. Our work environment allows you to balance your priorities, because we know you will bring your best every day. We will champion your wins and shout them from the rooftops. Your peers will inspire, drive, support you, and make you laugh out loud! We help our customers free up time and space to become an Autonomous Digital Enterprise that conquers the opportunities ahead - and are relentless in the pursuit of innovation!

BU Description
We are the Technology and Automation team that drives competitive advantage for BMC by enabling recurring revenue growth, customer centricity, operational efficiency, and transformation through actionable insights, focused operational execution, and obsessive value realization.

About You
You are a self-motivated, proactive individual who thrives in a fast-paced environment. You have a strong eagerness to learn and grow, continuously staying updated with the latest trends and technologies in data engineering. Your passion for collaboration makes you a valuable team player, contributing to a positive work culture while also guiding and mentoring junior team members. You're excited about problem-solving and have the ability to take ownership of projects from start to finish. With a keen interest in data-driven decision-making, you are ready to work on cutting-edge solutions that have a direct impact on the business.

Role and Responsibilities
As a Data Engineer, you will play a crucial role in leading and managing strategic data initiatives across the business. Your responsibilities will include:
- Leading data engineering projects across key business functions, including Marketing, Sales, Customer Success, and Product R&D.
- Developing and maintaining data pipelines to extract, transform, and load (ETL) data into data warehouses or data lakes.
- Designing and implementing ETL processes, ensuring the integrity, scalability, and performance of the data architecture.
- Leading data modeling efforts, ensuring that data is structured for optimal performance and that security best practices are maintained.
- Collaborating with data scientists, analysts, and stakeholders to understand data requirements and provide valuable insights across the customer journey.
- Guiding and mentoring junior engineers, providing technical leadership and ensuring best practices are followed.
- Maintaining documentation for data structures, ETL processes, and data lineage, ensuring clarity and ease of understanding across the team.
- Developing and maintaining data security, compliance, and retention protocols as part of best-practice initiatives.

Professional Expertise

Must-Have Skills
- 5+ years of experience in data engineering, data warehousing, and building enterprise-level data integrations.
- Proficiency in SQL, including query optimization and tuning for relational databases (Snowflake, MS SQL Server, Redshift, etc.).
- 2+ years of experience working with cloud platforms (AWS, GCP, Azure, or OCI).
- Expertise in Python and Spark for data extraction, manipulation, and data pipeline development.
- Experience with structured, semi-structured, and unstructured data formats (JSON, XML, Parquet, CSV).
- Familiarity with version control systems (Git, Bitbucket) and Agile methodologies (Jira).
- Ability to collaborate with data scientists and business analysts, providing data support and insights.
- Proven ability to work effectively in a team setting, balancing multiple projects and leading initiatives.

Nice-to-Have Skills
- Experience in the SaaS software industry.
- Knowledge of analytics governance, data literacy, and core visualization tools (Tableau, MicroStrategy).
- Familiarity with CRM and marketing automation tools (Salesforce, HubSpot, Eloqua).

Education
Bachelor's or master's degree in Computer Science, Information Systems, or a related field (advanced degree preferred).

BMC Software maintains a strict policy of not requesting any form of payment in exchange for employment opportunities, upholding a fair and ethical hiring process.

At BMC we believe in pay transparency and have set the midpoint of the salary band for this role at 2,033,200 INR (minimum 1,524,900; maximum 2,541,500). Actual salaries depend on a wide range of factors that are considered in making compensation decisions, including but not limited to skill sets; experience and training, licensure, and certifications; and other business and organizational needs. The salary listed is just one component of BMC's employee compensation package. Other rewards may include a variable plan and country-specific benefits. We are committed to ensuring that our employees are paid fairly and equitably, and that we are transparent about our compensation practices.

Returnship@BMC
Had a break in your career? No worries. This role is eligible for candidates who have taken a break in their career and want to re-enter the workforce. If your expertise matches the above job, visit https://bmcrecruit.avature.net/returnship to learn more and how to apply.

Our commitment to you!
BMC's culture is built around its people. We have 6000+ brilliant minds working together across the globe. You won't be known just by your employee number, but for your true authentic self. BMC lets you be YOU!

If after reading the above you're unsure whether you meet the qualifications of this role but are deeply excited about BMC and this team, we still encourage you to apply! We want to attract talent from diverse backgrounds and experience to ensure we face the world together with the best ideas!

BMC is committed to equal opportunity employment regardless of race, age, sex, creed, color, religion, citizenship status, sexual orientation, gender, gender expression, gender identity, national origin, disability, marital status, pregnancy, disabled veteran or status as a protected veteran. If you need a reasonable accommodation for any part of the application and hiring process, visit the accommodation request page.
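Much of the warehouse-side work this posting describes (incremental loads, SQL tuning, dimensional modeling) comes down to set-based statements run against a platform like Snowflake. Below is a minimal sketch of a Type 1-style incremental upsert driven from Python; the tables, columns, and connection values are hypothetical placeholders.

```python
# Illustrative only: an incremental upsert (SCD Type 1 style) executed from
# Python against Snowflake. All identifiers and credentials are hypothetical.
import snowflake.connector

MERGE_SQL = """
MERGE INTO analytics.dim_account AS tgt
USING staging.accounts_delta AS src
    ON tgt.account_id = src.account_id
WHEN MATCHED AND tgt.row_hash <> src.row_hash THEN
    UPDATE SET tgt.account_name = src.account_name,
               tgt.segment      = src.segment,
               tgt.row_hash     = src.row_hash,
               tgt.updated_at   = CURRENT_TIMESTAMP()
WHEN NOT MATCHED THEN
    INSERT (account_id, account_name, segment, row_hash, updated_at)
    VALUES (src.account_id, src.account_name, src.segment, src.row_hash,
            CURRENT_TIMESTAMP());
"""

conn = snowflake.connector.connect(
    account="example_account", user="etl_user", password="***",
    warehouse="ETL_WH", database="EDW", schema="ANALYTICS",
)
try:
    # One set-based MERGE instead of row-by-row updates: this is usually the
    # single biggest query-tuning win on columnar warehouses.
    conn.cursor().execute(MERGE_SQL)
finally:
    conn.close()
```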
Posted 1 week ago
5.0 years
0 Lacs
Gurugram, Haryana, India
On-site
About Us
SentiLink provides innovative identity and risk solutions, empowering institutions and individuals to transact confidently with one another. By building the future of identity verification in the United States and reinventing the currently clunky, ineffective, and expensive process, we believe strongly that the future will be 10x better.

We've had tremendous traction and are growing extremely quickly. Already our real-time APIs have helped verify hundreds of millions of identities, beginning with financial services. In 2021, we raised a $70M Series B round, led by Craft Ventures, to rapidly scale our best-in-class products. We've earned coverage and awards from TechCrunch, CNBC, Bloomberg, Forbes, Business Insider, PYMNTS, American Banker, LendIt, and have been named to the Forbes Fintech 50 list consecutively since 2023. Last but not least, we've even been a part of history: we were the first company to go live with the eCBSV and testified before the United States House of Representatives.

About the Role
Are you passionate about creating world-class solutions that fuel product stability and continuously improve infrastructure operations? We're looking for a driven Infrastructure Engineer to architect, implement, and maintain powerful observability systems that safeguard the performance and reliability of our most critical systems. In this role, you'll take real ownership: collaborating with cross-functional teams to shape best-in-class observability standards, troubleshooting complex issues, and fine-tuning monitoring tools to exceed SLA requirements. If you're ready to design high-quality solutions, influence our technology roadmap, and make a lasting impact on our product's success, we want to meet you!

This is a full-time, in-office role based in Gurugram, India.

Responsibilities:
- Improve alerting across SentiLink systems and services, developing high-quality monitoring capabilities while actively reducing false positives.
- Troubleshoot, debug, and resolve infrastructure issues as they arise; participate in on-call rotations for production issues.
- Define and refine Service Level Indicators (SLIs), Service Level Objectives (SLOs), and Service Level Agreements (SLAs) in collaboration with product and engineering teams.
- Develop monitoring and alerting configurations using IaC solutions such as Terraform.
- Build and maintain dashboards to provide visibility into system performance and reliability.
- Collaborate with engineering teams to improve root cause analysis processes and reduce Mean Time to Recovery (MTTR).
- Drive cost optimization for observability tools like Datadog, CloudWatch, and Sumo Logic.
- Perform capacity testing to develop a deep understanding of infrastructure performance under load, and develop alerting based on the learnings.
- Oversee, develop, and operate Kubernetes and service mesh infrastructure, ensuring smooth performance and reliability.
- Investigate operational alerts, identify root causes, and compile comprehensive root cause analysis reports. Pursue action items relentlessly until they are thoroughly completed.
- Conduct in-depth examinations of database operational issues, actively developing and improving database architecture, schema, and configuration for enhanced performance and reliability.
- Develop and maintain incident response runbooks and improve processes to minimize service downtime.
- Research and evaluate new observability tools and technologies to enhance system monitoring.

Requirements:
- 5 years of experience in cloud infrastructure, DevOps, or systems engineering.
- Expertise in AWS and infrastructure-as-code development.
- Experience with CI/CD pipelines and automation tools.
- Experience managing observability platforms, building monitoring dashboards, and configuring high-quality, actionable alerting.
- Strong understanding of Linux systems and networking.
- Familiarity with container orchestration tools like Kubernetes or Docker.
- Excellent analytical and problem-solving skills.
- Experience operating enterprise-size databases. Postgres, Aurora, Redshift, and OpenSearch experience is a plus.
- Experience with Python or Golang is a plus.

Perks:
- Employer-paid group health insurance for you and your dependents
- Regular company-wide in-person events
- Home office stipend, and more!

Corporate Values: Follow Through; Deep Understanding; Whatever It Takes; Do Something Smart
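The alerting work above largely comes down to encoding alarm definitions as code. The posting names Terraform for this; the sketch below uses boto3 instead, purely to show in runnable form the same parameters an alarm definition carries (metric, statistic, threshold, evaluation windows, and action). The service, dimension value, and SNS topic are hypothetical.

```python
# Illustrative only: a p99 latency alarm defined programmatically via boto3.
# In this role the same definition would live in Terraform; every name and
# ARN below is a made-up placeholder.
import boto3

cloudwatch = boto3.client("cloudwatch", region_name="us-east-1")

cloudwatch.put_metric_alarm(
    AlarmName="api-p99-latency-high",
    Namespace="AWS/ApplicationELB",
    MetricName="TargetResponseTime",
    Dimensions=[{"Name": "LoadBalancer",
                 "Value": "app/example-api/0123456789abcdef"}],
    ExtendedStatistic="p99",
    Period=60,
    EvaluationPeriods=5,
    DatapointsToAlarm=4,        # 4 of 5 breaching minutes: dampens false positives
    Threshold=1.5,              # seconds
    ComparisonOperator="GreaterThanThreshold",
    TreatMissingData="notBreaching",
    AlarmActions=["arn:aws:sns:us-east-1:123456789012:oncall-page"],
)
```

Requiring 4 of 5 datapoints to breach, rather than a single spike, is one concrete way the "actively reducing false positives" responsibility shows up in an alarm definition.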
Posted 1 week ago
2.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
We are looking for a Business Analyst; the selected candidate will be hired for our client (Amazon) but will be on our payroll.

Location: Rajajinagar, Bangalore
Duration: 6-month contract
Salary: Up to 6 LPA
Experience: 2 years

KEY JOB RESPONSIBILITIES
This person will own the production and delivery of a suite of analytics reports and dashboards used by the team to make key business decisions. This will involve:
1. Building the data structures, transformation processes, and load jobs in Redshift, and processing and presenting the data in Excel
2. Debugging report issues and unblocking workflows
3. Communicating with the Product team and customers to provide status updates
4. Publishing detailed automated dashboards
5. Creating the report, which requires extracting and transforming data from source tables and loading it into tables with an optimized data structure

BASIC QUALIFICATIONS
- Bachelor's degree in mathematics, engineering, statistics, computer science, or a related field
- 1+ years of business analysis experience (dealing with large, complex data)
- Demonstrated ability with data warehousing, database administrator roles, and database migration
- Strong experience in dashboarding using Tableau/Power BI/Excel/PowerPivot
- Strong communication skills and a team player
- Demonstrated ability to manage and prioritize workload

PREFERRED QUALIFICATIONS
- Knowledge of scripting for automation (e.g., VBScript, Python, Perl, Ruby) is a plus
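For context, the extract-transform-load work in item 5 is typically a short SQL job driven from a script. A minimal sketch follows, assuming hypothetical raw and reporting tables; in practice the "optimized data structure" would also come from the reporting table's DDL (DISTKEY/SORTKEY in Redshift).

```python
# Illustrative only: rebuilding an aggregated reporting table inside Redshift.
# Cluster endpoint, credentials, and table names are hypothetical.
import psycopg2

REBUILD_SQL = """
DELETE FROM reporting.daily_orders;
INSERT INTO reporting.daily_orders (order_date, region, orders, revenue)
SELECT order_date, region, COUNT(*), SUM(amount)
FROM   raw.orders
WHERE  order_date >= DATEADD(day, -30, CURRENT_DATE)
GROUP  BY order_date, region;
"""

conn = psycopg2.connect(
    host="example-cluster.abc123.us-east-1.redshift.amazonaws.com",
    port=5439, dbname="analytics", user="report_bot", password="***",
)
with conn, conn.cursor() as cur:   # the connection context commits on success
    cur.execute(REBUILD_SQL)
conn.close()
```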
Posted 1 week ago
3.0 - 5.0 years
0 Lacs
Gurugram, Haryana, India
On-site
Responsibilities will include:

Functional Expertise
- Collaborating closely with multiple teams to translate requirements into technical specifications.
- Offering clear technical guidance and direction to ensure solutions meet user and technical requirements.
- Leading technical discussions and code reviews to maintain code quality, identify improvement opportunities, and ensure adherence to standards.
- Staying updated on the latest data engineering trends and applying them to solve complex challenges.

Problem Solving & Communication
- Strong analytical and problem-solving skills.
- Excellent communication and collaboration skills.
- Ability to work independently and as part of a team.
- Providing guidance and mentorship to other team members, fostering their professional growth and skill development.
- Experience working with fintech institutions is a plus.

Qualification & Experience (type & industry)
- B.Tech degree

Skills & Know-How
- Experience level: 3-5 years
- Minimum 2 years of relevant experience in AWS
- Cloud data warehousing experience: Redshift/SQL
- In-memory framework experience: PySpark
- Data engineering pipeline use-case experience: ingesting data from different sources into a cloud file system (S3 buckets), transforming/processing the data using AWS Glue, and finally loading it into the cloud warehouse for data analytics (see the sketch below)
- Big data use cases: exposure to huge data volumes involving TBs of data for storage/migration/processing
- Programming experience in Python
- Familiarity with reports/dashboards using cloud-native applications
- Knowledge of data pipeline orchestration using Airflow is good to have
- Understanding of API development
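The S3 -> Glue -> warehouse flow named above maps directly onto the standard Glue job skeleton. Here is a minimal sketch; the catalog database, table, Glue connection, and bucket names are hypothetical placeholders, not this employer's actual pipeline.

```python
# Illustrative only: a skeletal AWS Glue job reading raw data landed in S3
# (via the Glue Data Catalog), transforming it in PySpark, and loading it to
# Redshift. All names are hypothetical.
import sys

from awsglue.context import GlueContext
from awsglue.dynamicframe import DynamicFrame
from awsglue.job import Job
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext.getOrCreate())
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Ingest: raw files in the S3 bucket, registered in the Glue catalog
src = glue_context.create_dynamic_frame.from_catalog(
    database="raw_zone", table_name="transactions")

# Transform: plain PySpark on the underlying DataFrame
df = src.toDF().filter("status = 'SETTLED'").drop("raw_payload")

# Load: write to the cloud warehouse through a preconfigured Glue connection
glue_context.write_dynamic_frame.from_jdbc_conf(
    frame=DynamicFrame.fromDF(df, glue_context, "settled"),
    catalog_connection="redshift-conn",   # hypothetical Glue connection name
    connection_options={"dbtable": "analytics.transactions", "database": "edw"},
    redshift_tmp_dir="s3://example-bucket/tmp/",
)
job.commit()
```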
Posted 1 week ago
0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
As the Techno-Functional Lead, you will be responsible for coordinating with various stakeholders, managing projects across feasibility, prioritization, initiation, execution, audit, compliance, and closure for all project work related to the program. You will be required to handle complex solution design, integration, coordination, and support between the Mobile Wallet (PayZapp), Credit Card (PIXEL), and banking applications, and their integration with core systems such as Flexcube and other core products.

- Performing the full audit cycle, including risk management and control management over operations' effectiveness, financial reliability, and compliance with all applicable directives and regulations
- Preparing and presenting reports that reflect audit results, and documenting the process
- Coordinating with Internal Audit, compliance, VAPT, security, CSITE, and various other teams on the application audit cycles
- Proven knowledge of auditing standards and procedures, laws, rules and regulations, DB baselines, OS patching, and cloud resource compliance
- Knowledge of ISO 8583 messaging standards, Visa and MCE interchanges, debit and credit cards, and mobile banking is essential; knowledge of HSM (Hardware Security Module) processes and key management is essential
- Organizing reports to comply with applicable guidelines and providing supporting documentation
- Clearly and concisely communicating (oral and written) audit findings and recommendations to stakeholders
- Understanding of AWS cloud technology and AWS services
- Conducting follow-up audits to monitor management's interventions
- Strong customer-handling and user-management skills
- Ability to work with minimal supervision and carry end-to-end ownership of deliverables, tracking and reporting progress proactively

Job Responsibilities (JR)
• Coordinate and manage the Mobile Wallet, switching, card issuance/management, and digital channels interface applications
• Familiarity and hands-on experience with audit and compliance requirements (PCI DSS, CSITE, SAR audits, security, internal and external VAPT, FOSS, pre- and post-go-live observation remediation, PRISMA compliance) is mandatory
• Proven experience in setting up and managing various stakeholders, including internal engineering teams, vendor teams, and risk and compliance teams, preferably in the digital and financial services industry
• Proficiency in Agile methodologies and experience in leading Agile transformations within development teams
• Excellent leadership and communication skills, with the ability to collaborate effectively with cross-functional teams and stakeholders
• Strong problem-solving and decision-making abilities, with a proactive and results-driven approach
• Good understanding of AWS cloud services: EC2, EKS, RDS, Redshift, and Kafka services
Posted 1 week ago
5.0 years
0 Lacs
Hyderabad, Telangana, India
Remote
Data Engineer

Job Type: 6-month contract-to-hire
Location: Hyderabad (Hybrid)

iO Associates is seeking a skilled Data Engineer to design and optimize cloud-based data pipelines for a fast-growing analytics firm specializing in eCommerce. You'll work with cutting-edge technologies like Python, PySpark, and dbt to transform raw data into actionable insights for global brands.

Required
- 5+ years' experience in data engineering with strong Python/PySpark skills
- Expertise in SQL, data modeling, and cloud platforms (AWS, GCP preferred)
- Develop and maintain dbt models: build modular, tested, and well-documented dbt models to transform raw data into analytics-ready datasets (see the sketch below)
- Optimize SQL transformations: write efficient SQL logic within dbt models to improve performance and maintainability across large datasets
- Excellent communication skills to bridge technical and business needs

Key Responsibilities
- Design, build, and maintain scalable ETL pipelines for high-volume eCommerce data
- Implement efficient data warehousing solutions (Redshift, BigQuery, Snowflake)
- Develop robust data models and ensure data quality through validation processes
- Optimize pipeline performance and troubleshoot issues in cloud environments (AWS/Azure/GCP)
- Collaborate with analytics teams to deliver reliable datasets for business intelligence

Why Apply?
- Work with cutting-edge data technologies in a high-growth environment
- Remote-first culture with flexible work arrangements
- Opportunity to build solutions powering data-driven decisions for global brands
- Collaborative team that values innovation and professional growth

Join us to shape the future of eCommerce analytics!
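Most dbt models are SQL, but dbt also supports Python models on the Snowflake, Databricks, and BigQuery adapters; a Python rendition is shown here only to keep one language across the sketches on this page. The upstream model, columns, and warehouse are all hypothetical.

```python
# Illustrative only: a dbt Python model transforming a staged source into an
# analytics-ready daily summary. Assumes a Snowflake adapter (where
# dbt.ref(...) yields a Snowpark DataFrame with .to_pandas()); the model and
# column names are made up.
import pandas as pd


def model(dbt, session):
    dbt.config(materialized="table")

    orders = dbt.ref("stg_orders").to_pandas()   # hypothetical staging model

    # Transform raw orders into an analytics-ready daily rollup
    orders["order_date"] = pd.to_datetime(orders["ORDER_TS"]).dt.date
    daily = (
        orders.groupby(["order_date", "CHANNEL"], as_index=False)
              .agg(orders=("ORDER_ID", "count"), revenue=("AMOUNT", "sum"))
    )
    return daily   # dbt materializes the returned DataFrame as a table
```

An equivalent SQL model would express the same rollup as a SELECT over `{{ ref('stg_orders') }}`, which is the more common dbt pattern this role would actually write.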
Posted 1 week ago
6.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Key Responsibilities:

1. Data Analysis
Analyzing structured and unstructured data to derive actionable insights is crucial for informed decision-making and strategic planning. This ensures that the organization can leverage data to drive business outcomes and improve operational efficiency.

2. Data Architecture Design
Collaborating with the Data Engineering team to design and implement data models, pipelines, and storage solutions is essential for creating a robust data infrastructure. Defining and maintaining data architecture standards and best practices ensures consistency and reliability across data systems. Optimizing data systems for performance, scalability, and security is vital to handle growing data volumes and ensure data integrity and protection.

3. Collaboration and Stakeholder Engagement
Working with business units to understand their data needs and align them with architectural solutions ensures that data initiatives are aligned with business goals. Acting as a liaison between technical teams and business users facilitates effective communication and collaboration, ensuring that data solutions meet user requirements.

4. Data Governance and Quality
Implementing data governance practices to ensure data accuracy, consistency, and security is critical for maintaining high data quality standards. Proactively identifying and addressing data quality issues helps prevent data-related problems and ensures reliable data for analysis and reporting.

Qualifications

1. Education
A Bachelor's or Master's degree in Computer Science, Data Science, Information Systems, or a related field provides the foundational knowledge required for the role.

2. Experience
6+ years as a Data Analyst, Data Architect, or in a similar role ensures the candidate has the necessary experience to handle complex data tasks and responsibilities. Hands-on experience with data modeling, architecture design, and analytics tools is essential for designing effective data solutions. Proficiency in SQL and data visualization tools enables the candidate to manage and present data effectively. Experience with cloud platforms (e.g., AWS, Azure) is crucial for leveraging modern data infrastructure and services. Familiarity with data warehouse solutions (e.g., Redshift, Snowflake) ensures the candidate can design and manage scalable data storage solutions. An understanding of data governance frameworks and tools is necessary for implementing effective data governance practices.
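The governance and quality duties in section 4 usually reduce to automated checks that run before data is published. Here is a minimal pandas sketch under made-up rules and a hypothetical orders dataset (reading Parquet straight from S3 also assumes s3fs is installed).

```python
# Illustrative only: a small data-quality gate of the kind section 4 implies.
# Dataset, columns, and thresholds are hypothetical.
import pandas as pd


def run_quality_checks(df: pd.DataFrame) -> list[str]:
    """Return human-readable descriptions of every failed check."""
    failures = []
    if df["customer_id"].isna().any():
        failures.append("customer_id contains nulls")
    if df["customer_id"].duplicated().any():
        failures.append("customer_id is not unique")
    if (df["order_amount"] < 0).any():
        failures.append("order_amount has negative values")
    staleness = pd.Timestamp.now() - pd.to_datetime(df["loaded_at"]).max()
    if staleness > pd.Timedelta(hours=24):
        failures.append(f"data is stale by {staleness}")
    return failures


# Fail the pipeline (or raise an alert) instead of publishing bad data
orders = pd.read_parquet("s3://example-bucket/staging/orders/")
issues = run_quality_checks(orders)
if issues:
    raise RuntimeError("Data quality gate failed: " + "; ".join(issues))
```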
Posted 1 week ago
5.0 years
0 Lacs
Gurugram, Haryana, India
On-site
Job Description
PayPay is looking for an experienced cloud-based AI and ML engineer. This role involves leveraging cloud-based AI/ML services to build infrastructure, as well as developing, deploying, and maintaining ML models, collaborating with cross-functional teams, and ensuring scalable and efficient AI solutions, particularly on Amazon Web Services (AWS).

Main Responsibilities

1. Cloud Infrastructure Management
- Architect and maintain cloud infrastructure for AI/ML projects using AWS tools.
- Implement best practices for security, cost management, and high availability.
- Monitor and manage cloud resources to ensure seamless operation of ML services.

2. Model Development and Deployment
- Design, develop, and deploy machine learning models using AWS services such as SageMaker (see the sketch after this posting).
- Collaborate with data scientists and data engineers to create scalable ML workflows.
- Optimize models for performance and scalability on AWS infrastructure.
- Implement CI/CD pipelines to streamline and accelerate the model development and deployment process.
- Set up a cloud-based development environment for data engineers and data scientists to facilitate model development and exploratory data analysis.
- Implement monitoring, logging, and observability to streamline operations and ensure efficient management of models deployed in production.

3. Data Management
- Work with structured and unstructured data to train robust ML models.
- Use AWS data storage and processing services like S3, RDS, Redshift, or DynamoDB.
- Ensure data integrity and compliance with applicable security regulations and standards.

4. Collaboration and Communication
- Collaborate with cross-functional teams, including DevOps, Data Engineering, and Product Management.
- Communicate technical concepts effectively to non-technical stakeholders.

5. Continuous Improvement and Innovation
- Stay updated with the latest advancements in AI/ML technologies and AWS services.
- Provide, through automation, the means for developers to easily develop and deploy their AI/ML models on AWS.

Tech Stack
- AWS: VPC, EC2, ECS, EKS, Lambda, MWAA, RDS, ElastiCache, DynamoDB, OpenSearch, S3, CloudWatch, Cognito, SQS, KMS, Secrets Manager, MSK, Amazon Kinesis, CodeCommit, CodeBuild, CodeDeploy, CodePipeline, AWS Lake Formation, AWS Glue, SageMaker, and other AI services.
- Terraform, GitHub Actions, Prometheus, Grafana, Atlantis
- OSS (administration experience on these tools): Jupyter, MLflow, Argo Workflows, Airflow

Required Skills and Experience
- More than 5 years of technical experience in cloud-based infrastructure with a focus on AI and ML platforms.
- Extensive hands-on technical experience with computing, storage, and analytical services on AWS.
- Demonstrated skill in programming and scripting languages, including Python, shell scripting, Go, and Rust.
- Experience with infrastructure-as-code (IaC) tools on AWS, such as Terraform, CloudFormation, and CDK.
- Proficiency in Linux internals and system administration.
- Experience in production-level infrastructure change management and releases for business-critical systems.
- Experience in cloud infrastructure and platform systems availability, performance, and cost management.
- Strong understanding of cloud security best practices and payment industry compliance standards.
- Experience with cloud services monitoring, detection, and response, as well as performance tuning and cost control.
- Familiarity with cloud infrastructure service patching and upgrades.
- Excellent oral, written, and interpersonal communication skills.

Preferred Qualifications
- Bachelor's degree or above in a technology-related field
- Experience with other cloud service providers (e.g., GCP, Azure)
- Experience with Kubernetes
- Experience with Event-Driven Architecture (Kafka preferred)
- Experience using and contributing to open-source tools
- Experience in managing IT compliance and security risk
- Published papers / blogs / articles
- Relevant and verifiable certifications

Remarks
*Please note that you cannot apply for PayPay (Japan-based jobs) or other positions in parallel or in duplicate.

PayPay 5 senses
Please refer to the PayPay 5 senses to learn what we value at work.

Working Conditions
Employment Status: Full Time
Office Location: Gurugram (WeWork)
※The development center requires you to work in the Gurugram office to establish the strong core team.
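To make the SageMaker deployment responsibility concrete, here is a minimal sketch using the SageMaker Python SDK to stand up a real-time endpoint from a trained model artifact. The container version, S3 artifact, role ARN, and endpoint name are hypothetical placeholders.

```python
# Illustrative only: deploying a trained model artifact to a real-time
# SageMaker endpoint. All identifiers and ARNs are hypothetical.
import sagemaker
from sagemaker.model import Model

session = sagemaker.Session()

model = Model(
    # Look up the managed XGBoost serving container for this region
    image_uri=sagemaker.image_uris.retrieve(
        framework="xgboost", region=session.boto_region_name, version="1.7-1"),
    model_data="s3://example-bucket/models/fraud/model.tar.gz",
    role="arn:aws:iam::123456789012:role/SageMakerExecutionRole",
    sagemaker_session=session,
)

# Real-time endpoint; batch transform or serverless inference are the usual
# alternatives when latency requirements are looser.
predictor = model.deploy(
    initial_instance_count=1,
    instance_type="ml.m5.large",
    endpoint_name="fraud-scoring",
)
```

In the CI/CD setup this posting describes, a pipeline step would typically run this deploy (or a SageMaker Pipelines equivalent) after the training and evaluation stages pass.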
Posted 1 week ago
0 years
0 Lacs
Pune, Maharashtra, India
On-site
Job Title: Java Developer
Job Type: Contract (6 Months)
Location: Pune

Role Overview
We are seeking a skilled Java Developer for a 6-month contract role based in Pune. The ideal candidate will have strong hands-on experience in Java-based enterprise application development and a solid understanding of cloud technologies.

Key Responsibilities
- Analyze customer/internal requirements and translate them into software design documents; present RFCs to the architecture team
- Write clean, high-quality, maintainable code based on approved designs
- Conduct thorough unit and system-level testing to ensure software reliability
- Collaborate with cross-functional teams to analyze, design, and deliver applications
- Ensure optimal performance, scalability, and responsiveness of applications
- Take technical ownership of assigned features
- Provide mentorship and support to team members for resolving technical and functional issues
- Review and approve peer code through pull requests

Must-Have Skills
- Frameworks/Technologies: Spring Boot, Spring AOP, Spring MVC, Hibernate, Play, REST APIs, Microservices
- Programming Languages: Core Java, Java 8 (streams, lambdas, fluent-style programming), J2EE
- Database: Strong SQL skills with the ability to write complex queries
- DevOps: Hands-on experience with CI/CD pipelines
- Cloud: Solid understanding of AWS services such as S3, Lambda, SNS, SQS, IAM roles, Kinesis, EMR, Databricks
- Coding Practices: Scalable and maintainable code development; experience in cloud-native application development

Nice-to-Have Skills
- Additional Languages/Frameworks: Golang, React, OAuth, SCIM
- Databases: NoSQL, Redshift
- AWS Tools: KMS, CloudWatch, caching, notification services, queues

Candidate Requirements
- Proven experience in core application development
- Strong communication and interpersonal skills
- Proactive attitude with a willingness to learn new technologies and products
- Collaborative team player with a growth mindset
Posted 1 week ago
0 years
0 Lacs
Gurugram, Haryana, India
Remote
We are looking for a top-notch Mobile SDET who will deliver on key initiatives, from the ideation phase all the way through development and product delivery.

You will:
• Develop iOS/Android test automation on mobile test harnesses
• Develop and enhance the existing automation scripts, tools, and frameworks using Java, TestNG, and Appium
• Execute automated test plans and regression tests for iOS/Android applications
• Define testing strategies and scope for user stories and technical development tasks
• Provide estimates on testing efforts to Product and Engineering team members
• Maintain and improve test coverage and the ratio of automated tests
• Advocate automated testing and CI/CD methodology; review and advise on testing methods and best practices
• Identify, investigate, report, and track defects
• Deliver high-quality features and infrastructure to production
• Continuously learn new tools, technologies, and testing disciplines
• Work under minimal supervision and adopt new technologies quickly
• Work collaboratively across multiple teams
• Communicate all concerns and status to the SQA manager in a timely manner

Qualifications:
• Bachelor's degree in computer science or a related field, or equivalent work experience
• A track record of improving quality
• Strong test automation experience with expertise in iOS/Android app testing
• Experience with TestNG, Java, Appium, and XCUITest, with strong programming skills in Java
• Experience with Selenium WebDriver
• Experience using Xcode Instruments
• Experience using BrowserStack or similar for app automation
• Expertise in software QA methodologies, tools, and processes
• Expertise in test design, test automation frameworks, and scripting tests
• Experience with MongoDB
• Experience with Git and DevOps CI/CD pipelines
• Good knowledge of data warehouses, data lakes, and ETL pipelines (AWS, Athena, Redshift Spectrum, Postgres, SQL, etc.) is a plus
• API automation testing experience using Jest, Mocha, REST Assured, or similar frameworks is a plus
• Excellent communication skills, both oral and written, are a must
• Experience with Scrum methodologies and remote teams is a plus!

Big pluses if you:
• Are comfortable with collaboration, open communication, and reaching across functional borders
• Are self-motivated and can get things done
• Have the ability to communicate and defend your ideas clearly
• Have a desire to build products that users love
• Are updated on the newest technologies

Please include whatever info you believe is relevant: resume, LinkedIn profile, GitHub profile, etc.
Posted 1 week ago
0.0 - 6.0 years
0 Lacs
Bengaluru, Karnataka
Remote
Job Title: Talend Lead
Experience: 8+ Years (5-6+ years in Talend)
Location: Hybrid - Bengaluru, Hyderabad, Chennai, Pune (in office 1-2 days/week)
Work Hours: 2 PM - 11 PM IST (work from office till 6 PM, then resume remotely)
Notice Period: Immediate to 30 days only (candidates already serving notice preferred)
Budget: ₹27-30 LPA (inclusive of 5% variable)
Job Type: Full-time

Key Responsibilities:
- Lead and mentor a team of Talend ETL developers (6+ months of lead/mentoring experience acceptable).
- Design, develop, and optimize Talend-based ETL solutions with a strong focus on performance and data quality.
- Collaborate with cross-functional stakeholders to define scope, timelines, and deliverables.
- Implement robust data integration pipelines using Talend and AWS services.
- Ensure smooth data flow between source and target systems such as relational databases, APIs, and flat files.
- Drive best practices in code, documentation, error handling, and job scheduling.
- Participate in project planning, troubleshooting, and technical reviews.
- Contribute to system integration testing, unit testing, and deployment processes.

Technical Skills Required:
- Strong experience in Talend Studio and Talend TAC/TMC.
- Advanced SQL for querying and data transformations.
- Hands-on experience with AWS cloud services (S3, Redshift, EC2, Glue, Athena, etc.).
- Proficiency in working with data from relational and NoSQL databases, flat files, and APIs (REST/SOAP).
- Knowledge of ETL/ELT, data profiling, and quality checks.
- Familiarity with job scheduling tools and performance monitoring frameworks.
- Comfortable with Git, Terraform, GitLab, and VS Code.
- Exposure to scripting languages like Python or Shell is a plus.

Preferred Qualifications:
- Bachelor's in Computer Science, IT, or a related field.
- Talend and AWS certifications are highly desirable.
- Experience with US healthcare clients is a big plus.
- Familiarity with Agile methodology and DevOps practices.
- Understanding of big data platforms (Hadoop, Spark) and Talend Big Data modules.

Important Screening Criteria:
- No short-term projects or employment gaps over 3 months.
- No candidates from JNTU.
- Strictly immediate joiners or candidates with up to 30 days' notice.

Job Types: Full-time, Permanent
Pay: ₹2,700,000.00 - ₹3,000,000.00 per year
Schedule: Day shift / Evening shift, Monday to Friday
Ability to commute/relocate: Bengaluru, Karnataka: Reliably commute or plan to relocate before starting work (Required)
Education: Bachelor's (Required)
Experience: Talend: 6 years (Required)
Work Location: In person
Posted 1 week ago
2.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Description
Amazon WWR&R comprises business, product, operational, program, software engineering, and data teams that manage the life of a returned or damaged product from the customer to the warehouse and on to its next best use. Our work is broad and deep: we train machine learning models to automate routing and find signals to optimize re-use; we invent new channels to give products a second life; we develop world-class product support to help customers love what they buy; we pilot smarter product evaluations; we work from the customer backward to find ways to make the return experience remarkably delightful and easy; and we do it all while scrutinizing our business with laser focus.

The WWR&R data engineering team at the Amazon Hyderabad Development Center is an agile team whose charter is to deliver the next generation of the Reverse Logistics data lake platform. As a member of this team, your mission will be to support a massively scalable, distributed data warehousing, querying, reporting, and decision-support system. We support a fast-paced environment where each day brings new challenges and opportunities.

As a Support Engineer, you will play a pivotal role in ensuring the stability, compliance, and operational excellence of our enterprise Data Warehouse (DW) environment. In this role, you will be responsible for monitoring and maintaining production data pipelines, proactively identifying and resolving issues that impact data quality, availability, or timeliness. You'll collaborate closely with data engineers and cross-functional teams to troubleshoot incidents, implement scalable solutions, and enhance the overall resilience of our data infrastructure.

A key aspect of this role involves supporting our data compliance and governance initiatives, ensuring systems align with internal policies and external regulatory standards such as GDPR. You will help enforce access controls, manage data retention policies, and support audit readiness through strong logging and monitoring practices. You'll also lead efforts to automate manual support processes, improving team efficiency and reducing operational risk. Additionally, you will be responsible for maintaining clear, up-to-date documentation and runbooks for operational procedures and issue resolution, promoting consistency and knowledge sharing across the team.

We're looking for a self-motivated, quick-learning team player with a strong sense of ownership and a 'can-do' attitude: someone who thrives in a dynamic, high-impact environment and is eager to make meaningful contributions to our data operations.

Basic Qualifications
- 2+ years of software development experience, or 2+ years of technical support experience
- Bachelor's degree in engineering or equivalent
- Experience troubleshooting and debugging technical systems
- Experience scripting in modern programming languages
- Experience with SQL databases (querying and analyzing)

Preferred Qualifications
- Experience with the AWS technology stack, including Redshift, RDS, S3, EMR, or similar solutions built around Hive/Spark, etc.
- Experience with reporting tools like Tableau, OBIEE, or other BI packages.
- Knowledge of software engineering best practices across the development lifecycle is a plus.

Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you're applying in isn't listed, please contact your Recruiting Partner.

Company: ADCI HYD 13 SEZ
Job ID: A3005460
Posted 1 week ago
5.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Description
Are you a highly skilled data engineer and project leader? Do you think big and enjoy complexity and building solutions that scale? Are you curious to know what you could achieve in a company that pushes the boundaries of modern technology? If you answered yes, and you have a background in FinTech, you'll love this role and Amazon's data-obsessed culture.

Amazon Devices and Services Fintech is the global team that designs and builds the financial planning and analysis tools for a wide variety of Amazon's new and established organizations. From Kindle to Ring, and even new and exciting companies like Kuiper (our new satellite play), this team enjoys a wide variety of complex and interesting problem spaces. They are almost like FinTech consultants embedded in Amazon.

This team is looking for a Data Engineer to build and enhance the business's finance systems, with TM1 at their core. You will manage all aspects, from requirements gathering, technical design, development, deployment, and integration, to solve budgeting, planning, performance management, and reporting challenges.

Key job responsibilities
- Design and implement next-generation financial solutions, assisted by almost unlimited access to AWS resources including EC2, RDS, Redshift, Step Functions, EMR, and Lambda, plus the third-party software TM1.
- Build and deliver high-quality data pipelines capable of scaling from running over a single month of data during month-end close to 150 or more months when doing restatements.
- Continually improve ongoing reporting and analysis processes and infrastructure, automating or simplifying self-service capabilities for customers.
- Dive deep to resolve problems at their root, looking for failure patterns and suggesting and implementing fixes or enhancements.
- Prepare runbooks, methods of procedure, tutorials, and training videos on best practices for global delivery.
- Solve unique challenges presented by massive data volumes and diverse data sets, working for one of the largest companies in the world.

Basic Qualifications
- 5+ years of data engineering experience.
- Extensive experience writing SQL queries and stored procedures.
- Experience with big data tools and distributed computing.
- Finance experience, exhibiting knowledge of financial reporting, budgeting, and forecasting functions and processes.
- Bachelor's degree.

Preferred Qualifications
- Experience with non-relational databases / data stores (object storage, document or key-value stores, graph databases, column-family databases).
- Experience with AWS technologies like Redshift, S3, AWS Glue, EMR, Kinesis, Firehose, Lambda, and IAM roles and permissions.
- Experience with programming languages such as Python and Java, and with shell scripts.
- Experience with IBM Planning Analytics/TM1, both scripting processes and writing rules.
- Experience with the design and delivery of formal training curricula and programs.
- Project management, scoping, reporting, and scheduling experience.

Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you're applying in isn't listed, please contact your Recruiting Partner.

Company: ADCI - Haryana
Job ID: A3005332
Posted 1 week ago
7.0 years
0 Lacs
Indore, Madhya Pradesh, India
On-site
We are seeking a highly skilled and experienced Lead Data Engineer (7+ years) to join our dynamic team. As a Lead Data Engineer, you will play a crucial role in designing, developing, and maintaining our data infrastructure. You will be responsible for ensuring the efficient and reliable collection, storage, and transformation of large-scale data to support business intelligence, analytics, and data-driven decision-making.

Key Responsibilities

Data Architecture & Design
- Lead the design and implementation of robust data architectures that support data warehousing (DWH), data integration, and analytics platforms.
- Develop and maintain ETL (Extract, Transform, Load) pipelines to ensure the efficient processing of large datasets.

ETL Development
- Design, develop, and optimize ETL processes using tools like Informatica PowerCenter, Intelligent Data Management Cloud (IDMC), or custom Python scripts.
- Implement data transformation and cleansing processes to ensure data quality and consistency across the enterprise.

Data Warehouse Development
- Build and maintain scalable data warehouse solutions using Snowflake, Databricks, Redshift, or similar technologies.
- Ensure efficient storage, retrieval, and processing of structured and semi-structured data.

Big Data & Cloud Technologies
- Utilize AWS Glue and PySpark for large-scale data processing and transformation.
- Implement and manage data pipelines using Apache Airflow for orchestration and scheduling (see the sketch after this posting).
- Leverage cloud platforms (AWS, Azure, GCP) for data storage, processing, and analytics.

Data Management & Governance
- Establish and enforce data governance and security best practices.
- Ensure data integrity, accuracy, and availability across all data platforms.
- Implement monitoring and alerting systems to ensure data pipeline reliability.

Collaboration & Leadership
- Work closely with data stewards, analysts, and business stakeholders to understand data requirements and deliver solutions that meet business needs.
- Mentor and guide junior data engineers, fostering a culture of continuous learning and development within the team.
- Lead data-related projects from inception to delivery, ensuring alignment with business objectives and timelines.

Database Management
- Design and manage relational databases (RDBMS) to support transactional and analytical workloads.
- Optimize SQL queries for performance and scalability across various database platforms.

Required Skills & Qualifications

Education: Bachelor's or Master's degree in Computer Science, Information Systems, Engineering, or a related field.

Experience
- Minimum of 7+ years of experience in data engineering, ETL, and data warehouse development.
- Proven experience with ETL tools like Informatica PowerCenter or IDMC.
- Strong proficiency in Python and PySpark for data processing.
- Experience with cloud-based data platforms such as AWS Glue, Snowflake, Databricks, or Redshift.
- Hands-on experience with SQL and RDBMS platforms (e.g., Oracle, MySQL, PostgreSQL).
- Familiarity with data orchestration tools like Apache Airflow.

Technical Skills
- Advanced knowledge of data warehousing concepts and best practices.
- Strong understanding of data modeling, schema design, and data governance.
- Proficiency in designing and implementing scalable ETL pipelines.
- Experience with cloud infrastructure (AWS, Azure, GCP) for data storage and processing.

Soft Skills
- Excellent communication and collaboration skills.
- Ability to lead and mentor a team of engineers.
- Strong problem-solving and analytical thinking abilities.
- Ability to manage multiple projects and prioritize tasks effectively.

Preferred Qualifications
- Experience with machine learning workflows and data science tools.
- Certification in AWS, Snowflake, Databricks, or relevant data engineering technologies.
- Experience with Agile methodologies and DevOps practices.
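As a concrete illustration of the Airflow orchestration named above, here is a minimal DAG wiring an ingest-transform-load sequence. The DAG name, schedule, and task bodies are hypothetical placeholders, not a specific production pipeline.

```python
# Illustrative only: a minimal Airflow DAG orchestrating three ETL stages.
# Names, schedule, and task bodies are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    print("pull source files into s3://example-bucket/raw/ ...")


def transform():
    print("run the Glue/PySpark transformation job ...")


def load():
    print("COPY curated data into the warehouse ...")


with DAG(
    dag_id="daily_sales_etl",
    start_date=datetime(2024, 1, 1),
    schedule="0 2 * * *",   # 02:00 daily (Airflow 2.4+; older versions use schedule_interval)
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    load_task = PythonOperator(task_id="load", python_callable=load)

    extract_task >> transform_task >> load_task
```

In practice the PythonOperator callables would be replaced by operators for the systems involved (Glue, Redshift, Snowflake), but the dependency wiring shown in the last line is the core of the orchestration work.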
Posted 1 week ago
5.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Job Description
We are looking for a skilled and motivated Senior Data Engineer to join our data integration and analytics team. The ideal candidate will have hands-on experience with Informatica IICS, AWS Redshift, Python scripting, and Unix/Linux systems. You will be responsible for building and maintaining scalable ETL pipelines to support business intelligence and analytics needs. A strong passion for continuous learning, problem-solving, and enabling data-driven decision-making is highly valued.

Primary Skills: Informatica

Description
We are looking for a Senior Data Engineer to lead the design, development, and management of scalable data platforms and pipelines. This role demands a strong technical foundation in data architecture, big data technologies, and database systems (both SQL and NoSQL), along with the ability to work across functional teams to deliver robust, secure, and high-performing data solutions.

Role Responsibilities
- Design, develop, and maintain end-to-end data pipelines and infrastructure.
- Translate business and functional requirements into scalable, well-documented technical solutions.
- Build and manage data flows across structured and unstructured data sources, including streaming and batch integrations.
- Ensure data integrity and quality through automated validations, unit testing, and robust documentation.
- Optimize data processing performance and manage large datasets efficiently.
- Collaborate closely with stakeholders and project teams to align data solutions with business objectives.
- Implement and maintain security and privacy protocols to ensure safe data handling.
- Lead development environment setup and configuration of tools and services.
- Mentor junior data engineers and contribute to continuous improvement and automation initiatives.
- Coordinate with QA and UAT teams during testing and release phases.

Role Requirements
- Strong proficiency in SQL (including procedures, performance tuning, and analytical functions).
- Solid understanding of data warehousing concepts, including dimensional modeling and SCDs.
- Hands-on experience with scripting languages (Shell/PowerShell).
- Familiarity with cloud and big data technologies.
- Experience working with relational databases, non-relational databases, and data streaming systems.
- Proficiency in data profiling, validation, and testing practices.
- Excellent problem-solving, communication (written and verbal), and documentation skills.
- Exposure to Agile methodologies and CI/CD practices.
- Self-motivated, adaptable, and capable of working in a fast-paced environment.

Experience Requirements
- 5 years of overall experience, with 3+ years of hands-on experience with Informatica IICS (Cloud Data Integration, Application Integration).
- Strong proficiency in AWS Redshift and writing complex SQL queries.
- Solid programming experience in Python for scripting, data wrangling, and automation.
- Experience with version control tools like Git and CI/CD workflows.
- Knowledge of data modeling and data warehousing concepts.
- Prior experience with data lakes and big data technologies is a plus.
Posted 1 week ago
12.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Job Title: Technical Delivery Manager
Experience: 12+ years
Location: Pune, Kharadi
Employment Type: Full-time
Job Summary
We are seeking a seasoned Technical Delivery Manager with 12+ years of experience to lead and manage large-scale, complex programs. The ideal candidate will have a strong background in project and delivery management, with expertise in Agile methodologies, risk management, stakeholder communication, and cross-functional team leadership.
Key Responsibilities
Delivery & Execution: Oversee end-to-end project execution, ensuring alignment with business objectives, timelines, and quality standards.
Agile & SCRUM Management: Drive Agile project delivery, coordinating across multiple teams and ensuring adherence to best practices.
Risk & Dependency Management: Participate in design discussions to identify risks, dependencies, and mitigation strategies.
Stakeholder Communication: Report and present program progress to senior management and executive leadership.
Client Engagement: Lead customer presentations, articulations, and discussions, ensuring effective communication and alignment.
Cross-Team Coordination: Collaborate with globally distributed teams to ensure seamless integration and delivery.
Leadership & People Management: Guide, mentor, and motivate diverse teams, fostering a culture of innovation and excellence.
Tool & Process Management: Utilize tools like JIRA, Confluence, MPP, and Smartsheet to drive project visibility and efficiency.
Engineering & ALM Best Practices: Ensure adherence to engineering and Application Lifecycle Management (ALM) best practices for continuous improvement.
Required Skills & Qualifications
12+ years of experience in IT project and delivery management.
5+ years of project management experience, preferably in Managed Services and fixed-price engagements.
Proven experience in large-scale program implementation.
Strong expertise in Agile/SCRUM methodologies and project execution.
Excellent problem-solving, analytical, and risk management skills.
Outstanding communication, articulation, and presentation skills.
Experience in multi-team and cross-functional coordination, especially across different time zones.
Good-to-Have Skills
Exposure to AWS services (S3, Glue, Lambda, SNS, RDS MySQL, Redshift, Snowflake).
Knowledge of Python, Jinja, Angular, APIs, Power BI, SageMaker, Flutter Dart.
(ref:hirist.tech)
Posted 1 week ago
10.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Technical Project Manager
IT management professional with 10+ years of experience.
Responsibilities
5+ years of project management experience.
Delivery executive experience (Managed Services, fixed-price engagements).
Delivery management experience executing projects extensively using the SCRUM methodology.
Planning, monitoring, and risk management for multiple data and data science programs.
Participating in design discussions to capture risks and dependencies.
Reporting and presenting program progress to senior management.
Customer presentation, articulation, and communication skills.
Coordinating and integrating with multiple teams in different time zones.
Leading, guiding, managing, and motivating diverse teams.
Should have handled large and complex program implementations.
Knowledge of working with tools like JIRA, Confluence, MPP, Smartsheet, etc.
Knowledge of engineering and ALM best practices.
Good to Have
Knowledge of AWS S3, Glue, Lambda, SNS, etc., Python, Jinja, Angular, APIs, Power BI, SageMaker, Flutter Dart, RDS MySQL, Redshift, Snowflake.
(ref:hirist.tech)
Posted 1 week ago
5.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Description and Requirements
"At BMC trust is not just a word - it's a way of life!"
Hybrid
We are an award-winning, equal opportunity, culturally diverse, fun place to be. Giving back to the community drives us to be better every single day. Our work environment allows you to balance your priorities, because we know you will bring your best every day. We will champion your wins and shout them from the rooftops. Your peers will inspire, drive, support you, and make you laugh out loud! We help our customers free up time and space to become an Autonomous Digital Enterprise that conquers the opportunities ahead - and we are relentless in the pursuit of innovation!
BU Description
We are the Technology and Automation team that drives competitive advantage for BMC by enabling recurring revenue growth, customer centricity, operational efficiency, and transformation through actionable insights, focused operational execution, and obsessive value realization.
About You
You are a self-motivated, proactive individual who thrives in a fast-paced environment. You have a strong eagerness to learn and grow, continuously staying updated with the latest trends and technologies in data engineering. Your passion for collaboration makes you a valuable team player, contributing to a positive work culture while also guiding and mentoring junior team members. You're excited about problem-solving and have the ability to take ownership of projects from start to finish. With a keen interest in data-driven decision-making, you are ready to work on cutting-edge solutions that have a direct impact on the business.
Role and Responsibilities
As a Data Engineer, you will play a crucial role in leading and managing strategic data initiatives across the business. Your responsibilities will include:
Leading data engineering projects across key business functions, including Marketing, Sales, Customer Success, and Product R&D.
Developing and maintaining data pipelines to extract, transform, and load (ETL) data into data warehouses or data lakes.
Designing and implementing ETL processes, ensuring the integrity, scalability, and performance of the data architecture.
Leading data modeling efforts, ensuring that data is structured for optimal performance and that security best practices are maintained.
Collaborating with data scientists, analysts, and stakeholders to understand data requirements and provide valuable insights across the customer journey.
Guiding and mentoring junior engineers, providing technical leadership and ensuring best practices are followed.
Maintaining documentation for data structures, ETL processes, and data lineage, ensuring clarity and ease of understanding across the team.
Developing and maintaining data security, compliance, and retention protocols as part of best practice initiatives.
Professional Expertise
Must-Have Skills
5+ years of experience in data engineering, data warehousing, and building enterprise-level data integrations.
Proficiency in SQL, including query optimization and tuning for relational databases (Snowflake, MS SQL Server, Redshift, etc.).
2+ years of experience working with cloud platforms (AWS, GCP, Azure, or OCI).
Expertise in Python and Spark for data extraction, manipulation, and data pipeline development.
Experience with structured, semi-structured, and unstructured data formats (JSON, XML, Parquet, CSV).
Familiarity with version control systems (Git, Bitbucket) and Agile methodologies (Jira).
Ability to collaborate with data scientists and business analysts, providing data support and insights.
Proven ability to work effectively in a team setting, balancing multiple projects, and leading initiatives.
Nice-to-Have Skills
Experience in the SaaS software industry.
Knowledge of analytics governance, data literacy, and core visualization tools (Tableau, MicroStrategy).
Familiarity with CRM and marketing automation tools (Salesforce, HubSpot, Eloqua).
Education
Bachelor's or Master's degree in Computer Science, Information Systems, or a related field (advanced degree preferred).
BMC Software maintains a strict policy of not requesting any form of payment in exchange for employment opportunities, upholding a fair and ethical hiring process.
At BMC we believe in pay transparency and have set the midpoint of the salary band for this role at 2,033,200 INR. Actual salaries depend on a wide range of factors that are considered in making compensation decisions, including but not limited to skill sets; experience and training; licensure and certifications; and other business and organizational needs. The salary listed is just one component of BMC's employee compensation package. Other rewards may include a variable plan and country-specific benefits. We are committed to ensuring that our employees are paid fairly and equitably, and that we are transparent about our compensation practices.
(Returnship@BMC) Had a break in your career? No worries. This role is eligible for candidates who have taken a break in their career and want to re-enter the workforce. If your expertise matches the above job, visit https://bmcrecruit.avature.net/returnship to learn more and apply.
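To make the Python/Spark and semi-structured-format expectations concrete, here is a minimal PySpark sketch of the kind of pipeline step described; the S3 paths and column names are hypothetical:

from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("json_to_parquet").getOrCreate()

# Read semi-structured JSON events (path is a placeholder).
events = spark.read.json("s3://example-bucket/raw/events/")

# Light cleansing: drop rows missing an id and normalize the timestamp column.
cleaned = (
    events
    .where(F.col("event_id").isNotNull())
    .withColumn("event_ts", F.to_timestamp("event_ts"))
)

# Write as Parquet, partitioned by event date, for downstream warehouse loads.
cleaned.withColumn("event_date", F.to_date("event_ts")) \
    .write.mode("overwrite") \
    .partitionBy("event_date") \
    .parquet("s3://example-bucket/curated/events/")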
Posted 1 week ago
8.0 years
0 Lacs
Gurugram, Haryana, India
On-site
Experience: 8-10 years
Job Title: DevOps Engineer
Location: Gurugram
Job Summary
We are seeking a highly skilled and experienced Lead DevOps Engineer to drive the design, automation, and maintenance of secure and scalable cloud infrastructure. The ideal candidate will have deep technical expertise in cloud platforms (AWS/GCP), container orchestration, CI/CD pipelines, and DevSecOps practices. You will be responsible for leading infrastructure initiatives, mentoring team members, and collaborating closely with software and QA teams to enable high-quality, rapid software delivery.
Key Responsibilities
Cloud Infrastructure & Automation: Design, deploy, and manage secure, scalable cloud environments using AWS, GCP, or similar platforms. Develop Infrastructure-as-Code (IaC) using Terraform for consistent resource provisioning. Implement and manage CI/CD pipelines using tools like Jenkins, GitLab CI/CD, GitHub Actions, Bitbucket Pipelines, AWS CodePipeline, or Azure DevOps.
Containerization & Orchestration: Containerize applications using Docker for seamless development and deployment. Manage and scale Kubernetes clusters (on-premise or cloud-managed, like AWS EKS). Monitor and optimize container environments for performance, scalability, and cost-efficiency.
Security & Compliance: Enforce cloud security best practices, including IAM policies, VPC design, and secure secrets management (e.g., AWS Secrets Manager). Conduct regular vulnerability assessments and security scans, and implement remediation plans. Ensure infrastructure compliance with industry standards and manage incident response protocols.
Monitoring & Optimization: Set up and maintain monitoring/observability systems (e.g., Grafana, Prometheus, AWS CloudWatch, Datadog, New Relic). Analyze logs and metrics to troubleshoot issues and improve system performance. Optimize resource utilization and cloud spend through continuous review of infrastructure configurations.
Scripting & Tooling: Develop automation scripts (Shell/Python) for environment provisioning, deployments, backups, and log management. Maintain and enhance CI/CD workflows to ensure efficient and stable deployments.
Collaboration & Leadership: Collaborate with engineering and QA teams to ensure infrastructure aligns with development needs. Mentor junior DevOps engineers, fostering a culture of continuous learning and improvement. Communicate technical concepts effectively to both technical and non-technical stakeholders.
Qualifications
Education: Bachelor's degree in Computer Science, Engineering, or a related technical field, or equivalent hands-on experience.
Certifications: AWS Certified DevOps Engineer - Professional (preferred) or other relevant cloud certifications.
Experience: 8+ years of experience in DevOps or Cloud Infrastructure roles, including at least 3 years in a leadership capacity.
Strong hands-on expertise in AWS (ECS, EKS, RDS, S3, Lambda, CodePipeline) or GCP equivalents.
Proven experience with CI/CD tools: Jenkins, GitLab CI/CD, GitHub Actions, Bitbucket Pipelines, Azure DevOps.
Advanced knowledge of the Docker and Kubernetes ecosystem.
Skilled in Infrastructure-as-Code (Terraform) and configuration management tools like Ansible.
Proficient in scripting (Shell, Python) for automation and tooling.
Experience implementing DevSecOps practices and advanced security configurations.
Exposure to data tools (e.g., Apache Superset, AWS Athena, Redshift) is a plus.
Soft Skills
Strong problem-solving abilities and capacity to work under pressure.
Excellent communication and team collaboration.
Organized, with attention to detail and a commitment to quality.
Preferred Skills
Experience with alternative cloud platforms (e.g., Oracle Cloud, DigitalOcean).
Familiarity with advanced observability stacks (Grafana, Prometheus, Loki, Datadog).
(ref:hirist.tech)
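As a small illustration of the Python automation and secure secrets management this role describes, here is a sketch using boto3 and AWS Secrets Manager; the secret name and region are hypothetical:

import json
import boto3

# Fetch database credentials from AWS Secrets Manager instead of hard-coding them.
client = boto3.client("secretsmanager", region_name="ap-south-1")
response = client.get_secret_value(SecretId="prod/app/db-credentials")  # hypothetical secret
secret = json.loads(response["SecretString"])

print("Retrieved credentials for user:", secret["username"])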
Posted 1 week ago
8.0 years
0 Lacs
Gurgaon, Haryana, India
On-site
Job Title: Databricks Dashboard Engineer
Job Summary
We are looking for a versatile Databricks Dashboard Engineer with strong coding skills in SQL who can design and build interactive dashboards as well as contribute to data engineering efforts. The role works with stakeholders to identify and define self-service analytic solutions, dashboards, actionable enterprise business intelligence reports, and business intelligence best practices, and is responsible for repeatable, lean, and maintainable enterprise BI design across organizations, partnering effectively with the client team. We value leadership not only in the conventional sense but within the team as well: candidates should exhibit qualities such as innovation, critical thinking, optimism/positivity, communication, time management, collaboration, problem-solving, acting independently, knowledge sharing, and approachability.
Responsibilities
Design, develop, and maintain interactive dashboards and visualizations using Databricks SQL, Delta Lake, and Notebooks.
Collaborate with business stakeholders to gather dashboard requirements and deliver actionable insights.
Optimize data models and queries for performance and scalability.
Integrate Databricks data with BI tools such as Power BI, Tableau, or Looker.
Automate dashboard refreshes and monitor data quality.
Maintain comprehensive documentation for dashboards.
Work closely with data engineers and analysts to ensure data governance and reliability.
Stay current with Databricks platform capabilities and dashboarding best practices.
Design, develop, test, and deploy data model and dashboard processes (batch or real-time) using tools such as Databricks, Power BI, etc.
Create functional and technical documentation, e.g., data model architecture documentation, unit testing plans and results, data integration specifications, and data testing plans.
Provide a consultative approach with business users, asking questions to understand the business need and deriving conceptual, logical, and physical data models based on those needs.
Perform data analysis to validate data models and to confirm the ability to meet business needs.
Stay current with emerging and changing technologies to recommend and implement beneficial technologies and approaches for data modeling and dashboarding.
Ensure proper execution/creation of methodology, training, templates, resource plans, and engagement review processes.
Coach team members to ensure understanding of projects and tasks, providing effective feedback (critical and positive) and promoting growth opportunities when appropriate.
Coordinate and consult with the project manager, client business staff, client technical staff, and project developers on data architecture best practices and anything else data-related at the project or business unit level.
Architect, design, develop, and set direction for enterprise self-service analytic solutions, business intelligence reports, visualizations, and best-practice standards. Toolsets include but are not limited to: Databricks, SQL Server Analysis and Reporting Services, Microsoft Power BI, Tableau, and Qlik.
Work with the report team to identify, design, and implement a reporting user experience that is consistent and intuitive across environments and report methods, defines security, and meets usability and scalability best practices.
Required Qualifications
8+ years of industry implementation experience with data warehousing tools such as AWS Redshift, Synapse, Databricks, Power BI, Tableau, Qlik, Looker, etc.
3+ years of experience in Databricks dashboard development.
3-5 years of development experience in decision support / business intelligence environments utilizing tools such as SQL Server Analysis and Reporting Services, Microsoft Power BI, Tableau, Looker, etc.
Proficient in SQL, data modeling, and query optimization.
Experience with Databricks SQL, Delta Lake, and notebook development.
Familiarity with BI visualization tools like Power BI, Tableau, or Looker.
Understanding of data warehousing, ETL/ELT pipelines, and cloud data platforms.
Bachelor's degree or equivalent experience; Master's degree preferred.
Strong background in data warehousing, OLTP systems, data integration, and the SDLC.
Strong experience with Agile processes (Scrum cadences, roles, deliverables) and working experience in Azure DevOps, JIRA, or similar, with experience in CI/CD using one or more code management platforms.
Experience with major database platforms (e.g., SQL Server, Oracle, Azure Data Lake, Hadoop, Azure Synapse/SQL Data Warehouse, Snowflake, Redshift, etc.).
Understanding of modern data warehouse capabilities and technologies such as real-time, cloud, and Big Data.
Understanding of on-premises and cloud infrastructure architectures (e.g., Azure, AWS, GCP).
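To make the Delta Lake and dashboard-serving work concrete, here is a minimal PySpark sketch that builds a small aggregate table a Databricks SQL dashboard could query. The table names are hypothetical, and it assumes a Databricks runtime where the spark session is preconfigured:

from pyspark.sql import functions as F

# `spark` is provided by the Databricks runtime; the source table is hypothetical.
orders = spark.read.table("sales.orders")

# Build a daily revenue aggregate sized for fast dashboard queries.
daily_revenue = (
    orders
    .groupBy(F.to_date("order_ts").alias("order_date"))
    .agg(F.sum("amount").alias("revenue"),
         F.countDistinct("customer_id").alias("customers"))
)

# Persist as a Delta table that the dashboard's SQL queries can hit directly.
daily_revenue.write.format("delta").mode("overwrite") \
    .saveAsTable("sales.daily_revenue")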
Posted 1 week ago
5.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Job Title: Lead Full Stack / Senior Full Stack Developer
Experience: 5+ years
Location: Noida (Work from Office)
Job Overview
We are seeking a highly skilled candidate with expertise in NodeJS, NestJS, React.js, Next.js, MySQL, Redshift, NoSQL, system design, and architecture. The ideal candidate will have strong workflow design and implementation skills; experience with queueing, caching, scalability, microservices, and AWS; and team leadership experience to manage a team of 10 developers. Knowledge of React Native and automation testing would be an added advantage.
Key Responsibilities
Architect and develop scalable backend systems using NestJS with a focus on high performance.
Lead a team of developers, ensuring adherence to best practices and Agile methodologies.
Work with databases including MySQL and NoSQL to ensure data integrity and performance.
Optimize system design for scalability, caching, and queueing mechanisms.
Collaborate with the frontend team working on Next.js and ensure seamless integration.
Ensure robust microservices architecture with proper API design and inter-service communication.
Work in an Agile environment, driving sprints and standups and ensuring timely delivery of projects.
Required Skills & Experience
Experience in software development, system design, and system architecture.
Strong expertise in NodeJS, NestJS, React.js, Next.js, MySQL, Redshift, NoSQL, AWS, and microservices architecture.
Expertise in queueing mechanisms, caching, scalability, and system performance optimization.
Good to have: knowledge of React Native and automation testing.
Strong leadership and management skills with experience in leading development teams.
Proficiency in Agile methodologies and sprint planning.
Excellent problem-solving skills and ability to work under pressure.
Qualifications: B.E / B.Tech
(ref:hirist.tech)
Posted 1 week ago
3.0 years
0 Lacs
Greater Kolkata Area
On-site
Job Description
Mactores is seeking an AWS Data Engineer (Senior) to join our team. The ideal candidate will have extensive experience in PySpark and SQL and have worked with data pipelines using Amazon EMR or Amazon Glue. The candidate must also have experience in data modeling and end-user querying using Amazon Redshift or Snowflake, Amazon Athena, and Presto, as well as orchestration experience using Airflow.
Responsibilities
Develop and maintain data pipelines using Amazon EMR or Amazon Glue.
Create data models and support end-user querying using Amazon Redshift or Snowflake, Amazon Athena, and Presto.
Build and maintain the orchestration of data pipelines using Airflow.
Collaborate with other teams to understand their data needs and help design solutions.
Troubleshoot and optimize data pipelines and data models.
Write and maintain PySpark and SQL scripts to extract, transform, and load data.
Document and communicate technical solutions to both technical and non-technical audiences.
Stay up-to-date with new AWS data technologies and evaluate their impact on our existing systems.
Requirements
Bachelor's degree in Computer Science, Engineering, or a related field.
3+ years of experience working with PySpark and SQL.
2+ years of experience building and maintaining data pipelines using Amazon EMR or Amazon Glue.
2+ years of experience with data modeling and end-user querying using Amazon Redshift or Snowflake, Amazon Athena, and Presto.
1+ years of experience building and maintaining the orchestration of data pipelines using Airflow.
Strong problem-solving and troubleshooting skills.
Excellent communication and collaboration skills.
Ability to work independently and within a team environment.
You Are Preferred If You Have
AWS Data Analytics Specialty Certification.
Experience with Agile development methodology.
(ref:hirist.tech)
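As a sketch of the programmatic end-user querying with Amazon Athena that this role mentions, here is a minimal boto3 example; the database, table, and results bucket are hypothetical:

import boto3

athena = boto3.client("athena", region_name="us-east-1")

# Kick off an Athena query; results land in the (hypothetical) S3 output location.
response = athena.start_query_execution(
    QueryString="SELECT page, COUNT(*) AS hits FROM web_logs GROUP BY page LIMIT 10;",
    QueryExecutionContext={"Database": "analytics_db"},
    ResultConfiguration={"OutputLocation": "s3://example-athena-results/"},
)
print("Query execution id:", response["QueryExecutionId"])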
Posted 1 week ago
0 years
0 Lacs
Greater Kolkata Area
On-site
We are seeking a talented Looker Expert with strong knowledge of PostgreSQL functions to join our team. In this role, you will utilize your expertise in Looker and PostgreSQL to design robust data models, create advanced visualizations, and optimize queries to drive business insights. You will collaborate closely with cross-functional teams to ensure that data is accessible, accurate, and actionable for decision-making.
Key Responsibilities
Looker Development: Design and develop LookML models, explores, and dashboards. Build user-friendly and scalable Looker solutions to meet business intelligence needs.
PostgreSQL Expertise: Write and optimize complex PostgreSQL queries and functions, including stored procedures, triggers, and views. Leverage PostgreSQL capabilities for efficient data transformations and analytics.
Data Modeling and Optimization: Develop and maintain data models in Looker using best practices for dimensional modeling. Optimize Looker dashboards and PostgreSQL queries for performance and scalability.
Collaboration: Partner with data engineers to ensure data pipelines align with analytical requirements. Work with business teams to gather requirements and deliver insights-driven solutions.
Quality Assurance and Maintenance: Validate data for accuracy and consistency in Looker visualizations and reports. Monitor and troubleshoot Looker dashboards and PostgreSQL performance issues.
Documentation and Training: Document PostgreSQL functions, LookML configurations, and data workflows. Train users on self-service analytics and Looker capabilities.
Requirements
Technical Expertise: Proven experience in Looker and LookML development. Strong proficiency in PostgreSQL, including advanced functions, triggers, and stored procedures. Expertise in writing, debugging, and optimizing SQL queries. Familiarity with data warehouse technologies (e.g., Snowflake, BigQuery, Redshift).
Data Knowledge: Understanding of relational databases, star schemas, and data modeling principles. Experience working with large datasets and designing efficient transformations.
Problem-Solving Skills: Ability to translate complex business requirements into actionable technical solutions. An analytical mindset to identify trends and insights within data.
Soft Skills: Strong communication skills to work effectively with both technical and non-technical stakeholders. Ability to multitask and prioritize in a dynamic environment.
Preferred Qualifications
Experience with ETL tools and processes.
Knowledge of data pipelines and ETL (extract, transform, and load).
Familiarity with other BI tools (e.g., Tableau, Power BI).
Knowledge of cloud-based data solutions (e.g., AWS, GCP, Azure).
Looker and PostgreSQL certifications are a plus.
(ref:hirist.tech)
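To illustrate the PostgreSQL function work this role emphasizes, here is a small sketch that creates and calls a PL/pgSQL function through psycopg2; the connection string and orders table are hypothetical:

import psycopg2

conn = psycopg2.connect("dbname=analytics user=looker_user")  # placeholder DSN

with conn, conn.cursor() as cur:
    # Define a simple PL/pgSQL function returning the order count for a customer.
    cur.execute("""
        CREATE OR REPLACE FUNCTION order_count(p_customer_id INT)
        RETURNS BIGINT AS $$
        BEGIN
            RETURN (SELECT COUNT(*) FROM orders WHERE customer_id = p_customer_id);
        END;
        $$ LANGUAGE plpgsql;
    """)
    # Call it like any scalar SQL function.
    cur.execute("SELECT order_count(%s);", (42,))
    print("Orders for customer 42:", cur.fetchone()[0])

conn.close()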
Posted 1 week ago
The job market for Redshift professionals in India is growing rapidly as more companies adopt cloud data warehousing solutions. Redshift, a powerful data warehouse service provided by Amazon Web Services, is in high demand due to its scalability, performance, and cost-effectiveness. Job seekers with expertise in Redshift can find a plethora of opportunities in various industries across the country.
The average salary range for Redshift professionals in India varies based on experience and location. Entry-level positions can expect a salary in the range of INR 6-10 lakhs per annum, while experienced professionals can earn upwards of INR 20 lakhs per annum.
In the field of Redshift, a typical career path may include roles such as:
- Junior Developer
- Data Engineer
- Senior Data Engineer
- Tech Lead
- Data Architect
Apart from expertise in Redshift, proficiency in the following skills can be beneficial:
- SQL
- ETL tools
- Data modeling
- Cloud computing (AWS)
- Python/R programming
As the demand for Redshift professionals continues to rise in India, job seekers should focus on honing their skills and knowledge in this area to stay competitive in the job market. By preparing thoroughly and showcasing their expertise, candidates can secure rewarding opportunities in this fast-growing field. Good luck with your job search!