3632 Redshift Jobs - Page 42

JobPe aggregates listings for easy access; you apply directly on the original job portal.

5.0 - 10.0 years

22 - 37 Lacs

Kolkata, Hyderabad, Bengaluru

Hybrid

Inviting applications for the role of Senior Principal Consultant - Data Engineer, AWS!

Locations: Bangalore, Hyderabad, Kolkata

Responsibilities:
- Develop, deploy, and manage ETL pipelines using AWS services, Python, Spark, and Kafka.
- Integrate structured and unstructured data from various sources into data lakes and data warehouses.
- Design and deploy scalable, highly available, and fault-tolerant AWS data processes using AWS data services (Glue, Lambda, Step Functions, Redshift).
- Monitor and optimize the performance of cloud resources to ensure efficient utilization and cost-effectiveness.
- Implement and maintain security measures to protect data and systems within the AWS environment, including IAM policies, security groups, and encryption mechanisms.
- Migrate application data from legacy databases to cloud-based solutions (Redshift, DynamoDB, etc.) for high availability at low cost.
- Develop application programs using Big Data technologies such as Apache Hadoop and Apache Spark together with appropriate cloud services such as AWS.
- Build data pipelines by building ETL (Extract-Transform-Load) processes.
- Implement backup, disaster recovery, and business continuity strategies for cloud-based applications and data.
- Analyse business and functional requirements, which involves reviewing existing system configurations and operating methodologies as well as understanding evolving business needs.
- Analyse requirements/user stories in business meetings, assess the impact of requirements on different platforms/applications, and convert business requirements into technical requirements.
- Participate in design reviews to provide input on functional requirements, product designs, schedules, and potential problems.
- Understand the current application infrastructure and suggest cloud-based solutions that reduce operational cost and require minimal maintenance while providing high availability and improved security.
- Perform unit testing on modified software to ensure that new functionality works as expected while existing functionality continues to work unchanged.
- Coordinate with release management and other supporting teams to deploy changes to the production environment.

Qualifications we seek in you!

Minimum Qualifications:
- Experience designing and implementing data pipelines, building data applications, and migrating data on AWS.
- Strong experience implementing data lakes using AWS services such as Glue, Lambda, Step Functions, and Redshift.
- Experience with Databricks is an added advantage.
- Strong experience in Python and SQL.
- Proven expertise in AWS services such as S3, Lambda, Glue, EMR, and Redshift.
- Advanced programming skills in Python for data processing and automation.
- Hands-on experience with Apache Spark for large-scale data processing.
- Experience with Apache Kafka for real-time data streaming and event processing.
- Proficiency in SQL for data querying and transformation.
- Strong understanding of security principles and best practices for cloud-based environments.
- Experience with monitoring tools and implementing proactive measures to ensure system availability and performance.
- Excellent problem-solving skills and the ability to troubleshoot complex issues in a distributed, cloud-based environment.
- Strong communication and collaboration skills to work effectively with cross-functional teams.

Preferred Qualifications/Skills:
- Master's degree in Computer Science, Electronics, or Electrical Engineering.
- AWS Data Engineering and Cloud certifications; Databricks certifications.
- Experience with multiple data integration technologies and cloud platforms.
- Knowledge of the Change & Incident Management process.
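
As a sketch of the Glue-based pipeline work this role describes, the following minimal AWS Glue PySpark job reads raw JSON from S3, filters and renames columns, and writes partitioned Parquet back to the lake. The paths, column names, and job arguments are hypothetical, not part of the posting:

```python
import sys
from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext

# Parse the arguments the Glue job runner passes in.
args = getResolvedOptions(sys.argv, ["JOB_NAME", "source_path", "target_path"])

sc = SparkContext()
glue_context = GlueContext(sc)
spark = glue_context.spark_session
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Extract: read raw JSON events from the data lake (S3).
raw = spark.read.json(args["source_path"])

# Transform: keep valid rows and normalize a column name.
# order_id, orderTotal, and order_date are illustrative fields only.
cleaned = (
    raw.filter("order_id IS NOT NULL")
       .withColumnRenamed("orderTotal", "order_total")
)

# Load: write partitioned Parquet back to S3 for Redshift Spectrum or COPY.
cleaned.write.mode("overwrite").partitionBy("order_date").parquet(args["target_path"])

job.commit()
```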

Posted 3 weeks ago

Apply

1.0 years

0 Lacs

Hyderabad, Telangana, India

Remote

Description

Want to join Earth's most customer-centric company? Do you like to dive deep to understand problems? Are you someone who likes to challenge the status quo? Do you strive to excel at the goals assigned to you? If yes, we have opportunities for you. Global Operations – Artificial Intelligence (GO-AI) at Amazon is looking to hire candidates who can excel in a fast-paced, dynamic environment. Are you somebody who likes to use and analyze big data to drive business decisions? Do you enjoy converting data into insights that will be used to enhance customer decisions worldwide for business leaders? Do you want to be part of the data team which measures the pulse of innovative machine-vision-based projects? If your answer is yes, join our team.

GO-AI is looking for a motivated individual with strong skills and experience in resource utilization planning, process optimization, and execution of scalable and robust operational mechanisms to join the GO-AI Ops DnA team. In this position you will be responsible for supporting our sites to build solutions for the rapidly expanding GO-AI team. The role requires the ability to work with a variety of key stakeholders across job functions and multiple sites. We are looking for an entrepreneurial and analytical program manager who is passionate about their work, understands how to manage service levels across multiple skills/programs, and is willing to move fast and experiment often.

Key job responsibilities:
- Design and develop highly available dashboards and metrics using SQL and Excel/Tableau.
- Execute high-priority (i.e., cross-functional, high-impact) projects to create robust, scalable analytics solutions and frameworks with the help of Analytics/BIE managers.
- Work closely with internal stakeholders such as business teams, engineering teams, and partner teams, and align them with respect to your focus area.
- Create and maintain comprehensive business documentation, including user stories, acceptance criteria, and process flows, that helps the BIE understand the context for developing ETL processes and visualization solutions.
- Perform user acceptance testing and business validation of delivered dashboards and reports, ensuring that BIE-created solutions meet actual operational needs and can be used effectively by site managers and operations teams.
- Monitor business performance metrics and operational KPIs to proactively identify emerging analytical requirements, working with BIEs to rapidly develop solutions that address real-time operational challenges in the dynamic AI-enhanced fulfillment environment.

About the team
The Global Operations – Artificial Intelligence (GO-AI) team remotely handles exceptions in Amazon Robotic Fulfillment Centers globally. GO-AI seeks to complement automated vision-based decision-making technologies by providing remote human support for the subset of tasks that require higher cognitive ability and cannot be processed through automated decision-making with high confidence. The team provides end-to-end solutions through inbuilt Operations competencies and strong central specialized teams to deliver programs at Amazon scale. It operates multiple programs, including Nike IDS, Proteus, Sparrow, and other new initiatives, in partnership with global technology and operations teams.

Basic Qualifications:
- Experience defining requirements and using data and metrics to draw business insights.
- Knowledge of SQL.
- Knowledge of data visualization tools such as QuickSight, Tableau, Power BI, or other BI packages.
- Knowledge of Python, VBA, macros, and Selenium scripts.
- 1+ year of experience working in an Analytics/Business Intelligence environment, with prior experience designing and executing analytical projects.

Preferred Qualifications:
- Experience using AI tools.
- Experience with Amazon Redshift and other AWS technologies for large datasets.
- An analytical mindset, with the ability to see the big picture and influence others.
- Detail-oriented, with an aptitude for solving unstructured problems; the role requires the ability to extract data from various sources and to design, construct, and execute complex analyses that yield the data and reports needed to solve the business problem.
- Good oral, written, and presentation skills, combined with the ability to take part in group discussions and explain complex solutions.
- Ability to apply analytical, computing, statistical, and quantitative problem-solving skills.
- Ability to work effectively in a multi-task, high-volume environment.
- Adaptable and flexible in responding to deadlines and workflow fluctuations.

Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you're applying in isn't listed, please contact your Recruiting Partner.

Company: Amazon Dev Center India - Hyderabad
Job ID: A3027310
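
As a concrete illustration of the SQL-plus-dashboard work this role centers on, here is a hedged sketch of pulling a weekly exception-handling KPI from Redshift with Python (psycopg2 works because Redshift speaks the PostgreSQL wire protocol). The cluster endpoint, schema, and metric names are invented for illustration, not Amazon's actual systems:

```python
import psycopg2  # Redshift is PostgreSQL-wire-compatible

# Hypothetical daily KPI feeding a Tableau/QuickSight extract.
DAILY_KPI_SQL = """
    SELECT site_id,
           DATE_TRUNC('day', resolved_at) AS day,
           COUNT(*)                       AS exceptions_handled,
           AVG(handle_seconds)            AS avg_handle_seconds
    FROM go_ai.exception_events
    WHERE resolved_at >= DATEADD(day, -7, GETDATE())
    GROUP BY 1, 2
    ORDER BY 1, 2;
"""

conn = psycopg2.connect(
    host="example-cluster.abc123.us-east-1.redshift.amazonaws.com",
    port=5439, dbname="analytics", user="report_user", password="...",
)
with conn, conn.cursor() as cur:
    cur.execute(DAILY_KPI_SQL)
    for site_id, day, handled, avg_secs in cur.fetchall():
        print(site_id, day, handled, round(avg_secs, 1))
```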

Posted 3 weeks ago

Apply

5.0 - 8.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Job Title: Informatica IDMC Developer
Skills: Informatica Intelligent Data Management Cloud (IDMC/IICS), SQL, AWS/Azure/GCP, CI/CD pipelines, Snowflake, Redshift, or BigQuery
Experience Required: 5-8 years
Job Location: Greater Noida only
Send your CV to Gaurav.2.Kumar@coforge.com

We at Coforge are hiring an Informatica IDMC Developer with the following skill set:

Key Responsibilities:
- Design, develop, and maintain robust ETL pipelines using Informatica IDMC (IICS).
- Collaborate with data architects, analysts, and business stakeholders to gather and understand data requirements.
- Integrate data from diverse sources including databases, APIs, and flat files.
- Optimize data workflows for performance, scalability, and reliability.
- Monitor and troubleshoot ETL jobs and resolve data quality issues.
- Implement data governance and security best practices.
- Maintain comprehensive documentation of data flows, transformations, and architecture.
- Participate in code reviews and contribute to continuous improvement initiatives.

Required Skills & Qualifications:
- Strong hands-on experience with Informatica IDMC (IICS) and cloud-based ETL tools.
- Proficiency in SQL and experience with relational databases such as Oracle, SQL Server, and PostgreSQL.
- Experience working with cloud platforms such as AWS, Azure, or GCP.
- Familiarity with data warehousing concepts and tools such as Snowflake, Redshift, or BigQuery.
- Excellent problem-solving abilities and strong communication skills.

Preferred Qualifications:
- Experience with CI/CD pipelines and version control systems.
- Knowledge of data modeling and metadata management.
- Certification in Informatica or cloud platforms is a plus.
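
IDMC itself is configured through Informatica's own cloud tooling, so no code example of its API belongs here, but the "resolve data quality issues" duty can be sketched generically: a reconciliation script that compares source and target aggregates after a load. Connection strings, schemas, and tolerances below are hypothetical, and both endpoints are assumed to be PostgreSQL-compatible:

```python
import psycopg2

# Hypothetical DSNs; in practice these come from a secrets manager.
SOURCE_DSN = "host=stg.example.com port=5432 dbname=staging user=qa password=..."
TARGET_DSN = "host=wh.example.com port=5439 dbname=dwh user=qa password=..."

CHECKS = [
    # (source query, target query, tolerance)
    ("SELECT COUNT(*) FROM stg.orders", "SELECT COUNT(*) FROM mart.orders", 0),
    ("SELECT SUM(amount) FROM stg.orders", "SELECT SUM(amount) FROM mart.orders", 0.01),
]

def scalar(dsn, sql):
    # Run a query that returns a single value and hand it back.
    with psycopg2.connect(dsn) as conn, conn.cursor() as cur:
        cur.execute(sql)
        return cur.fetchone()[0]

for src_sql, tgt_sql, tol in CHECKS:
    src, tgt = scalar(SOURCE_DSN, src_sql), scalar(TARGET_DSN, tgt_sql)
    status = "OK" if abs(src - tgt) <= tol else "MISMATCH"
    print(f"{status}: {src_sql!r} -> source={src} target={tgt}")
```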

Posted 3 weeks ago

Apply

4.0 years

0 Lacs

Delhi, India

On-site

What do you need to know about us? M+C Saatchi Performance is an award-winning global digital media agency, connecting brands to people. We deliver business growth for our clients through effective, measurable, and evolving digital media strategies.

Position Title: Analyst - Reporting & QA
Department: Reporting & QA
Location: New Delhi (hybrid options)

About the Role:
We are looking for a highly skilled Analyst - Reporting & QA with a deep understanding of digital and mobile media to join our Reporting and QA team. This role focuses on enabling our clients to meet their media goals by ensuring data accuracy and delivering actionable insights into media performance through our reporting tools. The ideal candidate will have strong technical skills, be detail-oriented, and have experience in digital/mobile media attribution and reporting.

Core Responsibilities:
- ETL & Data Automation: Use Matillion to streamline data processes, ensuring efficient and reliable data integration across all reporting systems.
- Data Quality Assurance: Verify and validate data accuracy within Power BI dashboards, proactively identifying and addressing discrepancies to maintain high data integrity.
- Dashboard Development: Build, maintain, and optimize Power BI dashboards to deliver real-time insights that help clients understand the performance of their digital and mobile media campaigns.
- Media Performance Insights: Collaborate closely with media teams to interpret data, uncover trends, and provide actionable insights that support clients in optimizing their media investments.
- Industry Expertise: Apply in-depth knowledge of digital and mobile media, attribution models, and reporting frameworks to deliver valuable perspectives on media performance.
- Tools & Platforms Expertise: Use tools such as GA4, platform reporting systems, first-party data analytics, and mobile measurement partners (MMPs) to support comprehensive media insights for clients.

Qualifications and Experience:
- Education: Bachelor's degree in Statistics, Data Science, Computer Science, Marketing, or a related field.
- Experience: 4-6 years in a similar role, with substantial exposure to data analysis, reporting, and the digital/mobile media landscape.
- Technical Skills: Proficiency in ETL tools (preferably Matillion), Power BI, and data quality control.
- Industry Knowledge: Strong understanding of digital and mobile media, with familiarity with attribution, reporting practices, and performance metrics.
- Analytical Skills: Skilled in interpreting complex data, generating actionable insights, and presenting findings effectively to non-technical stakeholders.
- Communication: Excellent communicator with a proven ability to collaborate effectively across cross-functional teams and with clients.
- Tools & Platforms: Proficiency in GA4, platform reporting, first-party data analysis, and mobile measurement partners (MMPs).

Desired Skills:
- Background in a media agency environment.
- Experience with cloud-based data platforms (e.g., AWS, Redshift) preferred.
- Experience with Power BI is a must.
- Strong collaboration skills and the ability to work independently.

What can you look forward to?
- Being part of the world's largest independent advertising holding group.
- Family health insurance coverage.
- Flexible working hours.
- Regular events, including Reece Lunch and indoor games.
- Employee training/learning programs.

About M+C Saatchi Performance
M+C Saatchi Performance has pledged its commitment to create a company that values difference, with an inclusive culture. As part of this, M+C Saatchi Performance continues to be an Equal Opportunity Employer which does not and shall not discriminate, celebrates diversity, and bases all hiring and promotion decisions solely on merit, without regard for any personal characteristics. All employee information is kept confidential according to the General Data Protection Regulation (GDPR).
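
The data quality assurance duty above can be illustrated with a small pandas check run against a reporting extract before it reaches Power BI. The file name and columns (spend, installs, cpi) are invented for illustration:

```python
import pandas as pd

# Illustrative QA pass on a campaign-performance extract.
df = pd.read_csv("campaign_performance.csv", parse_dates=["report_date"])

issues = []

# Completeness: no missing keys or metrics.
for col in ["campaign_id", "report_date", "spend", "installs"]:
    if df[col].isna().any():
        issues.append(f"{col}: {df[col].isna().sum()} missing values")

# Validity: spend must be non-negative.
if (df["spend"] < 0).any():
    issues.append("negative spend values found")

# Consistency: reported CPI should match spend / installs within rounding.
# where() leaves NaN when installs is zero, and NaN comparisons are False.
cpi = df["spend"] / df["installs"].where(df["installs"] > 0)
if ((cpi - df["cpi"]).abs() > 0.01).any():
    issues.append("reported CPI disagrees with spend/installs")

print("\n".join(issues) if issues else "all checks passed")
```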

Posted 3 weeks ago

Apply

6.0 - 11.0 years

12 - 22 Lacs

Bengaluru

Work from Office

Our engineering team is looking for a Data Engineer who is highly proficient in Python, has a very good understanding of AWS cloud computing and ETL pipelines, and demonstrates proficiency with SQL and relational database concepts. In this role you will be a mid- to senior-level individual contributor guiding our migration efforts, serving as a senior data engineer and working closely with the data architects to evaluate best-fit solutions and processes for our team. You will work with the rest of the team as we move away from legacy tech and introduce new tools and ETL pipeline solutions. You will collaborate with subject matter experts, data architects, informaticists, and data scientists to evolve our current cloud-based ETL to the next generation.

Responsibilities:
- Independently prototype and develop data solutions of high complexity to meet the needs of the organization and business customers.
- Design proof-of-concept solutions utilizing an advanced understanding of multiple coding languages to meet technical and business requirements, with an ability to perform iterative solution testing to ensure specifications are met.
- Design and develop data solutions that enable effective self-service data consumption, and describe their value to the customer.
- Collaborate with stakeholders in defining metrics that are impactful to the business; prioritize efforts based on customer value.
- Apply an in-depth understanding of Agile techniques; set expectations for deliverables of high complexity; assist in the creation of roadmaps for data solutions; turn vague ideas or problems into data product solutions.
- Influence strategic thinking across the team and the broader organization.
- Maintain proof-of-concept and prototype data solutions, and manage assessment of their viability and scalability, with your own team or in partnership with IT.
- Working with IT, assist in building robust systems focused on long-term and ongoing maintenance and support; ensure data solutions include the deliverables required to achieve high-quality data.
- Display a strong understanding of complex multi-tier, multi-platform systems, and apply principles of metadata, lineage, business definitions, compliance, and data security to project work.
- Apply an in-depth understanding of Business Intelligence tools, including visualization and user experience techniques; work with IT to help scale prototypes.
- Demonstrate a comprehensive understanding of new technologies as needed to progress initiatives.

Requirements:
- Expertise in Python programming, with demonstrated real-world experience building out data tools in a Python environment.
- Expertise in AWS services, with demonstrated real-world experience building out data tools on AWS.
- Bachelor's degree in Computer Science, Computer Engineering, or a related discipline preferred; Master's in the same or a related discipline strongly preferred.
- 3+ years of experience coding for data management, data warehousing, or other data environments, including, but not limited to, Python and Spark; experience with SAS is preferred.
- 3+ years of experience as a developer working in an AWS cloud computing environment.
- 3+ years of experience using Git or Bitbucket.
- Experience with Redshift, RDS, and DynamoDB is preferred.
- Strong written and oral communication skills.
- Experience in the healthcare industry with healthcare data analytics products.
- Experience with healthcare vocabulary and data standards (OMOP, FHIR) is a plus.
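
One concrete migration/pipeline step this role implies is bulk-loading staged files into Redshift. A minimal sketch using COPY (Redshift's standard bulk-load command) over psycopg2 follows; the cluster endpoint, S3 bucket, IAM role, and table names are placeholders:

```python
import psycopg2

# Illustrative load step: Parquet files staged on S3 are pulled into
# Redshift in parallel by COPY. All identifiers below are hypothetical.
COPY_SQL = """
    COPY analytics.claims_events
    FROM 's3://example-datalake/claims/2024/'
    IAM_ROLE 'arn:aws:iam::123456789012:role/redshift-copy-role'
    FORMAT AS PARQUET;
"""

conn = psycopg2.connect(
    host="example-cluster.abc.us-east-1.redshift.amazonaws.com",
    port=5439, dbname="dwh", user="etl_user", password="...",
)
with conn, conn.cursor() as cur:
    cur.execute(COPY_SQL)                            # parallel bulk load
    cur.execute("ANALYZE analytics.claims_events;")  # refresh planner stats
print("load complete")
```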

Posted 3 weeks ago

Apply

0 years

0 Lacs

Hyderabad, Telangana, India

Remote

We are looking for a Test Engineer who will become part of our team building and testing Creditsafe data. You will work closely with the database and data engineering teams to build specific systems facilitating the extraction and transformation of Creditsafe data. Based on the test strategy and approach, you will develop, enhance, and execute tests that add value to Creditsafe data. You will act as a primary source of guidance to Junior Test Engineers and Test Engineers in all areas of data quality, and contribute to the team using data-quality best practices and techniques. You can confidently communicate test results with your team members and stakeholders using evidence and reports. You will act as a mentor and coach to less experienced members of the test team, and will promote and coach leading practices in data test management, design, and implementation. You will be part of an Agile team, contributing effectively to its ceremonies and acting as the quality specialist within that team. You are an influencer: you will provide leadership in defining and implementing agreed standards and will actively promote them within your team and the wider development community. The ideal candidate has extensive experience in mentorship and leading by example, and is able to communicate values consistent with the Creditsafe philosophy of engagement. You have critical-thinking skills and can diplomatically communicate within and outside your areas of responsibility, challenging assumptions where required.

Required Skills:
- Proven working experience as a data test engineer, business data analyst, or ETL tester.
- Technical expertise in data models, database design and development, data mining, and segmentation techniques.
- Strong knowledge of and experience with SQL databases.
- Hands-on experience with engineering best practices (handling and logging errors, system monitoring, and building human-fault-tolerant applications).
- Knowledge of statistics and experience using statistical packages for analysing datasets (Excel, SPSS, SAS, etc.) is an advantage.
- Comfortable working with relational databases such as Redshift, Oracle, PostgreSQL, MySQL, and MariaDB (PostgreSQL preferred).
- Strong analytical skills, with the ability to collect, organise, analyse, and disseminate significant amounts of information with attention to detail and accuracy.
- Adept at queries, report writing, and presenting findings.
- A BS in Mathematics, Economics, Computer Science, Information Management, or Statistics is desirable but not essential.
- A good understanding of cloud technology, preferably AWS and/or Azure DevOps.
- A practical understanding of programming: JavaScript, Python.
- Excellent communication skills.
- Practical experience of testing in an Agile approach.

Desirable Skills:
- An understanding of version control systems.
- Practical experience of conducting code reviews.
- Practical experience of pair testing and pair programming.

Primary Responsibilities:
- Reports to the Engineering Lead.
- Work as part of the engineering team in data acquisition.
- Design and implement processes and tools to monitor and improve the quality of Creditsafe's data.
- Develop and execute test plans to verify the accuracy and reliability of data.
- Work with data analysts and other stakeholders to establish and maintain data governance policies.
- Identify and resolve issues with the data, such as errors, inconsistencies, or duplication.
- Collaborate with other teams, such as data analysts and data scientists, to ensure the quality of data used for various projects and initiatives.
- Provide training and guidance to other team members on data-quality best practices and techniques.
- Monitor and report on key data-quality metrics, such as data completeness and accuracy.
- Continuously improve data-quality processes and tools based on feedback and analysis.
- Work closely with the Agile team to promote a whole-team approach to quality.
- Document approaches and processes that improve the quality effort for use by team members and the wider test function.
- Apply strong practical knowledge of software testing techniques, advising on and selecting the correct technique depending on the problem at hand.
- Conduct analysis of the team's test approach, taking a proactive role in the formulation of the relevant quality criteria in line with the team goals.
- Work with team members to define standards and processes applicable to their area of responsibility.
- Monitor the progress of team deliverables, injecting quality concerns in a timely, effective manner.
- Gain a sufficient understanding of the system architecture to inform your test approach and that of the test engineers.
- Create and maintain concise and accurate defect reports in line with the established defect process.

Behavioural skills:
- Teamwork: leads by example in the areas of cooperation, collaboration, and partnerships.
- Quality improvement: takes the initiative to deliver improvements and results of value.
- Problem solving: identifies and prioritises problems and works to deliver workable solutions.
- Feedback: seeks feedback from team members and provides feedback to them; appreciates others' viewpoints, frequently soliciting opinions that differ from their own; promotes an inclusive, merit-based approach to differing opinions.
- Autonomy: able to work independently within the constraints of the Agile team; able to determine when issues should be escalated; takes responsibility for, and can rationalize, their own decisions.
- Influence: interacts with and influences colleagues in a positive manner; undertakes supervisory activities; makes decisions which impact and optimise the work assigned to individuals or projects; aspires to be regarded as the SME for quality-related issues.
- Complexity: able to grasp complex concepts, form an understanding, and explain them to other team members; able to articulate complex concepts to stakeholders in a non-technical manner; performs a range of work, sometimes complex and non-routine, in a variety of environments; applies a methodical approach to issue definition and resolution.

Business skills:
- Demonstrates an analytical and systematic approach to issue resolution, acting as the primary contact within the team.
- Takes the initiative in identifying and negotiating appropriate personal development opportunities with less experienced test team members.
- Demonstrates effective communication skills and can vary message presentation depending on the stakeholder.
- Plans, schedules, and monitors their own work (and that of others) competently within limited deadlines and according to relevant legislation, standards, and procedures.
- Appreciates the wider business context and how their own role relates to other roles and to the business objectives of Creditsafe.

Company Benefits:
- Competitive salary
- Work from home
- Pension
- Medical insurance
- Cab facility for women
- Dedicated gaming area
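
To make the "develop and execute test plans to verify the accuracy and reliability of data" duty concrete, here is a minimal pytest-style sketch of two data-quality tests against a PostgreSQL-compatible warehouse. The DSN, tables, and rules are hypothetical:

```python
import psycopg2
import pytest

# Illustrative connection string; real credentials belong in a vault.
DSN = "host=wh.example.com port=5439 dbname=reports user=qa password=..."

@pytest.fixture(scope="module")
def cursor():
    # One shared connection for the whole test module.
    conn = psycopg2.connect(DSN)
    yield conn.cursor()
    conn.close()

def scalar(cur, sql):
    cur.execute(sql)
    return cur.fetchone()[0]

def test_no_duplicate_company_ids(cursor):
    # Uniqueness check: every company_id appears exactly once.
    dupes = scalar(cursor, """
        SELECT COUNT(*) FROM (
            SELECT company_id FROM companies
            GROUP BY company_id HAVING COUNT(*) > 1
        ) d;
    """)
    assert dupes == 0

def test_scores_within_valid_range(cursor):
    # Validity check: credit scores stay inside the documented 0-100 range.
    bad = scalar(
        cursor,
        "SELECT COUNT(*) FROM credit_scores WHERE score NOT BETWEEN 0 AND 100;",
    )
    assert bad == 0
```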

Posted 3 weeks ago

Apply

5.0 years

0 Lacs

Hyderabad, Telangana, India

Remote

Job Title: Database Engineer x 8 Positions
Location: Hyderabad, India
Salary: Market Rate/Negotiable

About us
Creditsafe is the most used business data provider in the world, reducing risk and maximizing opportunities for our 110,000 business customers. Our journey began in Oslo, Norway in 1997, where we had a dream of using the then revolutionary internet to deliver instant-access company credit reports to small and medium-sized businesses. Creditsafe realized this dream and changed the market for the better for businesses of all sizes. From there, we opened 15 more offices throughout Europe, the USA, and Asia. We provide data on more than 300 million companies and provide customer notifications for billions of changes annually. We are a high-growth company offering the freedom and flexibility of a start-up type culture, due to the continuous innovation and new product development performed, coupled with the stability of being a profitable and growing company! With such a large customer base and breadth of data and analytics technology, you will have real opportunities to help companies survive and thrive in challenging times by reducing business risk and choosing trustworthy customers and suppliers.

Summary:
This is your opportunity to develop your career with an exciting, fast-paced and rapidly expanding business, one of the leading providers of Business Intelligence worldwide. As a Database Engineer with excellent database development skills, you will be responsible for developing and maintaining the databases and scripts that power the company's products and websites, handling large data sets and more than 20 million hits per day. You will work with your team to deliver work on time, in line with the business requirements, and to a high level of quality.

Primary Requirements:
- 5+ years' solid commercial experience of Oracle development in a 10g or 11g environment.
- Advanced PL/SQL knowledge required.
- ETL skills; Pentaho would be beneficial.
- Any wider DB experience would be desirable, e.g., Redshift, Aurora, DynamoDB, MariaDB, MongoDB.
- Cloud/AWS experience and an interest in learning new technologies.
- Experience in tuning Oracle queries in large databases.
- Good experience in loading and extracting large data sets.
- Experience of working with an Oracle database under a bespoke web development environment.
- Analytical and critical-thinking skills; agile problem-solving abilities.
- Detail-oriented, self-motivated, able to work independently with little or no supervision, and committed to the highest standards of quality for the entire release process.
- Excellent written and verbal communication skills, and attention to detail.
- Ability to work in a fast-paced, changing environment, and to thrive in a deadline-driven, stressful project environment.
- 3+ years of software development experience.

Qualifications and Experience:
- Degree in Computer Science or similar.
- Experience with loading data through SSIS.
- Experience working on financial and business intelligence projects or in big data environments.
- A desire to learn new skills and branch into development using a wide range of alternative technologies.

Skills, Knowledge and Abilities:
- Write code for new development requirements as well as provide bug fixing, support, and maintenance of existing code.
- Test your code to ensure it functions as per the business requirements, considering the impact of your code on other areas of the solution.
- Provide expert advice on performance tuning within Oracle.
- Perform large-scale imports and extracts of data.
- Assist the business in the collection and documentation of user requirements where needed; provide estimates and work plans.
- Create and maintain technical documentation.
- Follow all company procedures, standards, and processes.
- Contribute to architectural design and development, making technically sound development recommendations.
- Provide support to other staff in the department and act as a mentor to less experienced staff, including through code reviews.
- Work as a team player in an agile environment.
- Build release scripts and plans to facilitate the deployment of your code to testing and production environments.
- Take ownership of any issues that occur within your area to ensure an appropriate solution is found.
- Assess opportunities for application and process improvement and share them with team members and/or affected parties.

Company Benefits:
- Competitive salary
- Work from home
- Pension
- Medical insurance
- Cab facility for women
- Dedicated gaming area
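
As an illustration of the "loading and extracting large data sets" requirement, a minimal Python sketch using python-oracledb with a tuned fetch size follows; the DSN, credentials, and schema are placeholders:

```python
import csv
import oracledb  # python-oracledb, the successor to cx_Oracle

# Illustrative bulk extract: stream yesterday's changed rows out to CSV,
# fetching in large batches rather than row by row.
conn = oracledb.connect(user="report", password="...", dsn="dbhost/ORCLPDB1")

with conn.cursor() as cur, open("companies.csv", "w", newline="") as fh:
    cur.arraysize = 5000  # fetch 5,000 rows per round trip
    cur.execute("""
        SELECT company_id, name, country_code, last_updated
        FROM companies
        WHERE last_updated >= TRUNC(SYSDATE) - 1
    """)
    writer = csv.writer(fh)
    writer.writerow([d[0] for d in cur.description])  # header from metadata
    for batch in iter(lambda: cur.fetchmany(), []):   # fetchmany honors arraysize
        writer.writerows(batch)

conn.close()
```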

Posted 3 weeks ago

Apply

5.0 - 8.0 years

0 Lacs

India

On-site

Bloomreach is building the world's premier agentic platform for personalization. We're revolutionizing how businesses connect with their customers, building and deploying AI agents to personalize the entire customer journey. We're taking autonomous search mainstream, making product discovery more intuitive and conversational for customers, and more profitable for businesses. We're making conversational shopping a reality, connecting every shopper with tailored guidance and product expertise, available on demand, at every touchpoint in their journey. We're designing the future of autonomous marketing, taking the work out of workflows, and reclaiming the creative, strategic, and customer-first work marketers were always meant to do. And we're building all of that on the intelligence of a single AI engine, Loomi AI, so that personalization isn't only autonomous, it's also consistent. From retail to financial services, hospitality to gaming, businesses use Bloomreach to drive higher growth and lasting loyalty. We power personalization for more than 1,400 global brands, including American Eagle, Sonepar, and Pandora.

We are seeking a Senior AI Engineer to join our dynamic team. In this role, you will be instrumental in building data-driven ML/AI algorithms that enhance our search and recommendation systems. Your primary focus will be on data engineering, analysis, transformations, model training, and serving, ensuring practical and scalable applications of machine learning within our products. This position emphasizes productization and the implementation of ML/AI solutions over pure data science and research, making it ideal for professionals thriving in the fast-paced generative AI era.

Key Responsibilities

Data Engineering & Analysis:
- Slice and dice analytics data to formulate hypotheses and generate ideas to improve search and recommendation performance.
- Perform comprehensive data transformations to prepare datasets for model training and evaluation.
- Build and maintain data pipelines using tools like Airflow, Kubeflow, and MLflow to support ML/AI workflows.

Model Development & Deployment:
- Design, develop, and enhance machine learning and AI models tailored to product discovery and search functionalities.
- Conduct feature engineering to extract meaningful insights from historical data, search queries, product catalogs, and images.
- Collaborate with Data Engineers to integrate and scale ML components to production-level systems capable of handling large-scale data.
- Ensure seamless deployment of models, maintaining high availability and performance in cloud environments.

Algorithm Implementation & Optimization:
- Dive deep into algorithm applicability, performing impact analysis to ensure models meet performance and business objectives.
- Optimize and build new algorithms to address various challenges in product discovery and search.

Productization of ML/AI Solutions:
- Translate data-driven insights and models into actionable product features that enhance user experience.
- Work closely with Data Science, Product, and Engineering teams to implement practical ML/AI applications that drive business outcomes.

Continuous Learning & Improvement:
- Stay abreast of the latest advancements in ML/AI, particularly in generative AI and large language models (LLMs).
- Continuously refine and improve existing models and workflows based on new research and industry trends.

Qualifications

Educational Background: BS/MS degree in Computer Science, Engineering, Mathematics, or a related discipline with a strong mathematical foundation.

Experience: 5-8 years of experience building ML-driven, fast, and scalable ML/AI algorithms in a corporate or startup environment.

Technical Skills:
- Proficient in Python with excellent programming skills.
- Strong understanding of machine learning and natural language processing technologies, including classification, information retrieval, clustering, knowledge graphs, semi-supervised learning, and ranking.
- Experience with deep learning frameworks such as PyTorch, Keras, or TensorFlow.
- Proficient in SQL and experienced with data warehouses like Redshift or BigQuery.
- Experience with big data technologies such as Hadoop, Spark, Kafka, and data lakes for large-scale processing.
- Strong understanding of data structures, algorithms, and system design for building highly available, high-performance systems.
- Experience with workflow orchestration and ML pipeline tools such as Airflow, Kubeflow, and MLflow.

Specialized Knowledge:
- Strong awareness of recent trends in Generative AI and Large Language Models (LLMs).
- Experience working with the GenAI stack is highly desirable.

Soft Skills:
- Excellent problem-solving and analytical skills, with the ability to adapt to new ML technologies.
- Effective communication skills in English, both verbal and written.
- Ability to work collaboratively in a fast-paced, agile environment.

More things you'll like about Bloomreach:

Culture:
- A great deal of freedom and trust. At Bloomreach we don't clock in and out, and we have neither corporate rules nor long approval processes. This freedom goes hand in hand with responsibility. We are interested in results from day one.
- We have defined our 5 values and the 10 underlying key behaviors that we strongly believe in. We can only succeed if everyone lives these behaviors day to day. We've embedded them in our processes like recruitment, onboarding, feedback, personal development, performance review, and internal communication.
- We believe in flexible working hours to accommodate your working style.
- We work virtual-first, with several Bloomreach Hubs available across three continents.
- We organize company events to experience the global spirit of the company and get excited about what's ahead.
- We encourage and support our employees to engage in volunteering activities; every Bloomreacher can take 5 paid days off to volunteer.*
- The Bloomreach Glassdoor page elaborates on our stellar 4.4/5 rating. The Bloomreach Comparably page Culture score is even higher, at 4.9/5.

Personal Development:
- We have a People Development Program: participating in personal development workshops on various topics run by experts from inside the company. We are continuously developing and updating competency maps for select functions.
- Our resident communication coach Ivo Večeřa is available to help navigate work-related communications and decision-making challenges.*
- Our managers are strongly encouraged to participate in the Leader Development Program to develop in the areas we consider essential for any leader. The program includes regular comprehensive feedback, consultations with a coach, and follow-up check-ins.
- Bloomreachers utilize the $1,500 professional education budget on an annual basis to purchase education products (books, courses, certifications, etc.).*

Well-being:
- The Employee Assistance Program, with counselors, is available for non-work-related challenges.*
- Subscription to Calm, a sleep and meditation app.*
- We organize 'DisConnect' days where Bloomreachers globally enjoy one additional day off each quarter, allowing us to unwind together and focus on activities away from the screen with our loved ones.
- We facilitate sports, yoga, and meditation opportunities for each other.
- Extended parental leave of up to 26 calendar weeks for Primary Caregivers.*

Compensation:
- Restricted Stock Units or Stock Options are granted depending on a team member's role, seniority, and location.*
- Everyone gets to participate in the company's success through the company performance bonus.*
- We offer an employee referral bonus of up to $3,000, paid out immediately after the new hire starts.
- We reward and celebrate work anniversaries -- Bloomversaries!*

(*Subject to employment type. Interns are exempt from marked benefits, usually for the first 6 months.)

Excited? Join us and transform the future of commerce experiences! If this position doesn't suit you but you know someone who might be a great fit, share it - we will be very grateful!

Any unsolicited resumes/candidate profiles submitted through our website or to personal email accounts of employees of Bloomreach are considered property of Bloomreach and are not subject to payment of agency fees.
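
As a toy illustration of the feature-engineering-plus-ranking work described above, the sketch below trains a click-through model on synthetic features and ranks candidates by predicted click probability. All features, data, and coefficients are fabricated for demonstration; production search-ranking systems are far more involved:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Synthetic (query, product) features: names are invented for illustration.
rng = np.random.default_rng(0)
n = 10_000
X = np.column_stack([
    rng.random(n),          # text-match score between query and title
    rng.random(n),          # normalized product popularity
    rng.integers(0, 2, n),  # in-stock flag
])
# Fabricated labels: clicks grow likelier with better match and popularity.
p = 1 / (1 + np.exp(-(3 * X[:, 0] + 1.5 * X[:, 1] + 0.5 * X[:, 2] - 2.5)))
y = rng.random(n) < p

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
model = LogisticRegression().fit(X_tr, y_tr)
print(f"holdout accuracy: {model.score(X_te, y_te):.3f}")

# At serving time, rank candidate products by predicted click probability.
scores = model.predict_proba(X_te[:5])[:, 1]
print("ranking order:", np.argsort(-scores))
```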

Posted 3 weeks ago

Apply

5.0 years

0 Lacs

India

On-site

We’re on the lookout for a skilled and motivated Data Engineer to join our growing tech team. If you’re passionate about building robust data pipelines, optimizing data workflows, and enabling smart data-driven decisions — we’d love to connect with you!

Key Responsibilities:
- Design, build, and maintain scalable ETL/ELT pipelines.
- Integrate data from multiple sources into centralized data stores.
- Work closely with Data Analysts and Scientists to support analytical needs.
- Optimize data delivery for performance and reliability.
- Ensure data integrity, quality, and compliance.

Preferred Skills & Experience:
- 2–5 years of experience in Data Engineering.
- Strong knowledge of SQL, Python, and Spark/PySpark.
- Experience with data warehousing (e.g., Snowflake, Redshift, BigQuery).
- Hands-on with ETL tools, data pipelines, and APIs.
- Familiarity with cloud platforms (Azure, AWS, or GCP).
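
A minimal sketch of the ETL/ELT pipeline work named above: pull recently updated records from a REST API and upsert them idempotently into a PostgreSQL-compatible staging table. The endpoint, payload shape, schema, and credentials are hypothetical:

```python
import datetime as dt

import psycopg2
import requests

# Extract: fetch yesterday's changed orders from an (invented) REST API.
since = (dt.date.today() - dt.timedelta(days=1)).isoformat()
rows = requests.get(
    "https://api.example.com/v1/orders",
    params={"updated_since": since},
    timeout=30,
).json()["data"]

# Load: idempotent upsert keyed on order_id keeps pipeline reruns safe.
conn = psycopg2.connect("host=wh.example.com dbname=dwh user=etl password=...")
with conn, conn.cursor() as cur:
    for r in rows:
        cur.execute(
            """
            INSERT INTO staging.orders (order_id, amount, updated_at)
            VALUES (%s, %s, %s)
            ON CONFLICT (order_id) DO UPDATE
            SET amount = EXCLUDED.amount, updated_at = EXCLUDED.updated_at;
            """,
            (r["id"], r["amount"], r["updated_at"]),
        )
```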

Posted 3 weeks ago

Apply

5.0 years

20 Lacs

Chandigarh

On-site

About the Role
We are seeking a highly experienced and hands-on Fullstack Architect to lead the design and architecture of scalable, enterprise-grade software solutions. This role requires a deep understanding of both frontend and backend technologies, cloud infrastructure, and microservices, with the ability to guide teams through technical challenges and solution delivery.

Key Responsibilities:
- Architect, design, and oversee the development of full-stack applications using modern JS frameworks and cloud-native tools.
- Lead microservice architecture design, ensuring system scalability, reliability, and performance.
- Evaluate and implement AWS services (Lambda, ECS, Glue, Aurora, API Gateway, etc.) for backend solutions.
- Provide technical leadership to engineering teams across all layers (frontend, backend, database).
- Guide and review code, perform performance optimization, and define coding standards.
- Collaborate with DevOps and Data teams to integrate services (Redshift, OpenSearch, Batch).
- Translate business needs into technical solutions and communicate with cross-functional stakeholders.

Required Skills:
- Deep expertise in Node.js, TypeScript, React.js, Python, Redux, and Jest.
- Proven experience designing and deploying systems using a microservices architecture.
- Strong understanding of AWS services: API Gateway, ECS, Lambda, Aurora, Glue, SQS, OpenSearch, Batch.
- Hands-on experience with MySQL and Redshift, including writing optimized queries.
- Advanced knowledge of HTML, CSS, Bootstrap, and JavaScript.
- Familiarity with tools: VS Code, DataGrip, Jira, GitHub, Postman.
- Strong knowledge of architectural design patterns and security best practices.

Job Types: Full-time, Permanent
Pay: From ₹2,055,277.41 per year
Benefits: Health insurance, leave encashment, paid sick time, paid time off, Provident Fund
Schedule: Day shift, Monday to Friday
Education: Bachelor's (Preferred)
Experience: Full-stack development: 5 years (Required)
Location: Chandigarh, Chandigarh (Required)
Shift availability: Day Shift (Required)
Work Location: In person
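
Although the stack here centers on Node.js/TypeScript, the posting also lists Python; as one illustration of the Lambda-plus-API-Gateway microservice pattern it mentions, a minimal Python Lambda handler reading from Aurora MySQL is sketched below. The environment variable names, table schema, and PyMySQL dependency are assumptions, not part of the posting:

```python
import json
import os

import pymysql  # assumes a PyMySQL layer is attached to the function

def handler(event, context):
    # API Gateway proxy integration passes path parameters in the event.
    order_id = event["pathParameters"]["id"]

    conn = pymysql.connect(
        host=os.environ["DB_HOST"],
        user=os.environ["DB_USER"],
        password=os.environ["DB_PASSWORD"],
        database="app",
        cursorclass=pymysql.cursors.DictCursor,
    )
    try:
        with conn.cursor() as cur:
            cur.execute(
                "SELECT id, status, total FROM orders WHERE id = %s",
                (order_id,),
            )
            row = cur.fetchone()
    finally:
        conn.close()

    if row is None:
        return {"statusCode": 404, "body": json.dumps({"error": "not found"})}
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps(row, default=str),
    }
```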

Posted 3 weeks ago

Apply

7.0 - 12.0 years

5 - 7 Lacs

Hyderābād

Remote

Job Information
Date Opened: 07/08/2025
Job Type: Full time
Industry: IT Services
City: Hyderabad
State/Province: Telangana
Country: India
Zip/Postal Code: 500059

About DATAECONOMY:
We are a fast-growing data & analytics company headquartered in Dublin, with offices in Dublin, OH, Providence, RI, and an advanced technology center in Hyderabad, India. We are clearly differentiated in the data & analytics space via our suite of solutions, accelerators, frameworks, and thought leadership.

Job Description
We are seeking a highly experienced and hands-on Lead/Senior Data Engineer to architect, develop, and optimize data solutions in a cloud-native environment. The ideal candidate will have 7-12 years of strong technical expertise in AWS Glue, PySpark, and Python, along with experience designing robust data pipelines and frameworks for large-scale enterprise systems. Prior exposure to the financial domain or regulated environments is a strong advantage.

Key Responsibilities:
- Solution Architecture: Design scalable and secure data pipelines using AWS Glue, PySpark, and related AWS services (EMR, S3, Lambda, etc.).
- Leadership & Mentorship: Guide junior engineers, conduct code reviews, and enforce best practices in development and deployment.
- ETL Development: Lead the design and implementation of end-to-end ETL processes for structured and semi-structured data.
- Framework Building: Develop and evolve data frameworks, reusable components, and automation tools to improve engineering productivity.
- Performance Optimization: Optimize large-scale data workflows for performance, cost, and reliability.
- Data Governance: Implement data quality, lineage, and governance strategies in compliance with enterprise standards.
- Collaboration: Work closely with product, analytics, compliance, and DevOps teams to deliver high-quality solutions aligned with business goals.
- CI/CD Automation: Set up and manage continuous integration and deployment pipelines using AWS CodePipeline, Jenkins, or GitLab.
- Documentation & Presentations: Prepare technical documentation and present architectural solutions to stakeholders across levels.

Requirements

Required Qualifications:
- 7-12 years of experience in data engineering or related fields.
- Strong expertise in Python programming, with a focus on data processing.
- Extensive experience with AWS Glue (both Glue Jobs and Glue Studio/Notebooks).
- Deep hands-on experience with PySpark for distributed data processing.
- Solid AWS knowledge: EMR, S3, Lambda, IAM, Athena, CloudWatch, Redshift, etc.
- Proven experience architecting and managing complex ETL workflows.
- Proficiency with Apache Airflow or similar orchestration tools.
- Hands-on experience with CI/CD pipelines and DevOps best practices.
- Familiarity with data quality, data lineage, and metadata management.
- Strong experience working in agile/scrum teams.
- Excellent communication and stakeholder engagement skills.

Preferred/Good to Have:
- Experience in financial services, capital markets, or compliance systems.
- Knowledge of data modeling, data lakes, and data warehouse architecture.
- Familiarity with SQL (Athena/Presto/Redshift Spectrum).
- Exposure to ML pipeline integration or event-driven architecture is a plus.

Benefits:
- Flexible work culture and remote options.
- Opportunity to lead cutting-edge cloud data engineering projects.
- Skill-building in large-scale, regulated environments.
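
One way the orchestration requirement (Apache Airflow) commonly meets the Glue requirement is a DAG that triggers a Glue job, gates on data-quality checks, and then publishes. A minimal Airflow 2.x sketch with stubbed task bodies follows; the DAG name, schedule, and task names are invented for illustration:

```python
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator

def trigger_glue_job(**_):
    # In a real pipeline: boto3 glue.start_job_run(...) and poll to completion.
    print("start the Glue ETL run here")

def run_quality_checks(**_):
    # Row-count / null-rate checks against the freshly loaded target table.
    print("validate the load here")

def publish_marts(**_):
    # Swap validated staging tables into the reporting schema.
    print("publish to the reporting layer here")

with DAG(
    dag_id="daily_claims_etl",
    start_date=datetime(2025, 1, 1),
    schedule="0 2 * * *",  # 02:00 daily (Airflow 2.4+ `schedule` argument)
    catchup=False,
    default_args={"retries": 2, "retry_delay": timedelta(minutes=10)},
) as dag:
    extract = PythonOperator(task_id="trigger_glue_job",
                             python_callable=trigger_glue_job)
    validate = PythonOperator(task_id="run_quality_checks",
                              python_callable=run_quality_checks)
    publish = PythonOperator(task_id="publish_marts",
                             python_callable=publish_marts)

    extract >> validate >> publish  # quality gate sits between load and publish
```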

Posted 3 weeks ago

Apply

0 years

6 - 8 Lacs

Hyderābād

On-site

Ready to shape the future of work? At Genpact, we don’t just adapt to change—we drive it. AI and digital innovation are redefining industries, and we’re leading the charge. Genpact’s AI Gigafactory, our industry-first accelerator, is an example of how we’re scaling advanced technology solutions to help global enterprises work smarter, grow faster, and transform at scale. From large-scale models to agentic AI, our breakthrough solutions tackle companies’ most complex challenges. If you thrive in a fast-moving, tech-driven environment, love solving real-world problems, and want to be part of a team that’s shaping the future, this is your moment. Genpact (NYSE: G) is an advanced technology services and solutions company that delivers lasting value for leading enterprises globally. Through our deep business knowledge, operational excellence, and cutting-edge solutions, we help companies across industries get ahead and stay ahead. Powered by curiosity, courage, and innovation, our teams implement data, technology, and AI to create tomorrow, today. Get to know us at genpact.com and on LinkedIn, X, YouTube, and Facebook.

Inviting applications for the role of Consultant – AWS!

Responsibilities
Design and deploy scalable, highly available, and fault-tolerant AWS data processes using AWS data services (Glue, Lambda, Step, Redshift) (an illustrative Glue sketch follows this posting)
Monitor and optimize the performance of cloud resources to ensure efficient utilization and cost-effectiveness.
Implement and maintain security measures to protect data and systems within the AWS environment, including IAM policies, security groups, and encryption mechanisms.
Migrate the application data from legacy databases to Cloud based solutions (Redshift, DynamoDB, etc) for high availability with low cost
Develop application programs using Big Data technologies like Apache Hadoop, Apache Spark, etc. with appropriate cloud-based services like Amazon AWS, etc.
Build data pipelines by building ETL processes (Extract-Transform-Load)
Implement backup, disaster recovery, and business continuity strategies for cloud-based applications and data.
Responsible for analysing business and functional requirements, which involves a review of existing system configurations and operating methodologies as well as understanding evolving business needs
Analyse requirements/User stories at the business meetings and strategize the impact of requirements on different platforms/applications, convert the business requirements into technical requirements
Participating in design reviews to provide input on functional requirements, product designs, schedules and/or potential problems
Understand current application infrastructure and suggest Cloud based solutions which reduce operational cost, require minimal maintenance but provide high availability with improved security
Perform unit testing on the modified software to ensure that the new functionality is working as expected while existing functionalities continue to work in the same way
Coordinate with release management, other supporting teams to deploy changes in production environment

Qualifications we seek in you!
Minimum Qualifications
Experience in designing, implementing data pipelines, build data applications, data migration on AWS
Strong experience of implementing data lake using AWS services like Glue, Lambda, Step, Redshift
Experience of Databricks will be added advantage
Strong experience in Python and SQL
Strong understanding of security principles and best practices for cloud-based environments.
Experience with monitoring tools and implementing proactive measures to ensure system availability and performance.
Excellent problem-solving skills and ability to troubleshoot complex issues in a distributed, cloud-based environment.
Strong communication and collaboration skills to work effectively with cross-functional teams.

Preferred Qualifications/ Skills
Master’s Degree-Computer Science, Electronics, Electrical.
AWS Data Engineering & Cloud certifications, Databricks certifications
Experience of working with Oracle ERP
Experience with multiple data integration technologies and cloud platforms
Knowledge of Change & Incident Management process

Why join Genpact?
Be a transformation leader – Work at the cutting edge of AI, automation, and digital innovation
Make an impact – Drive change for global enterprises and solve business challenges that matter
Accelerate your career – Get hands-on experience, mentorship, and continuous learning opportunities
Work with the best – Join 140,000+ bold thinkers and problem-solvers who push boundaries every day
Thrive in a values-driven culture – Our courage, curiosity, and incisiveness - built on a foundation of integrity and inclusion - allow your ideas to fuel progress

Come join the tech shapers and growth makers at Genpact and take your career in the only direction that matters: Up. Let’s build tomorrow together. Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values respect and integrity, customer focus, and innovation. Furthermore, please do note that Genpact does not charge fees to process job applications and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.

Job: Consultant
Primary Location: India-Hyderabad
Schedule: Full-time
Education Level: Master's / Equivalent
Job Posting: Jul 7, 2025, 7:24:46 AM
Unposting Date: Ongoing
Master Skills List: Digital
Job Category: Full Time
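For readers unfamiliar with the Glue-centric pipelines this role describes, the following is a minimal, illustrative sketch of a Glue ETL job that reads raw JSON from S3, maps columns, and loads Redshift. The bucket paths, column mappings, and catalog connection name are hypothetical, not details of any Genpact project.

```python
# Minimal sketch of an AWS Glue ETL job: S3 (JSON) -> column mapping -> Redshift.
# All names (buckets, tables, connection) are illustrative assumptions.
import sys
from awsglue.transforms import ApplyMapping
from awsglue.utils import getResolvedOptions
from awsglue.context import GlueContext
from awsglue.job import Job
from pyspark.context import SparkContext

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext())
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Read raw JSON events from S3 (hypothetical bucket/path).
source = glue_context.create_dynamic_frame.from_options(
    connection_type="s3",
    connection_options={"paths": ["s3://example-raw-bucket/events/"]},
    format="json",
)

# Keep, rename, and type only the columns the warehouse needs.
mapped = ApplyMapping.apply(
    frame=source,
    mappings=[
        ("event_id", "string", "event_id", "string"),
        ("event_ts", "string", "event_ts", "timestamp"),
        ("amount", "double", "amount", "double"),
    ],
)

# Write to Redshift through a pre-configured Glue catalog connection (assumed).
glue_context.write_dynamic_frame.from_jdbc_conf(
    frame=mapped,
    catalog_connection="example-redshift-conn",
    connection_options={"dbtable": "analytics.events", "database": "dev"},
    redshift_tmp_dir="s3://example-temp-bucket/redshift-staging/",
)
job.commit()
```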

Posted 3 weeks ago

Apply

5.0 years

6 - 9 Lacs

Hyderābād

On-site

DevSecOps Engineer – CL4

Role Overview:
As a DevSecOps Engineer, you will actively engage in your engineering craft, taking a hands-on approach to multiple high-visibility projects. Your expertise will be pivotal in delivering solutions that delight customers and users, while also driving tangible value for Deloitte's business investments. You will leverage your extensive DevSecOps engineering craftsmanship and advanced proficiency across multiple programming languages, DevSecOps tools, and modern frameworks, consistently demonstrating your strong track record in delivering high-quality, outcome-focused CI/CD and automation solutions. The ideal candidate will be a dependable team player, collaborating with cross-functional teams to design, develop, and deploy advanced software solutions.

Key Responsibilities:
Outcome-Driven Accountability: Embrace and drive a culture of accountability for customer and business outcomes. Develop DevSecOps engineering solutions that solve complex automation problems with valuable outcomes, ensuring high-quality, lean, resilient and secure pipelines with low operating costs, meeting platform/technology KPIs.
Technical Leadership and Advocacy: Serve as the technical advocate for modern DevSecOps practices, ensuring integrity, feasibility, and alignment with business and customer goals, NFRs, and applicable automation/integration/security practices—being responsible for designing and maintaining code repos, CI/CD pipelines, integrations (code quality, QE automation, security, etc.) and environments (sandboxes, dev, test, stage, production) through IaC, both for custom and package solutions, including identifying, assessing, and remediating vulnerabilities.
Engineering Craftsmanship: Maintain accountability for the integrity and design of DevSecOps pipelines and environments while leading the implementation of deployment techniques like Blue-Green and Canary to minimize downtime and enable A/B testing (a canary traffic-shifting sketch follows this posting). Be always hands-on and actively engage with engineers to ensure DevSecOps practices are understood and can be implemented throughout the product development life cycle. Resolve any technical issues from implementation to production operations (e.g., leading triage and troubleshooting of production issues). Be self-driven to learn new technologies, experiment with engineers, and inspire the team to learn and drive application of those new technologies.
Customer-Centric Engineering: Develop lean, and yet scalable and flexible, DevSecOps automations through rapid, inexpensive experimentation to solve customer needs, enabling version control, security, logging, feedback loops, continuous delivery, etc. Engage with customers and product teams to deliver the right automation, security, and deployment practices.
Incremental and Iterative Delivery: Adopt a mindset that favors action and evidence over extensive planning. Utilize a forward-leaning approach to navigate complexity and uncertainty, delivering lean, supportable, and maintainable solutions.
Cross-Functional Collaboration and Integration: Work collaboratively with empowered, cross-functional teams including product management, experience, engineering, delivery, infrastructure, and security. Integrate diverse perspectives to make well-informed decisions that balance feasibility, viability, usability, and value. Support a collaborative environment that enhances team synergy and innovation.
Advanced Technical Proficiency: Possess intermediate knowledge in modern software engineering practices and principles, including Agile methodologies, DevSecOps, and Continuous Integration/Continuous Deployment. Strive to be a role model, leveraging these techniques to optimize solutioning and product delivery, ensuring high-quality outcomes with minimal waste. Demonstrate an intermediate-level understanding of the product development lifecycle, from conceptualization and design to implementation and scaling, with a focus on continuous improvement and learning.
Domain Expertise: Quickly acquire domain-specific knowledge relevant to the business or product. Translate business/user needs into technical requirements and automations. Learn to navigate various enterprise functions such as product, experience, engineering, compliance, and security to drive product value and feasibility.
Effective Communication and Influence: Exhibit exceptional communication skills, capable of articulating technical concepts clearly and compellingly. Support teammates and product teams through well-structured arguments and trade-offs supported by evidence, evaluations, and research. Learn to create a coherent narrative that aligns technical solutions with business objectives.
Engagement and Collaborative Co-Creation: Able to engage and collaborate with product engineering teams, including customers as needed. Able to build and maintain constructive relationships, fostering a culture of co-creation and shared momentum towards achieving product goals. Support diverse perspectives and consensus to create feasible solutions.

The team:
US Deloitte Technology Product Engineering has modernized software and product delivery, creating a scalable, cost-effective model that focuses on value/outcomes by leveraging a progressive and responsive talent structure. As Deloitte’s primary internal development team, Product Engineering delivers innovative digital solutions to businesses, service lines, and internal operations with proven bottom-line results and outcomes. It helps power Deloitte’s success. It is the engine that drives Deloitte, serving many of the world’s largest, most respected companies. We develop and deploy cutting-edge internal and go-to-market solutions that help Deloitte operate effectively and lead in the market. Our reputation is built on a tradition of delivering with excellence.

Key Qualifications:
A bachelor’s degree in computer science, software engineering, or a related discipline. An advanced degree (e.g., MS) is preferred but not required. Experience is the most relevant factor.
Strong software engineering foundation with deep understanding of OOP/OOD, functional programming, data structures and algorithms, software design patterns, code instrumentation, etc.
5+ years proven experience with Python, Bash, PowerShell, JavaScript, C#, and Golang (preferred).
5+ years proven experience with CI/CD tools (Azure DevOps and GitHub Enterprise) and Git (version control, branching, merging, handling pull requests) to automate build, test, and deployment processes.
5+ years of hands-on experience in security tools automation SAST/DAST (SonarQube, Fortify, Mend), monitoring/logging (Prometheus, Grafana, Dynatrace), and other cloud-native tools on AWS, Azure, and GCP.
5+ years of hands-on experience in using Infrastructure as Code (IaC) technologies like Terraform, Puppet, Azure Resource Manager (ARM), AWS CloudFormation, and Google Cloud Deployment Manager.
2+ years of hands-on experience with cloud-native services like Data Lakes, CDN, API Gateways, Managed PaaS, Security, etc. on multiple cloud providers like AWS, Azure and GCP is preferred.
Strong understanding of methodologies like XP, Lean, and SAFe to deliver high-quality products rapidly.
General understanding of cloud providers' security practices, database technologies and maintenance (e.g. RDS, DynamoDB, Redshift, Aurora, Azure SQL, Google Cloud SQL)
General knowledge of networking, firewalls, and load balancers.
Strong preference will be given to candidates with AI/ML and GenAI experience.
Excellent interpersonal and organizational skills, with the ability to handle diverse situations, complex projects, and changing priorities, behaving with passion, empathy, and care.

How You will Grow:
At Deloitte, our professional development plans focus on helping people at every level of their career to identify and use their strengths to do their best work every day and excel in everything they do.

Our purpose
Deloitte’s purpose is to make an impact that matters for our people, clients, and communities. At Deloitte, purpose is synonymous with how we work every day. It defines who we are. Our purpose comes through in our work with clients that enables impact and value in their organizations, as well as through our own investments, commitments, and actions across areas that help drive positive outcomes for our communities.

Our people and culture
Our inclusive culture empowers our people to be who they are, contribute their unique perspectives, and make a difference individually and collectively. It enables us to leverage different ideas and perspectives, and bring more creativity and innovation to help solve our clients' most complex challenges. This makes Deloitte one of the most rewarding places to work.

Professional development
At Deloitte, professionals have the opportunity to work with some of the best and discover what works best for them. Here, we prioritize professional growth, offering diverse learning and networking opportunities to help accelerate careers and enhance leadership skills. Our state-of-the-art DU: The Leadership Center in India, located in Hyderabad, represents a tangible symbol of our commitment to the holistic growth and development of our people. Explore DU: The Leadership Center in India.

Benefits to help you thrive
At Deloitte, we know that great people make a great organization. Our comprehensive rewards program helps us deliver a distinctly Deloitte experience that empowers our professionals to thrive mentally, physically, and financially—and live their purpose. To support our professionals and their loved ones, we offer a broad range of benefits. Eligibility requirements may be based on role, tenure, type of employment and/or other criteria. Learn more about what working at Deloitte can mean for you.

Recruiting tips
From developing a stand-out resume to putting your best foot forward in the interview, we want you to feel prepared and confident as you explore opportunities at Deloitte. Check out recruiting tips from Deloitte recruiters.

Requisition code: 300653
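As context for the Blue-Green/Canary deployment techniques the posting names, here is a hedged sketch of canary traffic shifting on an AWS Application Load Balancer via boto3. The listener and target-group ARNs are placeholders, and a real rollout would gate each weight increase on health and error-rate metrics rather than stepping blindly.

```python
# Hedged sketch: weighted traffic shifting between "blue" (current) and
# "green" (new) target groups, the core move in a canary/blue-green release.
# ARNs are placeholders; health checks and rollback logic are omitted.
import boto3

elbv2 = boto3.client("elbv2")

LISTENER_ARN = "arn:aws:elasticloadbalancing:region:acct:listener/app/example/..."  # assumed
BLUE_TG = "arn:aws:elasticloadbalancing:region:acct:targetgroup/blue/..."           # assumed
GREEN_TG = "arn:aws:elasticloadbalancing:region:acct:targetgroup/green/..."         # assumed

def shift_traffic(green_weight: int) -> None:
    """Send green_weight% of traffic to the new version, the rest to blue."""
    elbv2.modify_listener(
        ListenerArn=LISTENER_ARN,
        DefaultActions=[{
            "Type": "forward",
            "ForwardConfig": {
                "TargetGroups": [
                    {"TargetGroupArn": BLUE_TG, "Weight": 100 - green_weight},
                    {"TargetGroupArn": GREEN_TG, "Weight": green_weight},
                ]
            },
        }],
    )

# Canary ramp: in practice, observe metrics between each step before proceeding.
for pct in (10, 25, 50, 100):
    shift_traffic(pct)
```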

Posted 3 weeks ago

Apply

5.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Description
At Amazon, we strive to be the most innovative and customer-centric company on the planet. Come work with us to develop innovative products, tools and research-driven solutions in a fast-paced environment by collaborating with smart and passionate leaders, program managers and software developers. This role is based out of our Bangalore corporate office and is for a passionate, dynamic, analytical, innovative, hands-on, and customer-centric Business Analyst.

Key job responsibilities
This role primarily focuses on deep-dives, creating dashboards for the business, and working with different teams to develop and track metrics and bridges (an illustrative query sketch follows this posting).
Design, develop and maintain scalable, automated, user-friendly systems, reports, dashboards, etc. that will support our analytical and business needs
In-depth research of drivers of the Localization business
Analyze key metrics to uncover trends and root causes of issues
Suggest and build new metrics and analysis that enable better perspective on business
Capture the right metrics to influence stakeholders and measure success
Develop domain expertise and apply to operational problems to find solutions
Work across teams with different stakeholders to prioritize and deliver data and reporting
Recognize and adopt best practices in reporting and analysis: data integrity, test design, analysis, validation, and documentation

Basic Qualifications
5+ years of experience analyzing and interpreting data with Redshift, Oracle, NoSQL, etc.
Experience with data visualization using Tableau, Quicksight, or similar tools
Experience with data modeling, warehousing and building ETL pipelines
Experience using advanced SQL to pull data from a database or data warehouse and scripting experience (Python) to process data for modeling

Preferred Qualifications
Experience with AWS solutions such as EC2, DynamoDB, S3, and Redshift
Experience in data mining, ETL, etc. and using databases in a business environment with large-scale, complex datasets

Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you’re applying in isn’t listed, please contact your Recruiting Partner.

Company - ADCI - BLR 14 SEZ
Job ID: A3009497
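To make the "advanced SQL plus Python scripting" requirement concrete, here is a hedged sketch of pulling a windowed metric from Redshift into pandas. The cluster endpoint, credentials, and table are illustrative assumptions, and the library shown (redshift_connector) is one of several reasonable choices.

```python
# Illustrative sketch: a CTE plus a window function (week-over-week revenue
# change) pulled from Redshift into a pandas DataFrame for further analysis.
import pandas as pd
import redshift_connector

# Placeholder connection details: replace with real cluster/credentials.
conn = redshift_connector.connect(
    host="example-cluster.abc123.ap-south-1.redshift.amazonaws.com",
    database="dev",
    user="analyst",
    password="***",
)

SQL = """
WITH weekly AS (
    SELECT DATE_TRUNC('week', order_date) AS week,
           SUM(revenue) AS revenue
    FROM sales.orders            -- hypothetical table
    GROUP BY 1
)
SELECT week,
       revenue,
       revenue - LAG(revenue) OVER (ORDER BY week) AS wow_change
FROM weekly
ORDER BY week;
"""

cur = conn.cursor()
cur.execute(SQL)
cols = [c[0] for c in cur.description]  # column names from the DBAPI cursor
df = pd.DataFrame(cur.fetchall(), columns=cols)
print(df.tail())
```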

Posted 3 weeks ago

Apply

0 years

0 Lacs

India

Remote

Role
We are looking for a Test Engineer who will become part of our team building and testing the Creditsafe data. You will be working closely with the database teams and data engineering to build specific systems facilitating the extraction and transformation of Creditsafe data. Based on the test strategy and approach, you will develop, enhance and execute tests that add value to Creditsafe data. You will act as a primary source of guidance to Junior Test Engineers and Test Engineers in all areas of data quality. You will contribute to the team using data quality best practices and techniques. You can confidently communicate test results with your team members and stakeholders using evidence and reports. You act as a mentor and coach to the less experienced members of the test team. You will promote and coach leading practices in data test management, design, and implementation. You will be part of an Agile team and will effectively contribute to the ceremonies, acting as the quality specialist within that team. You are an influencer and will provide leadership in defining and implementing agreed standards and will actively promote this within your team and the wider development community. The ideal candidate has extensive experience in mentorship and leading by example and is able to communicate values consistent with the Creditsafe philosophy of engagement. You have critical thinking skills and can diplomatically communicate within, and outside, your areas of responsibility, challenging assumptions where required.

Required Skills:
Proven working experience as a data test engineer, business data analyst or ETL tester.
Technical expertise regarding data models, database design and development, data mining and segmentation techniques
Strong knowledge of and experience with SQL databases
Hands-on experience of best engineering practices (handling and logging errors, system monitoring and building human-fault-tolerant applications)
Knowledge of statistics and experience using statistical packages for analysing datasets (Excel, SPSS, SAS etc.) is an advantage.
Comfortable working with relational databases such as Redshift, Oracle, PostgreSQL, MySQL, and MariaDB (PostgreSQL preferred)
Strong analytical skills with the ability to collect, organise, analyse, and disseminate significant amounts of information with attention to detail and accuracy
Adept at queries, report writing and presenting findings.
BS in Mathematics, Economics, Computer Science, Information Management or Statistics is desirable but not essential
A good understanding of cloud technology, preferably AWS and/or Azure DevOps
A practical understanding of programming: JavaScript, Python
Excellent communication skills
Practical experience of testing in an Agile approach

Desirable Skills
An understanding of version control systems
Practical experience of conducting code reviews
Practical experience of pair testing and pair programming

Primary Responsibilities
Reports to Engineering Lead
Work as part of the engineering team in data acquisition
Designing and implementing processes and tools to monitor and improve the quality of Creditsafe's data (a minimal example of such checks follows this posting).
Developing and executing test plans to verify the accuracy and reliability of data.
Working with data analysts and other stakeholders to establish and maintain data governance policies.
Identifying and resolving issues with the data, such as errors, inconsistencies, or duplication.
Collaborating with other teams, such as data analysts and data scientists, to ensure the quality of data used for various projects and initiatives.
Providing training and guidance to other team members on data quality best practices and techniques.
Monitoring and reporting on key data quality metrics, such as data completeness and accuracy.
Continuously improving data quality processes and tools based on feedback and analysis.
Work closely with their Agile team to promote a whole-team approach to quality
Document approaches and processes that improve the quality effort for use by team members and the wider test function
Strong practical knowledge of software testing techniques and the ability to advise on, and select, the correct technique dependent on the problem at hand
Conduct analysis of the team's test approach, taking a proactive role in the formulation of the relevant quality criteria in line with the team goals
Work with team members to define standards and processes applicable to their area of responsibility
Monitor progress of team deliverables, injecting quality concerns in a timely, effective manner
Gain a sufficient understanding of the system architecture to inform their test approach and that of the test engineers
Creation and maintenance of concise and accurate defect reports in line with the established defect process

Job Types: Full-time, Permanent
Benefits:
Flexible schedule
Health insurance
Provident Fund
Work from home
Schedule: Monday to Friday
Supplemental Pay: Performance bonus
Work Location: In person
Speak with the employer +91 9121185668
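To illustrate the kind of automated data-quality checks described above (completeness, accuracy, duplication), here is a minimal pandas sketch; the file name, columns, and valid ranges are hypothetical.

```python
# Minimal sketch of automated data-quality checks: completeness, uniqueness,
# and validity on a company-data extract. All names/ranges are illustrative.
import pandas as pd

companies = pd.read_csv("companies.csv")  # hypothetical extract under test

checks = {
    # Completeness: key identifier must never be null.
    "registration_number_complete": companies["registration_number"].notna().all(),
    # Uniqueness: no duplicated company identifiers.
    "registration_number_unique": companies["registration_number"].is_unique,
    # Validity: scores must fall in the documented range (assumed 0-100).
    "score_in_range": companies["credit_score"].between(0, 100).all(),
}

failures = [name for name, passed in checks.items() if not passed]
assert not failures, f"Data quality checks failed: {failures}"
print("All data quality checks passed.")
```

In practice checks like these would run inside the pipeline (e.g., as a CI step or an orchestrated task) so that bad loads fail fast rather than reaching reports.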

Posted 3 weeks ago

Apply

5.0 years

4 - 8 Lacs

Mohali

On-site

Company Introduction:
A dynamic company headquartered in Australia.
Multi award winner, recognized for excellence in the telecommunications industry.
Financial Times Fastest-growing Company APAC 2023.
AFR (Australian Financial Review) Fast 100 Company 2022.
Great promotion opportunities that acknowledge and reward your hard work.
Young, energetic and innovative team, caring and supportive work environment.

About You:
We are seeking an experienced and highly skilled Data Warehouse Engineer with an energetic 'can do' attitude to join our data and analytics team. The ideal candidate will have over 5 years of hands-on experience in designing, building, and maintaining scalable data pipelines and reporting infrastructure. You will be responsible for managing our data warehouse, automating ETL workflows, building dashboards, and enabling data-driven decision-making across the organization.

Your Responsibilities will include but are not limited to:
Design, implement, and maintain robust, scalable data pipelines using Apache NiFi, Airflow, or similar ETL tools (a minimal Airflow sketch follows this posting).
Develop and manage efficient data ingestion and transformation workflows, including web data crawling using Python.
Create, optimize, and maintain complex SQL queries to support business reporting needs.
Build and manage interactive dashboards and visualizations using Apache Superset (preferred), Power BI, or Tableau.
Collaborate with business stakeholders and analysts to gather requirements, define KPIs, and deliver meaningful data insights.
Ensure data accuracy, completeness, and consistency through rigorous quality assurance processes.
Maintain and optimize the performance of the data warehouse, supporting high availability and fast query response times.
Document technical processes and data workflows for maintainability and scalability.

To be successful in this role you will ideally possess:
5+ years of experience in data engineering, business intelligence, or a similar role.
Strong proficiency in Python, particularly for data crawling, parsing, and automation tasks.
Expert in SQL (including complex joins, CTEs, window functions) for reporting and analytics.
Hands-on experience with Apache Superset (preferred), or equivalent BI tools like Power BI or Tableau.
Proficient with ETL tools such as Apache NiFi, Airflow, or similar data pipeline frameworks.
Experience working with cloud-based data warehouse platforms (e.g., Amazon Redshift, Snowflake, BigQuery, or PostgreSQL).
Strong understanding of data modeling, warehousing concepts, and performance optimization.
Ability to work independently and collaboratively in a fast-paced environment.

Preferred Qualifications:
Experience with version control (e.g., Git) and CI/CD processes for data workflows.
Familiarity with REST APIs and web scraping best practices.
Knowledge of data governance, privacy, and security best practices.
Background in the telecommunications or ISP industry is a plus.

Job Types: Full-time, Permanent
Pay: ₹40,000.00 - ₹70,000.00 per month
Benefits:
Leave encashment
Paid sick time
Provident Fund
Schedule:
Day shift
Monday to Friday
Supplemental Pay:
Overtime pay
Performance bonus
Work Location: In person
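As a concrete picture of the crawl-transform-load pipelines this role owns, below is a minimal Apache Airflow DAG sketch. Task bodies are stubs, the DAG name is illustrative, and the `schedule` argument assumes Airflow 2.4 or later (earlier versions use `schedule_interval`).

```python
# Hedged sketch of a daily crawl -> transform -> load pipeline in Airflow 2.x.
from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

def crawl():      # fetch source data, e.g. with requests/BeautifulSoup
    ...

def transform():  # clean and reshape the crawled data
    ...

def load():       # upsert into the warehouse (e.g. Redshift/PostgreSQL)
    ...

with DAG(
    dag_id="daily_reporting_pipeline",  # illustrative name
    start_date=datetime(2025, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    t_crawl = PythonOperator(task_id="crawl", python_callable=crawl)
    t_transform = PythonOperator(task_id="transform", python_callable=transform)
    t_load = PythonOperator(task_id="load", python_callable=load)

    t_crawl >> t_transform >> t_load  # linear dependency chain
```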

Posted 3 weeks ago

Apply

5.0 - 6.0 years

7 - 8 Lacs

Ahmedabad

On-site

Data Engineer - AWS (Financial Data Reconciliation)
Experience: 5-6 years
Location: On-site, Ahmedabad

Technical Skills:
AWS Stack: Redshift, Glue (PySpark), Lambda, Step Functions, CloudWatch, S3, Athena
Languages: Python (Pandas, PySpark), SQL (Redshift/PostgreSQL)
ETL & Orchestration: Apache Airflow (MWAA), AWS Glue Workflows, AWS Step Functions
Data Modeling: Experience with financial/transactional data schemas.
Data Architecture: Medallion (bronze/silver/gold) design, lakehouse patterns, slowly changing dimensions (an illustrative sketch follows this posting)

Job Type: Contractual / Temporary
Contract length: 6 months
Pay: ₹60,000.00 - ₹70,000.00 per month
Schedule: Day shift
Ability to commute/relocate: Ahmedabad, Gujarat: Reliably commute or planning to relocate before starting work (Preferred)
Experience: AWS: 5 years (Preferred)
Work Location: In person
Expected Start Date: 10/07/2025
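Since the posting names the medallion (bronze/silver/gold) pattern explicitly, here is a hedged PySpark sketch of that flow for transactional data. Lake paths, schema, and aggregation choices are assumptions, not details of the actual engagement.

```python
# Minimal sketch of a medallion (bronze/silver/gold) flow in PySpark.
# Paths and columns are illustrative assumptions.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("medallion-sketch").getOrCreate()

# Bronze: raw landing data, stored as-is from the source system.
bronze = spark.read.json("s3://example-lake/bronze/transactions/")

# Silver: cleaned and conformed - typed columns, deduplicated records.
silver = (
    bronze
    .withColumn("amount", F.col("amount").cast("decimal(18,2)"))
    .withColumn("txn_date", F.to_date("txn_ts"))
    .dropDuplicates(["txn_id"])
)
silver.write.mode("overwrite").parquet("s3://example-lake/silver/transactions/")

# Gold: business-level aggregate, e.g. daily per-account totals suitable
# for reconciliation reporting against the source ledger.
gold = silver.groupBy("txn_date", "account_id").agg(
    F.sum("amount").alias("total_amount"),
    F.count("txn_id").alias("txn_count"),
)
gold.write.mode("overwrite").parquet("s3://example-lake/gold/daily_account_totals/")
```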

Posted 3 weeks ago

Apply

2.0 years

4 - 5 Lacs

Indore

On-site

Expertise in functional programming using JavaScript (ES5, ES6)
Expertise in UI frameworks - React/Redux, RxJS
Experience with React Native is preferred
Preferred experience with a new generation of web programming - using microservices, REST/JSON, component UI models
Expertise with data visualization flow development along with usage of modern charting and graphical JavaScript libraries
Preferred experience with Docker-based development/deployment platforms
Preferred experience with AWS Cloud, AWS Redshift or PostgreSQL

Skills: React/Redux, RxJS, HTML, CSS, JavaScript (ES5, ES6), data visualization and chart libraries
Experience: Minimum 2-3 years of experience
There is a 2-year bond with the company. Only those who are willing to commit to this bond should apply.
Female candidates preferred only.
For more information, please contact this number: 8827277596

Job Types: Full-time, Permanent
Pay: ₹40,000.00 - ₹45,000.00 per month
Benefits:
Paid sick time
Paid time off
Schedule: Day shift
Supplemental Pay: Yearly bonus
Experience:
React: 2 years (Required)
React Native: 2 years (Required)
Work Location: In person
Speak with the employer +91 8827277596

Posted 3 weeks ago

Apply

5.0 years

0 Lacs

Pune, Maharashtra, India

Remote

ZS is a place where passion changes lives. As a management consulting and technology firm focused on improving life and how we live it, our most valuable asset is our people. Here you’ll work side-by-side with a powerful collective of thinkers and experts shaping life-changing solutions for patients, caregivers and consumers, worldwide. ZSers drive impact by bringing a client-first mentality to each and every engagement. We partner collaboratively with our clients to develop custom solutions and technology products that create value and deliver company results across critical areas of their business. Bring your curiosity for learning; bold ideas; courage and passion to drive life-changing impact to ZS.

Our most valuable asset is our people. At ZS we honor the visible and invisible elements of our identities, personal experiences and belief systems—the ones that comprise us as individuals, shape who we are and make us unique. We believe your personal interests, identities, and desire to learn are part of your success here. Learn more about our diversity, equity, and inclusion efforts and the networks ZS supports to assist our ZSers in cultivating community spaces, obtaining the resources they need to thrive, and sharing the messages they are passionate about.

What you’ll do:
Lead end-to-end projects using cloud technologies to solve complex business problems
Provide technology expertise to maximize value for clients and project teams
Drive strong delivery methodology to ensure projects are delivered on time, within budget and to the client's satisfaction
Ensure technology solutions are scalable, resilient, and optimized for performance and cost
Guide, coach and mentor project team members for continuous learning and professional growth
Demonstrate expertise, facilitation, and strong interpersonal skills in internal and client interactions
Collaborate with ZS experts to drive innovation and minimize project risks
Work globally with team members to ensure a smooth project delivery
Bring structure to unstructured work for developing business cases with clients
Assist ZS Leadership with business case development, innovation, thought leadership and team initiatives

What you’ll bring:
Candidates must either be in their junior year of a Bachelor's degree or in their first year of a Master's degree specializing in Business Analytics, Computer Science, MIS, MBA, or a related field with academic excellence
5+ years of consulting experience in leading large-scale technology implementations
Strong communication skills to convey technical concepts to diverse audiences
Significant supervisory, coaching, and hands-on project management skills
Extensive experience with major cloud platforms like AWS, Azure and GCP
Deep knowledge of enterprise data management, advanced analytics, process automation, and application development
Familiarity with industry-standard products and platforms such as Snowflake, Databricks, Redshift, Salesforce, Power BI, Cloud.
Experience in delivering projects using agile methodologies

Additional skills:
Capable of managing a virtual global team for the timely delivery of multiple projects
Experienced in analyzing and troubleshooting interactions between databases, operating systems, and applications
Travel to global offices as required to collaborate with clients and internal project teams

Perks & Benefits:
ZS offers a comprehensive total rewards package including health and well-being, financial planning, annual leave, personal growth and professional development. Our robust skills development programs, multiple career progression options, internal mobility paths and collaborative culture empower you to thrive as an individual and global team member.

We are committed to giving our employees a flexible and connected way of working. A flexible and connected ZS allows us to combine work from home and on-site presence at clients/ZS offices for the majority of our week. The magic of ZS culture and innovation thrives in both planned and spontaneous face-to-face connections.

Travel:
Travel is a requirement at ZS for client-facing ZSers; business needs of your project and client are the priority. While some projects may be local, all client-facing ZSers should be prepared to travel as needed. Travel provides opportunities to strengthen client relationships, gain diverse experiences, and enhance professional growth by working in different environments and cultures.

Considering applying?
At ZS, we're building a diverse and inclusive company where people bring their passions to inspire life-changing impact and deliver better outcomes for all. We are most interested in finding the best candidate for the job and recognize the value that candidates with all backgrounds, including non-traditional ones, bring. If you are interested in joining us, we encourage you to apply even if you don't meet 100% of the requirements listed above. ZS is an equal opportunity employer and is committed to providing equal employment and advancement opportunities without regard to any class protected by applicable law.

To Complete Your Application:
Candidates must possess or be able to obtain work authorization for their intended country of employment. An on-line application, including a full set of transcripts (official or unofficial), is required to be considered.

NO AGENCY CALLS, PLEASE.

Find Out More At: www.zs.com

Posted 3 weeks ago

Apply

1.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Job title: Business Analyst, SailPoint (SAS) Success Acceleration Services

About SailPoint:
SailPoint is the leader in identity security for cloud enterprises. Our identity security solutions secure and enable thousands of companies worldwide, giving our customers unmatched visibility into the entirety of their digital workforce, ensuring workers have the right access to do their job – no more, no less. Built on a foundation of Artificial Intelligence and Machine Learning, our Identity Security Cloud Platform delivers the right level of access to the right identities and resources at the right time — matching the scale, velocity, and changing needs of today’s cloud-oriented, modern enterprise.

About the role:
The Success Acceleration Services team at SailPoint is looking for someone who is strongly motivated, has a keen sense of responsibility, a positive attitude, high energy, and strong attention to detail. This role will be to work with the SASP team to provide both day-to-day insights and support for our Services and delivery. This role will involve working with CRM and PSA tools to keep records up to date, forecast accurately, and provide Services delivery governance ensuring operations are running smoothly.

Roadmap for success
30 days: During the first 30 days, you will delve into understanding SailPoint's offerings, organizational structure, and team dynamics. You will have regular check-ins with your mentor, who will assist you in navigating the tools, processes, and active projects that are critical to your role. Familiarize yourself with project management and CRM-type tools alongside understanding the best practices that are used within the organization. Shadow ongoing Business Analyst activities, observing the dynamics of executing tasks and supporting the team that you are working with.
90 days: Take full ownership of administrative tasks and perform these independently.
6 months: At the 6-month mark, you should have developed a keen sense of the current administrative tasks at hand, ensuring clear boundaries between must-haves and nice-to-haves. Build and maintain strong relationships within and outside of the SAS Team. You should be able to point out areas of improvement in our current processes, propose ideas, and collaborate with different team members on internal and external initiatives. You will serve as the primary point of contact for administrative requests.
1 year: By the end of your first year, you will have the ability to mentor new resources and grow team capability while successfully managing your own tasks. You will have the knowledge to create and maintain various knowledge bases to support Program development on an ad-hoc basis.

Requirements:
2-3 years' experience working as a business analyst or in an administrative position, demonstrating a high degree of productivity and effectiveness.
Proven ability to coordinate between cross-functional teams, driving collaboration and resolving conflicts to maintain project/program momentum
Experience working with external stakeholders, for example communicating via email or CRM tools.
Demonstrated ability to manage multiple tasks simultaneously and to resolve scheduling and other conflicts to meet all deadlines
Highly self-driven and motivated with a strong work ethic & initiative
Ability to work effectively in diverse teams, with an awareness of diverse cultural nuances and communication styles
Ability to understand client needs, manage expectations, provide updates and deliver solutions that align with business objectives
Excellent written and verbal communication skills, and ability to comprehensively and clearly present strategic issues and solutions.
Experience in using and building dashboards using spreadsheet software like Microsoft Excel and Smartsheet a strong plus
Experience with Salesforce and ServiceNow
Proven skills at cultivating strong working relationships and working well within a team to learn and share knowledge
Collaborate with stakeholders to understand their needs and gather detailed business requirements.
Analyze data to identify trends, patterns, and insights that inform business decisions.
Evaluate internal systems for efficiency, problems, and inaccuracies, and develop and maintain protocols for handling, processing, and cleaning data.
Ability to work in multiple time zones, specifically supporting the United States time zones.

Education: Bachelor’s degree or equivalent experience (Computer Science or Engineering degree a plus).

Preferred:
Exposure to Customer Success Delivery and Operations in both large and small companies
Proficiency in Redshift, PowerBI, SQL
Experience with Identity Management, Security or Governance would be a bonus
Certifications: ECBA, PCBA and CBAP are a plus to have

About the team:
We are a global dynamic, multicultural and multilingual team that thrives in a fast-paced, ever-evolving environment. From technical experts to senior management, we collaborate closely to tackle any situation head-on with a positive mindset. We are goal-driven and solution-focused, turning every challenge into an opportunity while supporting and learning from one another. Our team is passionate, curious, and always ready to dive deep, bringing people together to solve anything unknown and deliver results with professionalism and care. We work hard, move fast and continuously bring fresh ideas to the table, all while fostering a culture of growth, inclusion, and mutual respect. We invest in our people, champion their careers, and ensure our customers and business are always at the forefront. If you are proactive, eager to learn and ready to make a real impact, join us in shaping the future as part of this incredible worldwide operating team.

SailPoint is an equal opportunity employer and we welcome all qualified candidates to apply to join our team. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, protected veteran status, or any other category protected by applicable law. Alternative methods of applying for employment are available to individuals unable to submit an application through this site because of a disability. Contact hr@sailpoint.com or mail to 11120 Four Points Dr, Suite 100, Austin, TX 78726, to discuss reasonable accommodations.

Posted 3 weeks ago

Apply

5.0 years

0 Lacs

Panaji, Goa, India

On-site

About the Project
We are seeking an expert Senior Backend Developer to be a principal architect of "Stealth Prop-tech Platform", a groundbreaking digital real estate platform in Dubai. This is a complex initiative to build a comprehensive ecosystem integrating long-term sales, short-term stays, and advanced technologies including AI/ML, data analytics, Web3/blockchain, and conversational AI. You will be responsible for building the robust, scalable, and secure server-side logic that powers the entire platform across web, iOS, and Android clients. This is a critical leadership role in a high-impact project, offering the chance to design and build a sophisticated backend architecture from the ground up.

Job Summary
As a Senior Backend Developer, you will design, develop, and maintain the core services and APIs that drive the Proptech platform. You will be responsible for everything "under the hood," from database architecture to the business logic that handles property listings, user management, financial transactions, and advanced AI features. You will work closely with frontend and mobile developers, product managers, and data scientists to bring the platform's vision to life, ensuring high performance, reliability, and scalability.

Key Responsibilities
Design, build, and maintain scalable and secure RESTful APIs to serve all frontend clients (web, iOS, Android) (a minimal endpoint sketch follows this posting).
Develop the core business logic for all platform features, including user authentication, property management (CRUD), search algorithms, and monetization systems (subscriptions, payments).
Architect and manage the platform's database schema (PostgreSQL), ensuring data integrity, performance, and scalability.
Lead the integration of numerous third-party services, including payment gateways (e.g., Stripe), mapping services (Google Maps), messaging APIs (Twilio), and virtual tour providers.
Collaborate with the AI/ML team (Workstream 3) to build the data pipelines and API endpoints necessary to support features like recommendation engines and automated property valuations.
Work with the Web3 team (Workstream 4) to integrate backend services with blockchain platforms for tokenization and cryptocurrency payment gateways.
Implement robust security and data protection measures in line with international standards (e.g., UAE PDPL, Singapore PDPA).
Mentor junior backend developers, conduct code reviews, and establish best practices for coding, testing, and deployment.
Design and implement a scalable, service-oriented or microservices-based architecture to support long-term growth and feature expansion.

Required Skills and Experience
5+ years of experience in backend software development, with a proven track record of building and launching complex, high-traffic applications.
Expert proficiency in at least one modern backend programming language and framework (e.g., Python with Django/Flask, Node.js with Express, Go, or Java with Spring).
Strong experience designing and building RESTful APIs and service-oriented architectures.
Deep expertise in relational database design and management, particularly with PostgreSQL.
Hands-on experience with cloud platforms (AWS, Google Cloud, or Azure) and deploying applications in a cloud environment.
Solid understanding of software security principles and best practices.
Experience with version control systems (Git) and CI/CD pipelines.

Preferred Qualifications
A Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
Experience in the PropTech (Property Technology) or FinTech sectors.
Experience working on projects that involve AI/ML, such as building APIs for recommendation systems or predictive models.
Familiarity with blockchain concepts and experience integrating backend systems with Web3 technologies.
Experience with containerization technologies like Docker and orchestration tools like Kubernetes.
Knowledge of big data technologies (e.g., data warehouses like BigQuery/Redshift) and building data pipelines.
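To ground the RESTful API responsibility, here is a minimal sketch of a paginated property-listings endpoint. The framework choice (Flask) and the in-memory store are purely illustrative; the posting leaves the stack open (Python with Django/Flask, Node.js, Go, or Java), and a real implementation would back this with PostgreSQL.

```python
# Hedged sketch of a REST resource for property listings with pagination.
# Data, routes, and field names are illustrative assumptions.
from flask import Flask, jsonify, request

app = Flask(__name__)

# Stand-in for a PostgreSQL-backed repository.
LISTINGS = [
    {"id": 1, "title": "Marina apartment", "price_aed": 1_850_000},
    {"id": 2, "title": "Downtown studio", "price_aed": 950_000},
]

@app.get("/api/v1/listings")
def list_listings():
    # Simple offset pagination via query parameters.
    page = int(request.args.get("page", 1))
    per_page = int(request.args.get("per_page", 20))
    start = (page - 1) * per_page
    items = LISTINGS[start:start + per_page]
    return jsonify({"page": page, "count": len(items), "items": items})

@app.get("/api/v1/listings/<int:listing_id>")
def get_listing(listing_id: int):
    match = next((item for item in LISTINGS if item["id"] == listing_id), None)
    if match is None:
        return jsonify({"error": "not found"}), 404
    return jsonify(match)

if __name__ == "__main__":
    app.run(debug=True)
```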

Posted 3 weeks ago

Apply

8.0 years

0 Lacs

Ahmedabad, Gujarat, India

On-site

Greetings from Analytix Solutions…!!!

We are seeking an experienced and motivated Senior Data Engineer to join our AI & Automation team. The ideal candidate will have 6–8 years of experience in data engineering, with a proven track record of designing and implementing scalable data solutions. A strong background in database technologies, data modeling, and data pipeline orchestration is essential. Additionally, hands-on experience with generative AI technologies and their applications in data workflows will set you apart. In this role, you will lead data engineering efforts to enhance automation, drive efficiency, and deliver data-driven insights across the organization.

Company Name: Analytix Business Solutions (US Based MNC)

Company at a Glance:
We are a premier knowledge process outsourcing unit based in Ahmedabad, fully owned by Analytix Solutions LLC, headquartered in the USA. We customize a wide array of business solutions including IT services, Audio-Visual services, Data management services, and Finance and accounting services for small and mid-size companies across diverse industries. We partner with and offer our services to Restaurants, Dental services, Dunkin' Donuts franchises, Hotels, Veterinary services, and others, including start-ups from any other industry. For more details about our organization, please click on https://www.analytix.com/

LinkedIn: Analytix Business Solutions (India) Pvt. Ltd.: My Company | LinkedIn

Roles & Responsibilities:
Design, build, and maintain scalable, high-performance data pipelines and ETL/ELT processes across diverse database platforms (an illustrative warehouse load sketch follows this posting).
Architect and optimize data storage solutions to ensure reliability, security, and scalability.
Leverage generative AI tools and models to enhance data engineering workflows, drive automation, and improve insight generation.
Collaborate with cross-functional teams (Data Scientists, Analysts, and Engineers) to understand and deliver on data requirements.
Develop and enforce data quality standards, governance policies, and monitoring systems to ensure data integrity.
Create and maintain comprehensive documentation for data systems, workflows, and models.
Implement data modeling best practices and optimize data retrieval processes for better performance.
Stay up to date with emerging technologies and bring innovative solutions to the team.

Competencies & Skills:
Bachelor's or Master’s degree in Computer Science, Information Technology, or a related field.
6–8 years of experience in data engineering, designing and managing large-scale data systems.
Advanced knowledge of Database Management Systems and ETL/ELT processes.
Expertise in data modeling, data quality, and data governance.
Proficiency in Python programming, version control systems (Git), and data pipeline orchestration tools.
Familiarity with AI/ML technologies and their application in data engineering.
Strong problem-solving and analytical skills, with the ability to troubleshoot complex data issues.
Excellent communication skills, with the ability to explain technical concepts to non-technical stakeholders.
Ability to work independently, lead projects, and mentor junior team members.
Commitment to staying current with emerging technologies, trends, and best practices in the data engineering domain.

Technology Stacks:
Strong expertise in database technologies, including:
SQL Databases: PostgreSQL, MySQL, SQL Server
NoSQL Databases: MongoDB, Cassandra
Data Warehouse/Unified Platforms: Snowflake, Redshift, BigQuery, Microsoft Fabric
Hands-on experience implementing and working with generative AI tools and models in production workflows.
Proficiency in Python and SQL, with experience in data processing frameworks (e.g., Pandas, PySpark).
Experience with ETL tools (e.g., Apache Airflow, MS Fabric, Informatica, Talend) and data pipeline orchestration platforms.
Strong understanding of data architecture, data modeling, and data governance principles.
Experience with cloud platforms (preferably Azure) and associated data services.

Our EVP (Employee Value Proposition):
5 Days working
Total 24 Earned & Casual leaves and 8 public holidays
Compensatory Off
Personal Development Allowances
Opportunity to work with USA clients
Career progression and Learning & Development
Loyalty Bonus Benefits
Medical Reimbursement
Standard Salary as per market norms
Magnificent & Dynamic Culture
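As an illustration of a typical ELT load step into a warehouse such as Redshift, here is a hedged sketch using COPY from S3 followed by a delete-then-insert upsert. All identifiers, the IAM role ARN, and the choice of redshift_connector are assumptions, not Analytix specifics.

```python
# Hedged sketch of an ELT load: bulk COPY from S3 into a staging table,
# then merge into the target. Identifiers and credentials are placeholders.
import redshift_connector

conn = redshift_connector.connect(
    host="example-cluster.abc123.us-east-1.redshift.amazonaws.com",  # assumed
    database="dev",
    user="etl_user",
    password="***",
)
cur = conn.cursor()

# COPY is the idiomatic bulk-load path into Redshift.
cur.execute("""
    COPY staging.customers
    FROM 's3://example-bucket/exports/customers/'
    IAM_ROLE 'arn:aws:iam::123456789012:role/example-redshift-load'
    FORMAT AS PARQUET
""")

# Classic delete-then-insert merge pattern (newer clusters can use MERGE).
# Note: TRUNCATE commits implicitly in Redshift.
for stmt in (
    """DELETE FROM analytics.customers
       USING staging.customers s
       WHERE analytics.customers.customer_id = s.customer_id""",
    "INSERT INTO analytics.customers SELECT * FROM staging.customers",
    "TRUNCATE staging.customers",
):
    cur.execute(stmt)

conn.commit()
```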

Posted 3 weeks ago

Apply

12.0 years

0 Lacs

Hyderabad, Telangana, India

Remote

We are seeking a highly experienced and hands-on Lead/Senior Data Engineer to architect, develop, and optimize data solutions in a cloud-native environment. The ideal candidate will have 7–12 years of strong technical expertise in AWS Glue, PySpark, and Python, along with experience designing robust data pipelines and frameworks for large-scale enterprise systems. Prior exposure to the financial domain or regulated environments is a strong advantage.

Key Responsibilities:
Solution Architecture: Design scalable and secure data pipelines using AWS Glue, PySpark, and related AWS services (EMR, S3, Lambda, etc.)
Leadership & Mentorship: Guide junior engineers, conduct code reviews, and enforce best practices in development and deployment.
ETL Development: Lead the design and implementation of end-to-end ETL processes for structured and semi-structured data.
Framework Building: Develop and evolve data frameworks, reusable components, and automation tools to improve engineering productivity.
Performance Optimization: Optimize large-scale data workflows for performance, cost, and reliability (a short PySpark tuning sketch follows this posting).
Data Governance: Implement data quality, lineage, and governance strategies in compliance with enterprise standards.
Collaboration: Work closely with product, analytics, compliance, and DevOps teams to deliver high-quality solutions aligned with business goals.
CI/CD Automation: Set up and manage continuous integration and deployment pipelines using AWS CodePipeline, Jenkins, or GitLab.
Documentation & Presentations: Prepare technical documentation and present architectural solutions to stakeholders across levels.

Requirements:
Required Qualifications:
7–12 years of experience in data engineering or related fields.
Strong expertise in Python programming with a focus on data processing.
Extensive experience with AWS Glue (both Glue Jobs and Glue Studio/Notebooks).
Deep hands-on experience with PySpark for distributed data processing.
Solid AWS knowledge: EMR, S3, Lambda, IAM, Athena, CloudWatch, Redshift, etc.
Proven experience in architecting and managing complex ETL workflows.
Proficiency with Apache Airflow or similar orchestration tools.
Hands-on experience with CI/CD pipelines and DevOps best practices.
Familiarity with data quality, data lineage, and metadata management.
Strong experience working in agile/scrum teams.
Excellent communication and stakeholder engagement skills.

Preferred/Good to Have:
Experience in financial services, capital markets, or compliance systems.
Knowledge of data modeling, data lakes, and data warehouse architecture.
Familiarity with SQL (Athena/Presto/Redshift Spectrum).
Exposure to ML pipeline integration or event-driven architecture is a plus.

Benefits:
Flexible work culture and remote options
Opportunity to lead cutting-edge cloud data engineering projects
Skill-building in large-scale, regulated environments.
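Two routine PySpark optimizations behind the "performance, cost, and reliability" bullet are partition pruning on read and broadcast joins for small dimension tables. The sketch below shows both, with hypothetical lake paths and columns.

```python
# Sketch of two common PySpark optimizations for large workflows.
# Paths, partition layout, and columns are illustrative assumptions.
from pyspark.sql import SparkSession
from pyspark.sql.functions import broadcast, col

spark = SparkSession.builder.appName("optimization-sketch").getOrCreate()

# Partition pruning: filtering on the partition column (here trade_date)
# lets Spark read only the matching S3 prefixes, not the whole dataset.
trades = (
    spark.read.parquet("s3://example-lake/silver/trades/")  # partitioned by trade_date
    .where(col("trade_date") == "2025-07-01")
)

# Broadcast join: ship the small reference table to every executor and
# avoid shuffling the large fact table.
instruments = spark.read.parquet("s3://example-lake/silver/instruments/")
enriched = trades.join(broadcast(instruments), on="instrument_id", how="left")

# Repartition before writing to control output file counts/sizes.
enriched.repartition("trade_date").write.mode("append").partitionBy(
    "trade_date"
).parquet("s3://example-lake/gold/enriched_trades/")
```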

Posted 3 weeks ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies