5.0 - 9.0 years
11 - 15 Lacs
Bengaluru
Work from Office
BI Tools or Data Acceleration/Data Processing deployment and administration
Any previous experience administering in-memory columnar databases such as Exasol, Greenplum, Vertica, or Snowflake
Strong analytical and problem-solving skills
Ability to communicate orally and in writing in a clear and straightforward manner with a broad range of technical and non-technical users and stakeholders
Proactive and focused on results and success; conveys a sense of urgency and drives issues to closure
Should be a team player and leader: flexible, hardworking, self-motivated, with a positive outlook and the ability to take on difficult initiatives and challenges
Ability to handle multiple concurrent projects
Details requested with your application: First Name, Last Name, Date of Birth, Passport No. and Expiry Date, Alternate Contact Number, Total Experience, Relevant Experience, Current CTC, Expected CTC, Current Location, Preferred Location, Current Organization, Payroll Company, Notice Period, Holding Any Offer
Posted 1 month ago
5.0 - 8.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Job description: Job Description

Role Purpose
The purpose of the role is to support process delivery by ensuring daily performance of the Production Specialists, resolving technical escalations, and developing technical capability within the Production Specialists.

Do
Oversee and support the process by reviewing daily transactions on performance parameters
Review the performance dashboard and the scores for the team
Support the team in improving performance parameters by providing technical support and process guidance
Record, track, and document all queries received, problem-solving steps taken, and total successful and unsuccessful resolutions
Ensure standard processes and procedures are followed to resolve all client queries
Resolve client queries as per the SLAs defined in the contract
Develop an understanding of the process/product for the team members to facilitate better client interaction and troubleshooting
Document and analyze call logs to spot the most frequent trends and prevent future problems
Identify red flags and escalate serious client issues to the Team Leader in cases of untimely resolution
Ensure all product information and disclosures are given to clients before and after the call/email requests
Avoid legal challenges by monitoring compliance with service agreements

Handle technical escalations through effective diagnosis and troubleshooting of client queries
Manage and resolve technical roadblocks/escalations as per SLA and quality requirements
If unable to resolve an issue, escalate it to TA & SES in a timely manner
Provide product support and resolution to clients by performing a question diagnosis while guiding users through step-by-step solutions
Troubleshoot all client queries in a user-friendly, courteous, and professional manner
Offer alternative solutions to clients (where appropriate) with the objective of retaining customers' and clients' business
Organize ideas and effectively communicate oral messages appropriate to listeners and situations
Follow up and make scheduled callbacks to customers to record feedback and ensure compliance with contract SLAs

Build people capability to ensure operational excellence and maintain superior customer service levels for the existing account/client
Mentor and guide Production Specialists on improving technical knowledge
Collate trainings to be conducted as triage to bridge the skill gaps identified through interviews with the Production Specialists
Develop and conduct trainings (triages) within products for Production Specialists as per target
Inform the client about the triages being conducted
Undertake product trainings to stay current with product features, changes, and updates
Enroll in product-specific and any other trainings per client requirements/recommendations
Identify and document the most common problems and recommend appropriate resolutions to the team
Update job knowledge by participating in self-learning opportunities and maintaining personal networks

Deliver (Performance Parameter: Measure)
1. Process: No. of cases resolved per day, compliance to process and quality standards, meeting process-level SLAs, Pulse score, Customer feedback, NSAT/ESAT
2. Team Management: Productivity, efficiency, absenteeism
3. Capability development: Triages completed, Technical Test performance

Mandatory Skills: HP Vertica
Experience: 5-8 Years

Reinvent your world. We are building a modern Wipro. We are an end-to-end digital transformation partner with the boldest ambitions. To realize them, we need people inspired by reinvention.
Of yourself, your career, and your skills. We want to see the constant evolution of our business and our industry. It has always been in our DNA - as the world around us changes, so do we. Join a business powered by purpose and a place that empowers you to design your own reinvention. Come to Wipro. Realize your ambitions. Applications from people with disabilities are explicitly welcome.
Posted 1 month ago
12.0 years
0 Lacs
Mumbai Metropolitan Region
On-site
Job Title: Head of Database Administration (12-15 Years' Experience)
Location: Mumbai, India

The Opportunity
Netcore is seeking a visionary and hands-on Head of Database Administration to take complete ownership of our diverse and large-scale database ecosystem. This is a critical leadership role for a strategic partner who will not only ensure operational excellence but also drive the architectural and financial future of our data infrastructure. You will be responsible for a multitude of database technologies, including MySQL, MongoDB, Vertica, Cassandra, PostgreSQL, Elasticsearch, and Druid, with plans to incorporate BigQuery. This position demands a perfect blend of deep technical expertise across various database systems, strategic thinking, and proven leadership. You will lead the charge in optimizing our database costs, modernizing our architecture, and ensuring the security, reliability, scalability, and performance of the data platforms that power our core products.

Key Responsibilities

Strategic & Architectural Leadership
Database Strategy: Define and execute a long-term database strategy that aligns with business objectives, focusing on high-scale, secure, and resilient systems.
Architectural Ownership: Lead architectural decisions for all database solutions, embedding security-by-design principles to ensure scalability, resilience, and high availability across our multi-cloud (GCP/AWS) environment.
Cost Optimization: Develop and implement aggressive strategies for database cost optimization and management across all platforms, providing regular reporting on savings and efficiency.
Technology Research & Innovation: Lead research and evaluation of new and emerging database technologies and features to enhance performance, reduce costs, and modernize our stack. Drive the strategy for database upgrades and consolidation.
Capacity Planning: Proactively forecast database capacity needs and develop plans to meet future growth in user base, transaction volumes, and data processing loads.

Operational Excellence & Governance
Database Operations: Own the end-to-end management, health, and performance of all production and non-production database environments.
Monitoring & Alerting: Design, implement, and refine a comprehensive monitoring, logging, and alerting strategy to ensure system health and enable proactive issue resolution.
Automation: Champion and implement automation for all facets of database administration, including provisioning, patching, configuration, and routine maintenance.
Database Security & Hardening: Own and implement comprehensive database security measures, including data encryption (at-rest and in-transit), access control policies (IAM), vulnerability management, regular security audits, and database hardening best practices.
Performance Tuning: Act as the final point of escalation for complex performance issues. Lead deep-dive analysis and tuning for our diverse database systems.
Backup & Disaster Recovery: Architect, implement, and regularly test robust backup, restore, and disaster recovery plans to ensure data integrity and business continuity.

Team Leadership & Collaboration
Team Leadership: Build, mentor, and lead a high-performing team of Database Administrators and Engineers, fostering a culture of innovation and continuous improvement.
Cross-Functional Partnership: Work closely with Development, DevOps, Security, and Product teams to streamline workflows and ensure the seamless delivery of reliable and performant applications.
Security & Compliance: Champion and enforce robust data security policies and DevSecOps best practices throughout the database lifecycle. Ensure all database systems are compliant with industry standards (e.g., SOC 2, ISO 27001) and data privacy regulations.
Executive Communication: Effectively communicate database strategy, project status, risks, and performance metrics to executive leadership and key stakeholders.

Essential Skills & Experience
Experience: 12-15+ years in database engineering/administration, with at least 5 years in a leadership role managing teams in high-scale, 24x7 production environments.
Polyglot Database Expertise: Deep, hands-on experience managing a diverse set of database technologies at scale, including: Relational Databases: MySQL, PostgreSQL. NoSQL Databases: MongoDB, Cassandra. Columnar/Analytics Databases: Vertica, Druid. Search Platforms: Elasticsearch.
Database Security Expertise: Deep understanding of database security principles. This includes hands-on experience implementing encryption, network security, user access control (RBAC/IAM), vulnerability scanning, audit logging, and ensuring compliance with standards like SOC 2 and ISO 27001.
Cloud & Cost Management: Proven experience managing databases in a multi-cloud environment (GCP, AWS). Demonstrable track record of implementing significant cost optimization strategies for cloud database workloads.
Architectural Design: Strong experience in designing and implementing secure, scalable, and highly available database architectures.
Automation Mindset: A strong automation mindset with proficiency in scripting languages (e.g., Python, Bash). Experience with Infrastructure as Code (e.g., Terraform) for database provisioning is highly preferred.
Observability: Expertise in designing and implementing comprehensive monitoring and observability solutions for large-scale distributed databases (e.g., using Prometheus, Grafana, and cloud-native tools).
Leadership & Communication: Exceptional ability to lead, inspire, and mentor a technical team. Capable of articulating complex technical concepts to both technical and non-technical audiences.
Posted 1 month ago
3.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
This position will be part of a growing team working towards building world-class, large-scale Big Data architectures. This individual should have a sound understanding of programming principles, experience in programming in Java, Python, or similar languages, and can expect to spend a majority of their time coding.

Location: Bengaluru
Work Experience: 3 - 5 Years

Responsibilities:
Good development practices
Hands-on coder with good experience in programming languages like Java, Python, C++, or Scala
Good understanding of programming principles and development practices like check-in policy, unit testing, and code deployment
Self-starter able to grasp new concepts and technology and translate them into large-scale engineering developments
Excellent experience in application development and support, integration development, and data management
Align Sigmoid with key client initiatives
Interface daily with customers across leading Fortune 500 companies to understand strategic requirements
Stay up-to-date on the latest technology to ensure the greatest ROI for customers & Sigmoid
Hands-on coder with a good understanding of enterprise-level code
Design and implement APIs, abstractions, and integration patterns to solve challenging distributed computing problems
Experience in defining technical requirements, data extraction, data transformation, automating jobs, productionizing jobs, and exploring new big data technologies within a parallel processing environment

Culture:
Must be a strategic thinker with the ability to think unconventionally / out-of-the-box
Analytical and data-driven orientation
Raw intellect, talent, and energy are critical
Entrepreneurial and agile: understands the demands of a private, high-growth company
Ability to be both a leader and a hands-on "doer"

Qualifications:
Years of track record of relevant work experience and a Computer Science or related technical discipline is required
Experience with functional and object-oriented programming; Python or Scala a must
Effective communication skills (both written and verbal)
Ability to collaborate with a diverse set of engineers, data scientists, and product managers
Comfort in a fast-paced start-up environment

Preferred Qualifications:
Technical knowledge in Spark, Hadoop & GCS stack; Vertica, Snowflake, Druid a plus
Development and support experience in the Big Data domain
Experience in agile methodology
Experience with database modeling and development, data mining, and warehousing
Experience in architecture and delivery of enterprise-scale applications, and capability in developing frameworks, design patterns, etc.
Should be able to understand and tackle technical challenges, propose comprehensive solutions, and guide junior staff
Experience working with large, complex data sets from a variety of sources
Posted 1 month ago
6.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Our client is a leading mobile marketing and audience platform that empowers the app ecosystem through cutting-edge solutions in mobile marketing, audience building, and monetization. With direct integration into over 500,000 monthly active mobile apps, the platform leverages global first-party data to unlock valuable insights, predict behaviors, and drive growth. We are looking for an experienced and innovative Senior Business Analyst to join their Operational Department.

Job Description:

Key Responsibilities:
• Cross-Functional Collaboration: Act as a key analytics partner for business, product, and R&D teams, aligning projects with strategic goals.
• Data Analysis & Insights: Design and execute analytics projects, including quantitative analysis, statistical modeling, automated monitoring tools, and advanced data insights.
• Business Opportunity Identification: Leverage our client's extensive first-party data to identify trends, predict behaviors, and uncover growth opportunities.
• Strategic Reporting: Create impactful dashboards, reports, and presentations to communicate insights and recommendations to stakeholders at all levels.
• Innovation: Drive the use of advanced analytics techniques, such as machine learning and predictive modeling, to enhance decision-making processes.

Requirements:
• Experience: 6+ years as a Data Analyst (or similar role) in media, marketing, or a related industry.
• Technical Skills: Proficiency in SQL and Excel, with experience working with large datasets and big data tools (e.g., Vertica, Redshift, Hadoop, Spark). Familiarity with BI and visualization tools (e.g., Tableau, MicroStrategy).
• Analytical Expertise: Strong problem-solving skills, statistical modelling knowledge, and familiarity with predictive analytics and machine learning algorithms.
• Strategic Thinking: Ability to align data insights with business objectives, demonstrating creativity and out-of-the-box thinking.
• Soft Skills: Proactive, independent, collaborative, and results-driven with excellent communication skills in English.
• Educational Background: BSc in Industrial Engineering, Computer Science, Mathematics, or a related field (MSc/MBA is an advantage).

*** Only candidates residing in Bangalore will be considered.
Posted 1 month ago
7.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Hello Folks! We're Hiring – Senior Data Engineer (AWS) || Hyderabad ||

We are hiring for a product-based company for permanent roles! Join our innovative team and contribute to cutting-edge solutions.

Job Title: Senior Data Engineer

What you'll be doing… (Data Engineer – Corporate Technology Data Engineering & Analytics) (Full-Time, Hyderabad)

The Opportunity
Join our dynamic team as a Data Engineer – Corporate Technology Data Engineering & Analytics, where you'll play a pivotal role in driving the execution of our data and technology strategy. This role is crucial in driving digital transformation and operational efficiency across Investment Management. As part of this role, you will engage in building data solutions including streaming and batch pipelines, data marts, and data warehouses. You will be responsible for establishing robust data collection and processing pipelines to fulfill Investment Management business requirements.

The Team
You'll be an integral part of our esteemed Corporate Technology Team, comprising 6 stacks: Investments, Finance, Risk & Law, HR & Employee Experience (EE), Data Engineering & Analytics, Portfolio, and Strategy. Our team operates on a global scale, driving innovation and excellence across diverse areas of expertise. As a Data Engineer, you'll play a critical role in high-impact Corporate Technology Investment Initiatives, ensuring alignment with organizational objectives and driving impactful outcomes. This is an opportunity to collaborate closely with the Corporate Technology Data and Analytics team and Investment Management business stakeholders. Our team thrives on collaboration, innovation, and a shared commitment to excellence. Together, we're shaping the future of technology within our organization and making a lasting impact on a global scale. Join us and be part of a dynamic team where your contributions will be valued and your potential unleashed.

The Impact:
• Design, build, and measure complex ELT jobs to process disparate data sources and form a high-integrity, high-quality, clean data asset.
• Execute and provide feedback on data modeling policies, procedures, processes, and standards.
• Assist with capturing and documenting system flow and other pertinent technical information about data, database design, and systems.
• Develop comprehensive data quality standards and implement effective tools to ensure data accuracy and reliability.
• Collaborate with various Investment Management departments to gain a better understanding of new data patterns.
• Collaborate with Data Analysts, Data Architects, and BI developers to ensure design and development of scalable data solutions aligned with business goals.
• Translate high-level business requirements into detailed technical specs.

The Minimum Qualifications
Education: Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.
Experience:
• 7-9 years of experience with data analytics, data modeling, and database design.
• 3+ years of coding and scripting (Python, Java, Scala) and design experience.
• 3+ years of experience with the Spark framework.
• 5+ years of experience with ELT methodologies and tools.
• 5+ years of mastery in designing, developing, tuning, and troubleshooting SQL.
• Knowledge of Informatica PowerCenter and Informatica IDMC.
• Knowledge of distributed, column-oriented technologies used to build high-performance databases such as Vertica and Snowflake.
• Strong data analysis skills for extracting insights from financial data
• Proficiency in reporting tools (e.g., Power BI, Tableau).

The Ideal Qualifications
Technical Skills:
• Domain knowledge of Investment Management operations including Security Masters, Securities Trade and Recon Operations, Reference data management, and Pricing.
• Familiarity with regulatory requirements and compliance standards in the investment management industry.
• Experience with IBORs such as Blackrock Alladin, CRD, Eagle STAR (ABOR), Eagle Pace, and Eagle DataMart.
• Familiarity with investment data platforms such as GoldenSource, FINBOURNE, NeoXam, RIMES, and JPM Fusion.
Soft Skills:
• Strong analytical and problem-solving abilities.
• Exceptional communication and interpersonal skills.
• Ability to influence and motivate teams without direct authority.
• Excellent time management and organizational skills, with the ability to prioritize multiple initiatives.

What to Expect as Part of ******** and the Team
• Regular meetings with the Corporate Technology leadership team
• Focused one-on-one meetings with your manager
• Access to mentorship opportunities
• Access to learning content on Degreed and other informational platforms
• Your ethics and integrity will be valued by a company with a strong and stable ethical business with industry-leading pay and benefits

Experience: 6+ years (mandatory)
Location: Hyderabad
Work Mode: Hybrid
Notice period: 60 days max
If interested, please share your CV to Ramanjaneya.m@technogenindia.com
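For readers unfamiliar with the ELT pattern this posting emphasizes (bulk-load first, then transform inside the warehouse), here is a minimal, illustrative Python sketch using the vertica-python driver. The connection settings, schema, and table names are placeholders for illustration only, not details from the role.

```python
import vertica_python

# Placeholder connection settings -- replace with real environment values.
CONN_INFO = {
    "host": "vertica.example.internal",
    "port": 5433,
    "user": "etl_user",
    "password": "********",
    "database": "analytics",
}

def load_and_transform(csv_path: str) -> None:
    """Bulk-load a staged CSV extract, then transform inside the database (the 'T' after the 'L')."""
    with vertica_python.connect(**CONN_INFO) as conn:
        cur = conn.cursor()
        # Load: stream the raw extract straight into a staging table.
        with open(csv_path, "rb") as f:
            cur.copy("COPY staging.trades FROM STDIN DELIMITER ',' ABORT ON ERROR", f)
        # Transform: aggregate inside Vertica rather than in application code.
        cur.execute("""
            INSERT INTO mart.daily_trade_summary (trade_date, instrument_id, total_quantity)
            SELECT trade_date, instrument_id, SUM(quantity)
            FROM staging.trades
            GROUP BY trade_date, instrument_id
        """)
        conn.commit()

if __name__ == "__main__":
    load_and_transform("/data/staging/trades_2024.csv")
```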
Posted 1 month ago
5.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Start.io is a mobile marketing and audience platform. Start.io empowers the mobile app ecosystem and simplifies mobile marketing, audience building, and mobile monetization. Start.io's direct integration with over 500,000 monthly active mobile apps provides access to unprecedented levels of global first-party data, which can be leveraged to understand and predict behaviors, identify new opportunities, and fuel growth. If you are a data enthusiast and want to participate in real-time data streams of billions of events from billions of users, your place is with us. Our data team is expanding, and we are actively seeking a passionate Data Analyst with expertise in numbers and SQL.

In this role, you will have the exciting opportunity to:
Work with large and complex data sets to solve a wide array of challenging problems using different analytical and statistical approaches.
Apply technical expertise with quantitative analysis, experimentation, data mining, and the presentation of data to develop our business strategies for our products that serve billions of people.
Identify and measure the success of product efforts through goal setting, forecasting, and monitoring of key product metrics to understand trends.
Define, understand, and test opportunities and levers to improve the product, and drive roadmaps through your insights and recommendations.
Partner with Business, Product, Engineering, and cross-functional teams to inform, influence, support, and execute product strategy and investment decisions.
Share your thoughts and insights within our production environment, which fosters creativity and open communication.
Thrive in a fast-paced company.

What you MUST bring:
Bachelor's degree in engineering, scientific fields, or equivalent (e.g., industrial engineering, information systems, computer science, statistics)
5 years of practical experience with SQL & Python
Fluent English and excellent communication skills (written and verbal)
Proficiency in MS Excel

What will be an advantage?
Experience working with large data sets and distributed computing tools (e.g., Vertica/Redshift, Hadoop, Spark) – a big advantage.
Knowledge or experience with statistical modeling, prediction, and ML algorithms – a big advantage.
Experience with BI tools: Tableau, MicroStrategy, Looker, etc.
Ability to drive and manage end-to-end processes and collaborate with multiple interfaces.
Experience in the Adtech/media ecosystem.

Other requirements:
High-level analytical and problem-solving skills.
Demonstrated business strategic and creative thinking.
Proactive, independent, and self-motivated.
Posted 1 month ago
8.0 - 13.0 years
18 - 27 Lacs
Bengaluru
Work from Office
About Persistent We are a trusted Digital Engineering and Enterprise Modernization partner, combining deep technical expertise and industry experience to help our clients anticipate what’s next. Our offerings and proven solutions create a unique competitive advantage for our clients by giving them the power to see beyond and rise above. We work with many industry-leading organizations across the world including 12 of the 30 most innovative US companies, 80% of the largest banks in the US and India, and numerous innovators across the healthcare ecosystem. Our growth trajectory continues, as we reported $1,231M annual revenue (16% Y-o-Y). Along with our growth, we’ve onboarded over 4900 new employees in the past year, bringing our total employee count to over 23,500+ people located in 19 countries across the globe. Persistent Ltd. is dedicated to fostering diversity and inclusion in the workplace. We invite applications from all qualified individuals, including those with disabilities, and regardless of gender or gender preference. We welcome diverse candidates from all backgrounds. For more details please login to www.persistent.com About The Position We are looking for a Data Architect with creativity and results-oriented critical thinking to meet complex challenges and develop new strategies for acquiring, analyzing, modeling and storing data. In this role you will guide the company into the future and utilize the latest technology and information management methodologies to meet our requirements for effective logical data modeling, metadata management and database warehouse domains. You will be working with experts in a variety of industries, including computer science and software development, as well as department heads and senior executives to integrate new technologies and refine system performance. We reward dedicated performance with exceptional pay and benefits, as well as tuition reimbursement and career growth opportunities. 
What You'll Do
Define data retention policies
Monitor performance and advise on any necessary infrastructure changes
Mentor junior engineers and work with other architects to deliver best-in-class solutions
Implement ETL/ELT processes and orchestration of data flows
Recommend and drive adoption of newer tools and techniques from the big data ecosystem

Expertise You'll Bring
10+ years in industry, building and managing big data systems
Building, monitoring, and optimizing reliable and cost-efficient pipelines for SaaS is a must
Building stream-processing systems, using solutions such as Storm or Spark Streaming
Dealing and integrating with data storage systems like SQL and NoSQL databases, file systems, and object storage like S3
Reporting solutions like Pentaho, Power BI, Looker, including customizations
Developing high-concurrency, high-performance applications that are database-intensive and have interactive, browser-based clients
Working with SaaS-based data management products will be an added advantage
Proficiency and expertise in Cloudera/Hortonworks, Spark, HDF, and NiFi
RDBMS and NoSQL like Vertica, Redshift; data modelling with physical design and SQL performance optimization
Messaging systems: JMS, ActiveMQ, RabbitMQ, Kafka
Big Data technology like Hadoop, Spark, NoSQL-based data-warehousing solutions
Data warehousing, reporting including customization, Hadoop, Spark, Kafka, Core Java, Spring/IoC, design patterns
Big Data querying tools, such as Pig, Hive, and Impala
Open-source technologies and databases (SQL & NoSQL)
Proficient understanding of distributed computing principles
Ability to solve any ongoing issues with operating the cluster
Scale data pipelines using open-source components and AWS services
Cloud (AWS) provisioning, capacity planning, and performance analysis at various levels
Web-based SOA architecture implementation with design pattern experience will be an added advantage

Benefits
Competitive salary and benefits package
Culture focused on talent development with quarterly promotion cycles and company-sponsored higher education and certifications
Opportunity to work with cutting-edge technologies
Employee engagement initiatives such as project parties, flexible work hours, and Long Service awards
Annual health check-ups
Insurance coverage: group term life, personal accident, and Mediclaim hospitalization for self, spouse, two children, and parents

Inclusive Environment
• We offer hybrid work options and flexible working hours to accommodate various needs and preferences.
• Our office is equipped with accessible facilities, including adjustable workstations, ergonomic chairs, and assistive technologies to support employees with physical disabilities.

Let's unleash your full potential. See Beyond, Rise Above.
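As an illustration of the stream-processing expertise listed above (Spark Streaming reading from Kafka), here is a minimal, hedged PySpark Structured Streaming sketch; the broker address, topic name, and event schema are hypothetical placeholders rather than details of this role.

```python
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, from_json, window, sum as sum_
from pyspark.sql.types import StructType, StringType, DoubleType, TimestampType

# Requires the spark-sql-kafka connector package on the Spark classpath.
spark = SparkSession.builder.appName("event-aggregator").getOrCreate()

# Hypothetical schema for JSON events arriving on a Kafka topic.
schema = (StructType()
          .add("event_id", StringType())
          .add("account_id", StringType())
          .add("amount", DoubleType())
          .add("event_time", TimestampType()))

raw = (spark.readStream
       .format("kafka")
       .option("kafka.bootstrap.servers", "broker-1:9092")   # placeholder broker
       .option("subscribe", "events")                        # placeholder topic
       .load())

parsed = (raw.selectExpr("CAST(value AS STRING) AS json")
          .select(from_json(col("json"), schema).alias("e"))
          .select("e.*"))

# Windowed aggregation with a watermark to bound state for late-arriving events.
per_account = (parsed
               .withWatermark("event_time", "10 minutes")
               .groupBy(window(col("event_time"), "5 minutes"), col("account_id"))
               .agg(sum_("amount").alias("total_amount")))

query = (per_account.writeStream
         .outputMode("update")
         .format("console")   # swap for a real sink (e.g., a warehouse) in production
         .start())
query.awaitTermination()
```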
Posted 1 month ago
8.0 - 13.0 years
18 - 30 Lacs
Pune
Work from Office
About Persistent We are a trusted Digital Engineering and Enterprise Modernization partner, combining deep technical expertise and industry experience to help our clients anticipate what’s next. Our offerings and proven solutions create a unique competitive advantage for our clients by giving them the power to see beyond and rise above. We work with many industry-leading organizations across the world including 12 of the 30 most innovative US companies, 80% of the largest banks in the US and India, and numerous innovators across the healthcare ecosystem. Our growth trajectory continues, as we reported $1,231M annual revenue (16% Y-o-Y). Along with our growth, we’ve onboarded over 4900 new employees in the past year, bringing our total employee count to over 23,500+ people located in 19 countries across the globe. Persistent Ltd. is dedicated to fostering diversity and inclusion in the workplace. We invite applications from all qualified individuals, including those with disabilities, and regardless of gender or gender preference. We welcome diverse candidates from all backgrounds. For more details please login to www.persistent.com About The Position We are looking for a Data Architect with creativity and results-oriented critical thinking to meet complex challenges and develop new strategies for acquiring, analyzing, modeling and storing data. In this role you will guide the company into the future and utilize the latest technology and information management methodologies to meet our requirements for effective logical data modeling, metadata management and database warehouse domains. You will be working with experts in a variety of industries, including computer science and software development, as well as department heads and senior executives to integrate new technologies and refine system performance. We reward dedicated performance with exceptional pay and benefits, as well as tuition reimbursement and career growth opportunities. 
What You'll Do
Define data retention policies
Monitor performance and advise on any necessary infrastructure changes
Mentor junior engineers and work with other architects to deliver best-in-class solutions
Implement ETL/ELT processes and orchestration of data flows
Recommend and drive adoption of newer tools and techniques from the big data ecosystem

Expertise You'll Bring
10+ years in industry, building and managing big data systems
Building, monitoring, and optimizing reliable and cost-efficient pipelines for SaaS is a must
Building stream-processing systems, using solutions such as Storm or Spark Streaming
Dealing and integrating with data storage systems like SQL and NoSQL databases, file systems, and object storage like S3
Reporting solutions like Pentaho, Power BI, Looker, including customizations
Developing high-concurrency, high-performance applications that are database-intensive and have interactive, browser-based clients
Working with SaaS-based data management products will be an added advantage
Proficiency and expertise in Cloudera/Hortonworks, Spark, HDF, and NiFi
RDBMS and NoSQL like Vertica, Redshift; data modelling with physical design and SQL performance optimization
Messaging systems: JMS, ActiveMQ, RabbitMQ, Kafka
Big Data technology like Hadoop, Spark, NoSQL-based data-warehousing solutions
Data warehousing, reporting including customization, Hadoop, Spark, Kafka, Core Java, Spring/IoC, design patterns
Big Data querying tools, such as Pig, Hive, and Impala
Open-source technologies and databases (SQL & NoSQL)
Proficient understanding of distributed computing principles
Ability to solve any ongoing issues with operating the cluster
Scale data pipelines using open-source components and AWS services
Cloud (AWS) provisioning, capacity planning, and performance analysis at various levels
Web-based SOA architecture implementation with design pattern experience will be an added advantage

Benefits
Competitive salary and benefits package
Culture focused on talent development with quarterly promotion cycles and company-sponsored higher education and certifications
Opportunity to work with cutting-edge technologies
Employee engagement initiatives such as project parties, flexible work hours, and Long Service awards
Annual health check-ups
Insurance coverage: group term life, personal accident, and Mediclaim hospitalization for self, spouse, two children, and parents

Inclusive Environment
• We offer hybrid work options and flexible working hours to accommodate various needs and preferences.
• Our office is equipped with accessible facilities, including adjustable workstations, ergonomic chairs, and assistive technologies to support employees with physical disabilities.

Let's unleash your full potential. See Beyond, Rise Above.
Posted 1 month ago
8.0 - 13.0 years
18 - 25 Lacs
Hyderabad
Work from Office
About Persistent We are a trusted Digital Engineering and Enterprise Modernization partner, combining deep technical expertise and industry experience to help our clients anticipate what’s next. Our offerings and proven solutions create a unique competitive advantage for our clients by giving them the power to see beyond and rise above. We work with many industry-leading organizations across the world including 12 of the 30 most innovative US companies, 80% of the largest banks in the US and India, and numerous innovators across the healthcare ecosystem. Our growth trajectory continues, as we reported $1,231M annual revenue (16% Y-o-Y). Along with our growth, we’ve onboarded over 4900 new employees in the past year, bringing our total employee count to over 23,500+ people located in 19 countries across the globe. Persistent Ltd. is dedicated to fostering diversity and inclusion in the workplace. We invite applications from all qualified individuals, including those with disabilities, and regardless of gender or gender preference. We welcome diverse candidates from all backgrounds. For more details please login to www.persistent.com About The Position We are looking for a Data Architect with creativity and results-oriented critical thinking to meet complex challenges and develop new strategies for acquiring, analyzing, modeling and storing data. In this role you will guide the company into the future and utilize the latest technology and information management methodologies to meet our requirements for effective logical data modeling, metadata management and database warehouse domains. You will be working with experts in a variety of industries, including computer science and software development, as well as department heads and senior executives to integrate new technologies and refine system performance. We reward dedicated performance with exceptional pay and benefits, as well as tuition reimbursement and career growth opportunities. 
What You'll Do
Define data retention policies
Monitor performance and advise on any necessary infrastructure changes
Mentor junior engineers and work with other architects to deliver best-in-class solutions
Implement ETL/ELT processes and orchestration of data flows
Recommend and drive adoption of newer tools and techniques from the big data ecosystem

Expertise You'll Bring
10+ years in industry, building and managing big data systems
Building, monitoring, and optimizing reliable and cost-efficient pipelines for SaaS is a must
Building stream-processing systems, using solutions such as Storm or Spark Streaming
Dealing and integrating with data storage systems like SQL and NoSQL databases, file systems, and object storage like S3
Reporting solutions like Pentaho, Power BI, Looker, including customizations
Developing high-concurrency, high-performance applications that are database-intensive and have interactive, browser-based clients
Working with SaaS-based data management products will be an added advantage
Proficiency and expertise in Cloudera/Hortonworks, Spark, HDF, and NiFi
RDBMS and NoSQL like Vertica, Redshift; data modelling with physical design and SQL performance optimization
Messaging systems: JMS, ActiveMQ, RabbitMQ, Kafka
Big Data technology like Hadoop, Spark, NoSQL-based data-warehousing solutions
Data warehousing, reporting including customization, Hadoop, Spark, Kafka, Core Java, Spring/IoC, design patterns
Big Data querying tools, such as Pig, Hive, and Impala
Open-source technologies and databases (SQL & NoSQL)
Proficient understanding of distributed computing principles
Ability to solve any ongoing issues with operating the cluster
Scale data pipelines using open-source components and AWS services
Cloud (AWS) provisioning, capacity planning, and performance analysis at various levels
Web-based SOA architecture implementation with design pattern experience will be an added advantage

Benefits
Competitive salary and benefits package
Culture focused on talent development with quarterly promotion cycles and company-sponsored higher education and certifications
Opportunity to work with cutting-edge technologies
Employee engagement initiatives such as project parties, flexible work hours, and Long Service awards
Annual health check-ups
Insurance coverage: group term life, personal accident, and Mediclaim hospitalization for self, spouse, two children, and parents

Inclusive Environment
• We offer hybrid work options and flexible working hours to accommodate various needs and preferences.
• Our office is equipped with accessible facilities, including adjustable workstations, ergonomic chairs, and assistive technologies to support employees with physical disabilities.

Let's unleash your full potential. See Beyond, Rise Above.
Posted 1 month ago
7.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Role: Data Engineer
Location: Hyderabad
Experience: 7-9 Years

Experience:
7-9 years of experience with data analytics, data modeling, and database design.
3+ years of coding and scripting (Python, Java, Scala) and design experience.
3+ years of experience with the Spark framework.
5+ years of experience with ELT methodologies and tools.
5+ years of mastery in designing, developing, tuning, and troubleshooting SQL.
Knowledge of Informatica PowerCenter and Informatica IDMC.
Knowledge of distributed, column-oriented technologies used to build high-performance databases such as Vertica and Snowflake.
Strong data analysis skills for extracting insights from financial data.
Proficiency in reporting tools (e.g., Power BI, Tableau).

The Ideal Qualifications
Technical Skills:
Domain knowledge of Investment Management operations including Security Masters, Securities Trade and Recon Operations, Reference data management, and Pricing.
Familiarity with regulatory requirements and compliance standards in the investment management industry.
Experience with IBORs such as Blackrock Alladin, CRD, Eagle STAR (ABOR), Eagle Pace, and Eagle DataMart.
Familiarity with investment data platforms such as GoldenSource, FINBOURNE, NeoXam, RIMES, and JPM Fusion.
Posted 1 month ago
6.0 - 10.0 years
20 - 30 Lacs
Hyderabad
Hybrid
Position: Senior/Lead Python Developer
Location: Hyderabad (Hybrid Mode)
Duration: Full-time with CASPEX
End client: EXPERIAN

Python & FastAPI Lead Architect | Microservices | AWS | Vertica

Job Summary:
We are seeking a highly skilled and experienced Python Lead and Architect to design and lead the development of scalable, high-performance microservices using FastAPI on AWS. The ideal candidate will have deep expertise in handling large-scale data processing with Vertica, and a strong background in architecting distributed systems and leading engineering teams.

Key Responsibilities:
Technology lead with hands-on experience in the development of microservices-based applications using Python and FastAPI.
Design and implement scalable, secure, and resilient systems on AWS (ECS, Lambda, S3, RDS, etc.).
Build and optimize data pipelines and services that interact with large datasets in Vertica.
Collaborate with data engineers, DevOps, and product teams to deliver end-to-end solutions.
Define and enforce coding standards, architecture principles, and best practices.
Conduct code reviews, mentor junior developers, and drive continuous improvement.
Ensure system performance, reliability, and observability through monitoring and logging.
Lead the migration of legacy systems to modern, cloud-native architectures.

Required Skills & Qualifications:
8+ years of experience in backend development with Python.
3+ years of hands-on experience with FastAPI and asynchronous programming.
Strong experience in designing and deploying microservices architectures.
Proficiency in AWS services and cloud-native application design.
Deep understanding of Vertica or similar columnar databases for large-scale data processing.
Experience with AWS ECS, Fargate, Docker, and CI/CD pipelines.
Solid understanding of RESTful API design, security, and performance optimization.
Familiarity with data modeling, ETL processes, and analytics workflows.
Excellent problem-solving, communication, and leadership skills.

Preferred Qualifications:
Knowledge of data governance, compliance, and security best practices.
Exposure to DevOps practices and infrastructure as code (Terraform, CloudFormation).
Contributions to open-source projects or technical blogs.
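To make the FastAPI-plus-Vertica combination described above concrete, here is a minimal, illustrative sketch of a read-only reporting endpoint. The connection settings, table, endpoint path, and parameter style are assumptions for illustration; a production service would add connection pooling, authentication, and async I/O.

```python
from typing import Dict, List
from fastapi import FastAPI, HTTPException
import vertica_python

app = FastAPI(title="reporting-service")

# Placeholder connection settings -- load from a secret manager in real code.
CONN_INFO = {
    "host": "vertica.example.internal",
    "port": 5433,
    "user": "svc_reporting",
    "password": "********",
    "database": "analytics",
}

@app.get("/accounts/{account_id}/daily-volume")
def daily_volume(account_id: str, days: int = 30) -> List[Dict]:
    """Return daily event counts for one account from a hypothetical Vertica fact table."""
    query = """
        SELECT event_date, COUNT(*) AS events
        FROM analytics.fact_events
        WHERE account_id = %s
          AND event_date >= CURRENT_DATE - %s
        GROUP BY event_date
        ORDER BY event_date
    """
    try:
        with vertica_python.connect(**CONN_INFO) as conn:
            cur = conn.cursor()
            cur.execute(query, (account_id, days))
            rows = cur.fetchall()
    except Exception as exc:  # kept broad for the sketch; narrow this in real code
        raise HTTPException(status_code=503, detail=f"Vertica query failed: {exc}")
    return [{"event_date": str(day), "events": count} for day, count in rows]
```

The app could then be served with, for example, uvicorn app:app (assuming the file is named app.py).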
Posted 1 month ago
3.0 years
0 Lacs
India
Remote
Role Performance Management Analyst Location: India, Remote Experience: 3+ years of experience in KPI design & performance frameworks Contract: 6 Months (Renewable based on performance) Key Qualifications: Bachelor’s in Business, IT, Data Science, or related field 3+ years of experience in KPI design & performance frameworks Proven expertise with Spider Impact or similar KPI tools Deep understanding of Balanced Scorecard methodology Experience integrating data from Oracle, VERTICA, ERP, CRM Skilled in designing scorecards, dashboards, and strategic KPIs Strong grip on data governance and reporting best practices What You’ll Do: Translate business strategy into measurable KPIs Configure and optimize KPI systems Deliver impactful scorecards & dashboards Ensure data accuracy & reporting excellence
Posted 1 month ago
1.0 years
0 Lacs
Mumbai, Maharashtra, India
On-site
Job Type: Full Time
Experience: 1 to 2 Years
Type: Virtual Hiring
Last Date: 30-June-2025
Posted on: 18-June-2025
Education: BE/B.Tech, MCA, ME/M.Tech

ADVERTISEMENT No. 02

Data Scientist/AI Engineer / 2 Posts
Age: 25 to 35 years
Qualification Mandatory: Full Time B.E./B.Tech – First class (minimum of 60% marks) or equivalent, or M.E./M.Tech/MCA in Computer Science/IT/Data Science/Machine Learning and AI.
Professional/Preferred Qualification: Certification in Data Science/AI/ML/Natural Language Processing/Web Crawling and Neural Networks.

Experience Essential:
1. Minimum 03 years of (post basic educational qualification) experience in a related field, out of which:
2+ years of experience with programming languages frequently used in data science (R/Python).
2+ years of experience in model development, model validation, or a related field.
2+ years of experience in data analytics.
2+ years of experience in Relational Databases or any NoSQL database, including Graph databases.
Experience in cloud-based application/service development.
Experience in natural language processing, Web Crawling, and Neural Networks.
Experience in projects with Machine Learning/Artificial Intelligence technologies.
Excellent communication skills and ability to work as part of a multicultural product development team.
End-to-end experience from data extraction to modelling and its validation.
Experience of working in a project environment as a developer.
Preference will be given to candidates with experience in the financial sector/banks/NBFCs/Insurance/Investment firms.

Mandatory Skill Set:
1. Technical expertise regarding data models/database design development, data mining, and segmentation techniques.
Expertise in Machine Learning technologies.
Expertise in testing & validation of quality and accuracy of AI models.
Expertise in developing models using structured, semi-structured, and unstructured data.
Expertise in Analytical Databases like Vertica DB or similar platforms.
Data Modelling and Data Intelligence/Data Cataloguing skills with tools like Alation.
SQL (DDL/DML/DQL).

Desirable Qualities:
1. Good understanding of Data Models and types of dimension modelling.
2. Experience in Conversational AI and dialogue systems.
3. Strong understanding of explainable and Responsible/Ethical AI frameworks.
4. Understanding of data protection techniques like encryption, data masking, and tokenization to safeguard sensitive data in transit and at rest.
5. Experience in designing secure solutions architecture for Cloud platforms (private/public/hybrid).
6. Experience with tools like NiFi, HBase, Spark, Pig, Storm, Flume, etc.
7. Experience in BI tools.
8. Expertise in MS Excel data analytics.
9. Expertise in usage and deployment of LLMs.

Key Responsibilities:
1. Be self-motivated, pro-active, and demonstrate an exceptional drive towards service delivery.
2. Identify valuable data sources and automate collection/collation processes.
3. Undertake preprocessing of structured and unstructured data.
4. Analyze information to discover trends and patterns.
5. Use AI/ML techniques to improve the quality of data or product offerings.
6. Find patterns and trends in datasets to uncover insights.
7. Create algorithms and data models to forecast outcomes.
8. Combine models through ensemble modelling.

Data Scientist-cum-BI Developer / 1 Post
Age: 23 to 30 years
Qualification Mandatory: Full Time B.E./B.Tech – First class (minimum of 60% marks) or equivalent, or M.E./M.Tech/MCA in Computer Science/IT/Data Science/Machine Learning and AI.
Professional/Preferred Qualification: Certification/Assignments/Projects in Data Science/AI/ML/Natural Language Processing/Web Crawling and Neural Networks.

Experience Essential:
1. Minimum 01 year of (post basic educational qualification) working experience on assignments/projects/jobs related to ML/AI.
2. Experience in projects with Machine Learning/Artificial Intelligence technologies.
3. Excellent communication skills and ability to work as part of a multicultural product development team.
4. End-to-end experience from data extraction to modelling and its validation.
5. Experience of working in a project environment as a developer.
6. Preference will be given to candidates with experience in the financial sector/banks/NBFCs/Insurance/Investment firms.

Mandatory Skill Set:
1. Technical expertise regarding data models/database design development, data mining, and segmentation techniques.
Expertise in Machine Learning technologies.
Expertise in testing & validation of quality and accuracy of AI models.
Expertise in developing models using structured, semi-structured, and unstructured data.
Expertise in Analytical Databases like Vertica DB or similar platforms.
Data Modelling and Data Intelligence/Data Cataloguing skills with tools like Alation.
SQL (DDL/DML/DQL).

Desired Skill Set:
1. Good understanding of Data Models and types of dimension modelling.
2. Experience in Conversational AI and dialogue systems.
3. Strong understanding of explainable and Responsible/Ethical AI frameworks.
4. Understanding of data protection techniques like encryption, data masking, and tokenization to safeguard sensitive data in transit and at rest.
5. Experience in designing secure solutions architecture for Cloud platforms (private/public/hybrid).
6. Experience with tools like NiFi, HBase, Spark, Pig, Storm, Flume, etc.
7. Experience in BI tools.
8. Expertise in MS Excel data analytics.
9. Expertise in usage and deployment of LLMs.

Key Responsibilities:
1. Be self-motivated, pro-active, and demonstrate an exceptional drive towards service delivery.
2. Identify valuable data sources and automate collection/collation processes.
3. Undertake preprocessing of structured and unstructured data.
4. Analyze information to discover trends and patterns.
5. Use AI/ML techniques to improve the quality of data or product offerings.
6. Find patterns and trends in datasets to uncover insights.
7. Create algorithms and data models to forecast outcomes.
8. Combine models through ensemble modelling.

Candidates can apply only online from 16th June 2025 to 30 June 2025.
Note: This is an aggregated job posting, shared to bring relevant opportunities to the notice of job seekers. Hireclap is not responsible for, or authorized to run, this recruitment process.
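Since both posts above list "combine models through ensemble modelling" among the key responsibilities, here is a minimal, illustrative scikit-learn sketch of that idea; the library choice and the toy dataset are assumptions for illustration, not requirements stated in the advertisement.

```python
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier, VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

# Toy dataset standing in for real business data.
X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=42)

# Two diverse base models combined by soft voting (averaged predicted probabilities).
ensemble = VotingClassifier(
    estimators=[
        ("logreg", LogisticRegression(max_iter=5000)),
        ("forest", RandomForestClassifier(n_estimators=200, random_state=42)),
    ],
    voting="soft",
)
ensemble.fit(X_train, y_train)
print("Ensemble accuracy:", accuracy_score(y_test, ensemble.predict(X_test)))
```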
Posted 1 month ago
3.0 years
0 Lacs
India
Remote
Role Performance Management Analyst Location: India, Remote Experience: 3+ years of experience in KPI design & performance frameworks Contract: 6 Months (Renewable based on performance) Key Qualifications: Bachelor’s in Business, IT, Data Science, or related field 3+ years of experience in KPI design & performance frameworks Proven expertise with Spider Impact or similar KPI tools Deep understanding of Balanced Scorecard methodology Experience integrating data from Oracle, VERTICA, ERP, CRM Skilled in designing scorecards, dashboards, and strategic KPIs Strong grip on data governance and reporting best practices What You’ll Do: Translate business strategy into measurable KPIs Configure and optimize KPI systems Deliver impactful scorecards & dashboards Ensure data accuracy & reporting excellence
Posted 1 month ago
3.0 years
0 Lacs
India
Remote
We have an urgent requirement for a STRATEGY ANALYST (Spider Impact platform) - PERFORMANCE MANAGEMENT with our client; the role is remote.

Job Purpose
As a Performance Management Specialist, you will be responsible for the design, implementation, and management of performance measurement frameworks using the Spider Impact product. This includes developing KPI scorecards, dashboards, and business planning initiatives that align with strategic objectives. You will collaborate with cross-functional teams to define roles and responsibilities, integrate data from various source systems, and ensure the accuracy and reliability of performance data. The role requires strong analytical skills, hands-on experience with Spider Impact, and the ability to translate business goals into measurable outcomes.

Principal Responsibilities
The following are the key responsibilities to be performed daily in this role.
Platform Administration: Maintain the Spider Impact platform, including user access, permissions, security roles, and system settings.
Scorecard & Dashboard Management: Design, configure, and update scorecards, KPIs, dashboards, charts, and strategic themes aligned with organizational goals.
Data Integration: Collaborate with IT and business teams to integrate data from Excel, SQL, APIs, and other sources; manage historical data uploads and import templates, developing up to 25 new scorecards annually.
Automation & Alerts: Set up alerts, workflows, and notifications to support timely business interventions.
Change Management: Prepare and submit change requests to Spider Impact's authorized partner for system enhancements and modifications.
Performance Review Facilitation: Prepare presentation-ready dashboards and reports for monthly, quarterly, and annual performance reviews.
User Support: Act as the first point of contact for user support and troubleshooting.
System Optimization: Monitor system adoption, recommend enhancements, and align usage with strategic priorities and best practices.
Continuous Improvement: Stay current with Spider Impact upgrades and features to ensure optimal system utilization.

TECHNICAL COMPETENCIES
An aspiring data analyst candidate requires a wide range of technical competencies to effectively analyze data, build data models, design appealing dashboards, and derive valuable insights. Here are some of the key technical competencies that are essential for a KPI Management Analyst:
Hands-on experience with the Spider Impact platform for at least 3 years.
Strong understanding of KPI frameworks, Balanced Scorecard methodology, and performance management.
Proficiency in data integration from Oracle, VERTICA, ERP, CRM, and other enterprise systems.
Ability to design and implement scorecards, dashboards, and strategic initiatives.
Familiarity with data governance, data quality assurance, and reporting best practices.
Strong analytical and problem-solving skills.

Qualifications, Experience and Skills
Desirable Qualification: Bachelor's degree in Computer Science, Information Technology, or a related field.
Work Experience: Proven experience (3+ years) as a KPI Management Analyst using the Spider Impact platform, or in a similar role, preferably in a large-scale organization or within the logistics industry.

Technical Skills Required to Perform the Role
Qualification: Bachelor's degree in Business, IT, Data Science, or a related field.
Experience: Minimum 3 years of hands-on experience with Spider Impact or similar KPI management systems.
Skills: Strong knowledge of KPI design and performance frameworks.
Experience in data integration and system configuration. Excellent communication and stakeholder management skills. Ability to translate business strategy into measurable KPIs.
Skills: KPI design, system configuration, stakeholder management, strategy, problem-solving skills, Spider Impact platform, performance management, analytical skills, data integration, communication skills
Posted 1 month ago
25.0 years
0 Lacs
Kochi, Kerala, India
On-site
Company Overview
Milestone Technologies is a global IT managed services firm that partners with organizations to scale their technology, infrastructure, and services to drive specific business outcomes such as digital transformation, innovation, and operational agility. Milestone is focused on building an employee-first, performance-based culture, and for over 25 years we have a demonstrated history of supporting category-defining enterprise clients that are growing ahead of the market. The company specializes in providing solutions across Application Services and Consulting, Digital Product Engineering, Digital Workplace Services, Private Cloud Services, AI/Automation, and ServiceNow. Milestone culture is built to provide a collaborative, inclusive environment that supports employees and empowers them to reach their full potential. Our seasoned professionals deliver services based on Milestone's best practices and service delivery framework. By leveraging our vast knowledge base to execute initiatives, we deliver both short-term and long-term value to our clients and apply continuous service improvement to deliver transformational benefits to IT. With Intelligent Automation, Milestone helps businesses further accelerate their IT transformation. The result is a sharper focus on business objectives and a dramatic improvement in employee productivity. Through our key technology partnerships and our people-first approach, Milestone continues to deliver industry-leading innovation to our clients. With more than 3,000 employees serving over 200 companies worldwide, we are following our mission of revolutionizing the way IT is deployed.

Job Overview
Job Summary: We are looking for a skilled Power BI Analyst with at least 3 years of experience in Power BI visualizations and a deep understanding of SQL. The ideal candidate will be responsible for creating interactive and insightful dashboards, optimizing data models, and ensuring data accuracy for business decision-making. This role requires strong analytical skills, business acumen, and the ability to transform complex datasets into meaningful insights.

Key Responsibilities

Power BI Development & Visualization
Design and develop interactive dashboards and reports in Power BI that provide actionable insights to business users.
Optimize data models, measures, and DAX calculations for efficient performance and accurate reporting.
Create visually compelling charts, graphs, and KPIs to enhance decision-making across various business functions.
Ensure the accuracy and consistency of reports by implementing data validation and cleansing techniques.
Work closely with stakeholders to understand business requirements and translate them into impactful data visualizations.

SQL & Data Management
Write and optimize complex SQL queries to extract, manipulate, and analyse large datasets from multiple sources.
Ensure data integrity by troubleshooting and resolving SQL-related issues.
Assist in data modelling and ETL processes to improve the efficiency of data pipelines.
Work with relational databases like SQL Server, PostgreSQL, MySQL, Snowflake, or Vertica.

Collaboration & Stakeholder Management
Partner with business teams to gather reporting needs and translate them into data-driven insights.
Provide training and support to business users on Power BI dashboard usage.
Work closely with data engineers, analysts, and IT teams to enhance data availability and quality.
Required Qualifications & Experience:
3+ years of experience in Power BI development with strong expertise in DAX and Power Query. Proficiency in SQL with the ability to write and optimize complex queries. Strong understanding of data visualization best practices and dashboard performance optimization. Hands-on experience working with large datasets and relational databases. Experience integrating Power BI with different data sources (SQL Server, APIs, Excel, Cloud Data Warehouses, etc.).
Preferred:
Experience with ETL tools, data modelling, and data warehousing concepts. Knowledge of Python or R for advanced data analysis (nice to have). Exposure to cloud platforms like Azure, AWS, or Google Cloud for data processing. Understanding of business intelligence (BI) and reporting frameworks.
Skills & Competencies:
Power BI Mastery – Expert in building interactive dashboards, reports, and data visualizations. SQL Expertise – Ability to handle complex queries and optimize database performance. Problem Solving – Strong analytical and critical thinking skills. Communication – Ability to explain technical insights to non-technical stakeholders. Attention to Detail – Ensuring accuracy and reliability in reporting. Business Acumen – Understanding business needs and translating them into data-driven solutions.
Compensation
Estimated Pay Range: Exact compensation and offers of employment are dependent on circumstances of each case and will be determined based on job-related knowledge, skills, experience, licenses or certifications, and location.
Our Commitment to Diversity & Inclusion
At Milestone we strive to create a workplace that reflects the communities we serve and work with, where we all feel empowered to bring our full, authentic selves to work. We know creating a diverse and inclusive culture that champions equity and belonging is not only the right thing to do for our employees but is also critical to our continued success. Milestone Technologies provides equal employment opportunity for all applicants and employees. All qualified applicants will receive consideration for employment and will not be discriminated against on the basis of race, color, religion, gender, gender identity, marital status, age, disability, veteran status, sexual orientation, national origin, or any other category protected by applicable federal and state law, or local ordinance. Milestone also makes reasonable accommodations for disabled applicants and employees. We welcome the unique background, culture, experiences, knowledge, innovation, self-expression and perspectives you can bring to our global community. Our recruitment team is looking forward to meeting you.
Posted 1 month ago
8.0 years
0 Lacs
Hyderabad, Telangana, India
Remote
LivePerson (NASDAQ: LPSN) is the global leader in enterprise conversations. Hundreds of the world’s leading brands — including HSBC, Chipotle, and Virgin Media — use our award-winning Conversational Cloud platform to connect with millions of consumers. We power nearly a billion conversational interactions every month, providing a uniquely rich data set and safety tools to unlock the power of Conversational AI for better customer experiences. At LivePerson, we foster an inclusive workplace culture that encourages meaningful connection, collaboration, and innovation. Everyone is invited to ask questions, actively seek new ways to achieve success, and reach their full potential. We are continually looking for ways to improve our products and make things better. This means spotting opportunities, solving ambiguities, and seeking effective solutions to the problems our customers care about.
Overview
LivePerson is experiencing rapid growth, and we’re evolving our database infrastructure to scale faster than ever. We are building a team dedicated to optimizing data storage, accessibility, and performance across our applications. As a Senior Database Engineer, you will be a key contributor, driving innovation in cloud database solutions and automation.
You Will
Partner with cross-functional teams to define database requirements and architectural strategies. Design, implement, and maintain highly scalable, on-prem and cloud-based database systems on Google Cloud Platform (GCP). Develop automation solutions using Terraform, Ansible, and Python to streamline database provisioning and management. Ensure robust version control of infrastructure configurations for seamless deployments. Monitor, troubleshoot, and optimize database performance, addressing bottlenecks proactively. Establish and enforce backup, recovery, and disaster recovery protocols to protect data integrity. Collaborate with security teams to implement compliance and data protection measures. Lead incident resolution, analyzing root causes and driving long-term solutions. Stay ahead of industry trends in DevOps, cloud computing, and database technologies. Participate in on-call rotations, ensuring 24x7 support for mission-critical systems.
You Have
8+ years of experience managing large-scale production database systems handling terabytes of data. Expertise in MySQL administration & replication. Experience with any one of Elasticsearch, Kafka, Hadoop, and Vertica is a plus. Strong background in Google Cloud Platform (GCP) or AWS database deployments. Proficiency in Infrastructure as Code (IaC) using Terraform & Ansible. Skilled in Python & Bash scripting for automation. Hands-on experience with Liquibase or Flyway for database automation. Knowledge of monitoring tools like Prometheus, Grafana, PMM (Percona Monitoring and Management) and the ELK stack (Elasticsearch, Kibana & Logstash). Strong problem-solving skills with a proactive approach to troubleshooting complex issues. Solid foundation in database architecture, optimization, and CI/CD concepts. Excellent collaboration & communication skills in a dynamic team environment. Highly accountable with a results-driven mindset. Able to create documentation and work on changes, incidents, and Jira tickets. Relevant certifications (AWS, GCP) are a plus.
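A minimal sketch, assuming a MySQL replica at a hypothetical host, of the kind of Python automation and monitoring this role describes; the host, credentials, threshold, and alerting path are invented for illustration and are not LivePerson infrastructure.

```python
# Illustrative replication-lag check for a MySQL replica; all connection
# details and the alert threshold are assumptions.
import pymysql

LAG_THRESHOLD_SECONDS = 60

def replica_lag(host: str, user: str, password: str) -> int | None:
    """Return the replica's reported lag in seconds, or None if not replicating."""
    conn = pymysql.connect(host=host, user=user, password=password,
                           cursorclass=pymysql.cursors.DictCursor)
    try:
        with conn.cursor() as cur:
            cur.execute("SHOW REPLICA STATUS")  # "SHOW SLAVE STATUS" on older MySQL
            row = cur.fetchone()
            return None if row is None else row.get("Seconds_Behind_Source")
    finally:
        conn.close()

if __name__ == "__main__":
    lag = replica_lag("replica-1.internal", "monitor", "secret")
    if lag is None or lag > LAG_THRESHOLD_SECONDS:
        print(f"ALERT: replica lag is {lag}")  # in practice, page via the on-call tooling
    else:
        print(f"OK: replica lag is {lag}s")
```

In a real setup a check like this would typically be exported to Prometheus or PMM rather than printed, but the shape of the automation is the same.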
Benefits
Health: Medical, Dental and Vision. Time away: Vacation and holidays. Equal opportunity employer.
Why You’ll Love Working Here
As leaders in enterprise customer conversations, we celebrate diversity, empowering our team to forge impactful conversations globally. LivePerson is a place where uniqueness is embraced, growth is constant, and everyone is empowered to create their own success. And, we're very proud to have earned recognition from Fast Company, Newsweek, and BuiltIn for being a top innovative, beloved, and remote-friendly workplace.
Belonging At LivePerson
We are proud to be an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to age, ancestry, color, family or medical care leave, gender identity or expression, genetic information, marital status, medical condition, national origin, physical or mental disability, protected veteran status, race, religion, sex (including pregnancy), sexual orientation, or any other characteristic protected by applicable laws, regulations and ordinances. We also consider qualified applicants with criminal histories, consistent with applicable federal, state, and local law. We are committed to the accessibility needs of applicants and employees. We provide reasonable accommodations to job applicants with physical or mental disabilities. Applicants with a disability who require reasonable accommodation for any part of the application or hiring process should inform their recruiting contact upon initial connection.
The talent acquisition team at LivePerson has recently been notified of a phishing scam targeting candidates applying for our open roles. Scammers have been posing as hiring managers and recruiters in an effort to access candidates' personal and financial information. This phishing scam is not isolated to LivePerson and has been documented in news articles and media outlets. Please note that any communication from our hiring teams at LivePerson regarding a job opportunity will only be made by a LivePerson employee with an @liveperson.com email address. LivePerson does not ask for personal or financial information as part of our interview process, including but not limited to your social security number, online account passwords, credit card numbers, passport information and other related banking information. If you have any questions and/or concerns, please feel free to contact recruiting-lp@liveperson.com.
Posted 1 month ago
3.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Company Overview
Domo's AI and Data Products Platform lets people channel AI and data into innovative uses that deliver a measurable impact. Anyone can use Domo to prepare, analyze, visualize, automate, and build data products that are amplified by AI.
Position Summary
Working as a member of Domo’s Client Services team, the Associate Technical Consultant will be focused on the implementation of fault-tolerant, highly scalable solutions. The successful candidate will have a minimum of 3 years working hands-on with data. This individual will join an enthusiastic, fast-paced and dynamic team at Domo. A successful candidate will have demonstrated sustained exceptional performance, innovation, creativity, insight, and good judgment.
Key Responsibilities
Partner with business users and technical teams to understand the data requirements and support solutions development; Assist in implementing best practices for data ingestion, transformation and semantic modelling; Aggregate, transform and prepare large data sets for use within Domo solutions; Ensure data quality and perform validation across pipelines and reports; Write Python scripts to automate governance processes; Ability to create workflows in Domo to automate business processes; Build custom Domo applications or custom bricks to support unique client use cases; Develop Agent Catalysts to deliver generative AI-powered insights within Domo, enabling intelligent data exploration, narrative generation, and proactive decision support through embedded AI features; Continuously learn and apply best practices to drive customer enablement and success; Support the documentation of data pipelines and the development of artifacts for long-term customer enablement.
Job Requirements
3+ years of experience supporting business intelligence systems in a BI or ETL Developer role; Expert SQL skills required; Expertise with Windows and Linux environments; Expertise with at least one of the following database technologies and familiarity with the others: relational, columnar and NoSQL (i.e. MySQL, Oracle, MSSQL, Vertica, MongoDB); Understanding of data modelling skills (i.e. conceptual, logical and physical model design - with both traditional 3rd normal form as well as dimensional modelling, such as star and snowflake); Experience dealing with large data sets; Goal oriented with strong attention to detail; Proven experience in effectively partnering with business teams to deliver their goals and outcomes; Bachelor's Degree in Information Systems, Statistics, Computer Science or related field preferred OR equivalent professional experience; Excellent problem-solving skills and creativity; Ability to think outside the box; Ability to learn and adapt quickly to varied requirements; Thrive in a fast-paced environment.
NICE TO HAVE
Experience working with APIs; Experience working with web technologies (JavaScript, HTML, CSS); Experience with scripting technologies (Java, Python, R, etc.); Experience working with Snowflake, Databricks or BigQuery is a plus; Experience defining scope and requirements for projects; Excellent oral and written communication skills, and comfort presenting to everyone from entry-level employees to senior vice presidents; Experience with statistical methodologies; Experience with a wide variety of business data (Marketing, Finance, Operations, etc.); Experience with Large ERP systems (SAP, Oracle JD Edwards, Microsoft Dynamics, NetSuite, etc.); Understanding of Data Science, Data Modelling and analytics.
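For illustration only, a small Python validation step of the sort implied by "ensure data quality and perform validation across pipelines"; the file name, required columns, and rules are hypothetical and are not Domo APIs.

```python
# A minimal data-validation sketch; column names and thresholds are assumed.
import pandas as pd

REQUIRED_COLUMNS = {"customer_id", "order_date", "amount"}

def validate_extract(path: str) -> pd.DataFrame:
    df = pd.read_csv(path, parse_dates=["order_date"])
    missing = REQUIRED_COLUMNS - set(df.columns)
    if missing:
        raise ValueError(f"Extract is missing columns: {sorted(missing)}")
    if df["customer_id"].isna().any():
        raise ValueError("Null customer_id values found; rejecting file")
    if (df["amount"] < 0).any():
        print("Warning: negative amounts present; flagging for review")
    return df

df = validate_extract("daily_orders.csv")
print(f"Validated {len(df)} rows for ingestion")
```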
LOCATION: Pune, Maharashtra, India
India Benefits & Perks
Medical insurance provided
Maternity and paternity leave policies
Baby bucks: a cash allowance to spend on anything for every newborn or child adopted
“Haute Mama”: cash allowance for maternity wardrobe benefit (only for women employees)
Annual leave of 18 days + 10 holidays + 12 sick leaves
Sodexo Meal Pass
Health and Wellness Benefit
One-time Technology Benefit: cash allowance towards the purchase of a tablet or smartwatch
Corporate National Pension Scheme
Employee Assistance Programme (EAP)
Marriage leaves up to 3 days
Bereavement leaves up to 5 days
Domo is an equal opportunity employer.
Posted 1 month ago
5.0 - 8.0 years
30 - 35 Lacs
Bengaluru
Work from Office
BE/B.Tech/MCA/MSc Computer Science only.
Detailed job description - Skill Set: 7+ years of experience with data analytics, data modeling, and database design. 5+ years of experience with Vertica. 2+ years of coding and scripting (Python/Java/Scala) and design experience. 2+ years of experience with Airflow. Experience with ELT methodologies and tools. Experience with GitHub. Expertise in tuning and troubleshooting SQL. Strong data integrity, analytical and multitasking skills. Excellent communication, problem-solving, organizational and analytical skills.
Mandatory Skills: SQL, Python and Vertica
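A minimal sketch of how the SQL + Python + Vertica + Airflow stack named in this posting might fit together, assuming the vertica-python client plus a hypothetical staging table and file path; it is not taken from the employer's codebase.

```python
# Illustrative Airflow DAG that bulk-loads daily files into a Vertica staging
# table. Connection details, table names, and paths are assumptions.
from datetime import datetime

import vertica_python
from airflow import DAG
from airflow.operators.python import PythonOperator

CONN_INFO = {"host": "vertica.internal", "port": 5433,
             "user": "etl_user", "password": "secret", "database": "analytics"}

def load_daily_events(**_):
    # COPY ... DIRECT is the usual bulk-load path into Vertica's columnar storage.
    with vertica_python.connect(**CONN_INFO) as conn:
        cur = conn.cursor()
        cur.execute(
            "COPY staging.events FROM LOCAL '/data/events/*.csv' "
            "DELIMITER ',' DIRECT ABORT ON ERROR"
        )
        conn.commit()

with DAG(
    dag_id="vertica_daily_load",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    PythonOperator(task_id="load_daily_events", python_callable=load_daily_events)
```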
Posted 1 month ago
0 years
4 - 7 Lacs
Gurgaon
On-site
A Software Engineer is curious and self-driven to build and maintain multi-terabyte operational marketing databases and integrate them with cloud technologies. Our databases typically house millions of individuals and billions of transactions and interact with various web services and cloud-based platforms. Once hired, the qualified candidate will be immersed in the development and maintenance of multiple database solutions to meet global client business objectives.
Job Description:
Key responsibilities: 2–4 years of experience. Will work under close supervision of Tech Leads/Lead Devs. Should be able to understand a detailed design with minimal explanation. Individual contributor: will be able to perform mid- to complex-level tasks with minimal supervision; senior team members will peer-review assigned tasks. Build and configure our Marketing Database/Data environment platform by integrating feeds as per the detailed design/transformation logic. Good knowledge of Unix scripting and/or Python. Must have strong knowledge of SQL. Good understanding of ETL (Talend, Informatica, Datastage, Ab Initio, etc.) as well as database skills (Oracle, SQL Server, Teradata, Vertica, Redshift, Snowflake, BigQuery, Azure DW, etc.). Fair understanding of relational databases, stored procedures, etc. Experience in cloud computing (one or more of AWS, Azure, GCP) will be a plus. Little supervision and guidance from senior resources will be required.
Location: DGS India - Gurugram - Golf View Corporate Towers
Brand: Merkle
Time Type: Full time
Contract Type: Permanent
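A hedged illustration of the feed-integration pattern described above, using SQLite from the standard library as a stand-in for the client warehouse (Oracle, Teradata, Snowflake, etc.); the feed layout, file name, and staging table are assumptions.

```python
# Sketch of loading a pipe-delimited marketing feed into a staging table.
import csv
import sqlite3

def load_feed(feed_path: str, db_path: str = "marketing.db") -> int:
    conn = sqlite3.connect(db_path)
    conn.execute("""CREATE TABLE IF NOT EXISTS stg_customer (
                        customer_id TEXT, email TEXT, signup_date TEXT)""")
    with open(feed_path, newline="") as fh:
        rows = [(r["customer_id"], r["email"], r["signup_date"])
                for r in csv.DictReader(fh, delimiter="|")]
    conn.executemany("INSERT INTO stg_customer VALUES (?, ?, ?)", rows)
    conn.commit()
    conn.close()
    return len(rows)

print(f"Loaded {load_feed('customer_feed.psv')} rows into staging")
```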
Posted 1 month ago
4.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Position: Database
Location: Noida, India
www.SEW.ai
Who We Are: SEW, with its innovative and industry-leading cloud platforms, delivers the best Digital Customer Experiences (CX) and Workforce Experiences (WX), powered by AI, ML, and IoT Analytics, to global energy, water, and gas providers. At SEW, the vision is to Engage, Empower, and Educate billions of people to save energy and water. We partner with businesses to deliver platforms that are easy to use, integrate seamlessly, and help build a strong technology foundation that allows them to become future-ready.
Searching for your dream job? We are a true global company that values building meaningful relationships and maintaining a passionate work environment while fostering innovation and creativity. At SEW, we firmly believe that each individual contributes to our success and, in return, we provide opportunities for them to learn new skills and build a rewarding professional career.
A Couple of Pointers: We are the fastest growing company with over 420+ clients and 1550+ employees. Our clientele is based out of the USA, Europe, Canada, Australia, Asia Pacific, and the Middle East. Our platforms engage millions of global users, and we keep adding millions every month. We have been awarded 150+ accolades to date, and our clients are continually awarded by industry analysts for implementing our award-winning product. We have been featured by Forbes, Wall Street Journal, and LA Times for our continuous innovation and excellence in the industry.
Who are we looking for? An ideal candidate who can demonstrate in-depth knowledge and understanding of RDBMS concepts and is experienced in writing complex queries and data integration processes in SQL/TSQL and NoSQL. This individual will be responsible for helping with the design, development and implementation of new and existing applications.
Roles and Responsibilities: Reviews the existing database design and data management procedures and provides recommendations for improvement. Responsible for providing subject matter expertise in the design of database schemas and performing data modeling (logical and physical models) for product feature enhancements as well as extending analytical capabilities. Develop technical documentation as needed. Architect, develop, validate and communicate Business Intelligence (BI) solutions like dashboards, reports, KPIs, instrumentation, and alert tools. Define data architecture requirements for cross-product integration within and across cloud-based platforms. Analyze, architect, develop, validate and support integrating data into the SEW platform from external data sources: files (XML, CSV, XLS, etc.), APIs (REST, SOAP), RDBMS. Perform thorough analysis of complex data and recommend actionable strategies. Effectively translate data modeling and BI requirements into the design process. Big Data platform design, i.e. tool selection, data integration, and data preparation for predictive modeling.
Required Skills: Minimum of 4-6 years of experience in data modeling (including conceptual, logical and physical data models). 2-3 years of experience in Extraction, Transformation and Loading (ETL) work using data migration tools like Talend, Informatica, Datastage, etc. 4-6 years of experience as a database developer in Oracle, MS SQL or another enterprise database with a focus on building data integration processes. Candidate should have exposure to some NoSQL technology, preferably MongoDB.
Experience in processing large data volumes, indicated by experience with Big Data platforms (Teradata, Netezza, Vertica or Cloudera, Hortonworks, SAP HANA, Cassandra, etc.). Understanding of data warehousing concepts and decision support systems. Ability to deal with sensitive and confidential material and adhere to worldwide data security requirements. Experience writing documentation for design and feature requirements. Experience developing data-intensive applications on cloud-based architectures and infrastructures such as AWS, Azure, etc. Excellent communication and collaboration skills.
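An illustrative sketch of the API-to-RDBMS integration responsibility described in this posting: pulling records from an external REST endpoint and shaping them for a relational staging table. The endpoint URL, parameters, and JSON field names are hypothetical.

```python
# Illustrative only: fetch meter readings from a hypothetical REST API and
# flatten them into rows for a relational readings table.
import requests

API_URL = "https://api.example-utility.com/v1/meter-readings"

def fetch_readings(account_id: str) -> list[tuple]:
    resp = requests.get(API_URL, params={"account": account_id}, timeout=30)
    resp.raise_for_status()
    payload = resp.json()
    # Flatten nested JSON into (meter_id, timestamp, consumption) rows.
    return [(r["meter_id"], r["timestamp"], r["consumption_kwh"])
            for r in payload.get("readings", [])]

rows = fetch_readings("ACC-1001")
print(f"Prepared {len(rows)} rows for the readings staging table")
```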
Posted 1 month ago
8.0 years
0 Lacs
Tamil Nadu, India
On-site
Job Title: Data Engineer
About VXI
VXI Global Solutions is a BPO leader in customer service, customer experience, and digital solutions. Founded in 1998, the company has 40,000 employees in more than 40 locations in North America, Asia, Europe, and the Caribbean. We deliver omnichannel and multilingual support, software development, quality assurance, CX advisory, and automation & process excellence to the world’s most respected brands. VXI is one of the fastest growing, privately held business services organizations in the United States and the Philippines, and one of the few US-based customer care organizations in China. VXI is also backed by private equity investor Bain Capital. Our initial partnership ran from 2012 to 2016 and was the beginning of prosperous times for the company. During this period, not only did VXI expand our footprint in the US and Philippines, but we also gained ground in the Chinese and Central American markets. We also acquired Symbio, expanding our global technology services offering and enhancing our competitive position. In 2022, Bain Capital re-invested in the organization after completing a buy-out from Carlyle. This is a rare occurrence in the private equity space and shows the level of performance VXI delivers for our clients, employees, and shareholders. With this recent investment, VXI has started on a transformation to radically improve the CX experience through an industry-leading generative AI product portfolio that spans hiring, training, customer contact, and feedback.
Job Description:
We are seeking talented and motivated Data Engineers to join our dynamic team and contribute to our mission of harnessing the power of data to drive growth and success. As a Data Engineer at VXI Global Solutions, you will play a critical role in designing, implementing, and maintaining our data infrastructure to support our customer experience and management initiatives. You will collaborate with cross-functional teams to understand business requirements, architect scalable data solutions, and ensure data quality and integrity. This is an exciting opportunity to work with cutting-edge technologies and shape the future of data-driven decision-making at VXI Global Solutions.
Responsibilities:
Design, develop, and maintain scalable data pipelines and ETL processes to ingest, transform, and store data from various sources. Collaborate with business stakeholders to understand data requirements and translate them into technical solutions. Implement data models and schemas to support analytics, reporting, and machine learning initiatives. Optimize data processing and storage solutions for performance, scalability, and cost-effectiveness. Ensure data quality and integrity by implementing data validation, monitoring, and error handling mechanisms. Collaborate with data analysts and data scientists to provide them with clean, reliable, and accessible data for analysis and modeling. Stay current with emerging technologies and best practices in data engineering and recommend innovative solutions to enhance our data capabilities.
Requirements:
Bachelor's degree in Computer Science, Engineering, or a related field. Proven 8+ years' experience as a data engineer or similar role. Proficiency in SQL, Python, and/or other programming languages for data processing and manipulation.
Experience with relational and NoSQL databases (e.g., SQL Server, MySQL, Postgres, Cassandra, DynamoDB, MongoDB, Oracle), data warehousing (e.g., Vertica, Teradata, Oracle Exadata, SAP HANA), and data modeling concepts. Strong understanding of distributed computing frameworks (e.g., Apache Spark, Apache Flink, Apache Storm) and cloud-based data platforms (e.g., AWS Redshift, Azure, Google BigQuery, Snowflake). Familiarity with data visualization tools (e.g., Tableau, Power BI, Looker, Apache Superset) and data pipeline tools (e.g., Airflow, Kafka, Dataflow, Cloud Data Fusion, Airbyte, Informatica, Talend) is a plus. Understanding of data and query optimization, query profiling, and query performance monitoring tools and techniques. Solid understanding of ETL/ELT processes, data validation, and data security best practices. Experience with version control systems (Git) and CI/CD pipelines. Excellent problem-solving skills and attention to detail. Strong communication and collaboration skills to work effectively with cross-functional teams.
Join VXI Global Solutions and be part of a dynamic team dedicated to driving innovation and delivering exceptional customer experiences. Apply now to embark on a rewarding career in data engineering with us!
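A minimal PySpark sketch of the ingest-transform-validate loop this posting outlines; the bucket paths, column names, and quality rule are assumptions rather than VXI specifics.

```python
# Illustrative PySpark job: deduplicate raw interaction events, apply a simple
# data-quality gate, and publish a partitioned curated dataset.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("daily_interactions_etl").getOrCreate()

raw = spark.read.json("s3a://raw-bucket/interactions/2024-06-01/")

cleaned = (
    raw.dropDuplicates(["interaction_id"])
       .filter(F.col("customer_id").isNotNull())
       .withColumn("interaction_date", F.to_date("started_at"))
)

# Fail fast before publishing to the warehouse layer.
bad_rows = cleaned.filter(F.col("interaction_date").isNull()).count()
if bad_rows > 0:
    raise ValueError(f"{bad_rows} rows have unparseable timestamps")

cleaned.write.mode("overwrite").partitionBy("interaction_date") \
       .parquet("s3a://curated-bucket/interactions/")
```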
Posted 1 month ago
10.0 years
0 Lacs
Pune, Maharashtra, India
On-site
About Us
Lemma Technologies is a software start-up company based in Baner, Pune. We are unleashing the power of programmatic AdTech to the DOOH (Digital Out Of Home) world. Our Mission is to Transform Digital Out Of Home media to connect Brands with their Consumer by Establishing Authentic and Transparent Standards. Innovation is our DNA and Transparency is our RNA. We are Revolutionising the DOOH industry. As an organisation, we successfully deliver brand stories seamlessly across all large-format digital screens, from DOOH to CTV and even on mobile and desktop devices. We are focussed on connecting DOOH media to mainstream digital, enabling brands to deploy omni-digital strategies through our platform.
Roles & Responsibilities
Chief Data Scientist/Architect of Lemma Technologies. This role will be responsible for defining and executing the technical strategy for adoption of modern AI/ML practices to acquire and process data and provide actionable insights to Lemma customers. Good understanding of the entire journey of data acquisition, data warehousing, information architecture, dashboards, reports, predictive insights, and adoption of AI/ML and NLP to provide innovative data-oriented insights for Lemma customers. Deep understanding of data science and technology, able to recommend adoption of the right technical tools and strategies. Expected to be a hands-on technical expert who will build and guide a technical data team. Build, design and implement our highly scalable, fault-tolerant, highly available big data platform to process terabytes of data and provide customers with in-depth analytics. Deep data science and AI/ML hands-on experience to give actionable insights to advertisers/customers of Lemma. Good overview of the modern technology stack such as Spark, Hadoop, Kafka, HBase, Hive, Presto, etc. Automate high-volume data collection and processing to provide real-time data analytics. Customize Lemma's reporting and analytics platform based on customer requirements and deliver scalable, production-ready solutions. Lead multiple projects to develop features for the data processing and reporting platform; collaborate with product managers, cross-functional teams and other stakeholders and ensure successful delivery of projects. Leverage a broad range of Lemma's data architecture strategies and propose both data flows and storage solutions. Manage Hadoop MapReduce and Spark jobs and solve any ongoing issues with operating the cluster. Work closely with cross-functional teams on improving availability and scalability of the large data platform and the functionality of Lemma software. Participate in Agile/Scrum processes such as sprint planning, sprint retrospective, backlog grooming, user story management, work item prioritization, etc.
Skills Required
10 to 12+ years of proven experience in designing, implementing, and delivering complex, scalable, and resilient platforms and services. Experience in building AI, machine learning, and data analytics solutions. Experience in OLAP (Snowflake, Vertica or similar) would be an added advantage. Ability to understand vague business problems and convert them into working solutions. Excellent spoken and written interpersonal skills with a collaborative approach. Dedication to developing high-quality software and products. Curiosity to explore and understand data is a strong plus. Deep understanding of Big Data and distributed systems (MapReduce, Spark, Hive, Kafka, Oozie, Airflow). (ref:hirist.tech)
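A hedged sketch of real-time processing on a DOOH event stream, using the kafka-python client; the topic name, broker address, and message schema are invented for illustration and deliberately simplified relative to a production Spark/Kafka pipeline.

```python
# Illustrative impression counter over a hypothetical Kafka topic of DOOH events.
import json
from collections import Counter

from kafka import KafkaConsumer  # kafka-python client

consumer = KafkaConsumer(
    "dooh-impressions",
    bootstrap_servers="kafka.internal:9092",
    value_deserializer=lambda raw: json.loads(raw.decode("utf-8")),
    auto_offset_reset="latest",
)

impressions_by_screen = Counter()
for message in consumer:
    event = message.value
    impressions_by_screen[event["screen_id"]] += event.get("impressions", 1)
    if sum(impressions_by_screen.values()) % 10_000 == 0:
        # In production this would flush to an OLAP store (e.g. a Vertica table).
        print(dict(impressions_by_screen.most_common(5)))
```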
Posted 1 month ago
3.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
OPENTEXT
OpenText is a global leader in information management, where innovation, creativity, and collaboration are the key components of our corporate culture. As a member of our team, you will have the opportunity to partner with the most highly regarded companies in the world, tackle complex issues, and contribute to projects that shape the future of digital transformation.
Your Impact
Sr. Technical Support Specialists are responsible for providing exceptional technical support on OpenText products. As a Senior Technical Support Specialist, you will reproduce, troubleshoot, and resolve customer issues. You’ll identify defects and escalate to OpenText Product Engineering, and test software patches for customers. You will be recognized by your peers as an expert in your chosen product area. This position offers you an opportunity to learn exciting technologies and exercise critical and creative thinking. Our strong team-based environment ensures that our team members support each other to deliver excellent Customer Experience.
What The Role Offers
3+ years’ experience in a technical support environment. Flexible to provide on-call / outside business hours support as, and when, needed. A Science/Technology Engineering or bachelor’s degree preferred. Strong analytical and critical thinking skills. Strong verbal and written communication skills. Proven experience working in a fluid environment that is ever growing and changing. Ability to multi-task and prioritize work effectively. Strong attention to detail and the ability to grasp concepts quickly with a thirst for knowledge.
What You Need To Succeed
Hands-on experience troubleshooting Windows/Linux operating systems. Strong troubleshooting skills, diagnostic analysis using traces, dumps and other tools, and hypothesis formulation and testing. Database knowledge: PostgreSQL, Oracle, MS SQL, Vertica. Network and security protocols like TCP/IP, HTTP, TLS/SSL, REST API, SOAP and SAML. Virtualization skills: VMware, Hyper-V. Experience with cloud technologies: AWS, Azure or Google Cloud. Good scripting knowledge: Perl, Python, Shell. Must be familiar with HA and DR setups. Familiarity with containerization tools like Docker or Kubernetes is a plus. Experience in Web Services/JavaScript. Identity Management, Access Management, Data Security, Application Security and SIEM applications.
OpenText's efforts to build an inclusive work environment go beyond simply complying with applicable laws. Our Employment Equity and Diversity Policy provides direction on maintaining a working environment that is inclusive of everyone, regardless of culture, national origin, race, color, gender, gender identification, sexual orientation, family status, age, veteran status, disability, religion, or other basis protected by applicable laws. If you need assistance and/or a reasonable accommodation due to a disability during the application or recruiting process, please contact us at hr@opentext.com. Our proactive approach fosters collaboration, innovation, and personal growth, enriching OpenText's vibrant workplace.
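A small, hypothetical first-pass diagnostic script of the kind this support role might run when triaging connectivity issues; the host names, ports, and health URL are assumptions, not OpenText infrastructure.

```python
# Illustrative connectivity triage: check an app health endpoint and database ports.
import socket
import requests

CHECKS = {
    "app health endpoint": ("https://app.internal/health", None),
    "PostgreSQL port":     ("db.internal", 5432),
    "Vertica port":        ("vertica.internal", 5433),
}

def tcp_reachable(host: str, port: int, timeout: float = 3.0) -> bool:
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

for name, (target, port) in CHECKS.items():
    if port is None:
        try:
            ok = requests.get(target, timeout=5).status_code == 200
        except requests.RequestException:
            ok = False
    else:
        ok = tcp_reachable(target, port)
    print(f"{name}: {'OK' if ok else 'FAILED'}")
```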
Posted 1 month ago