Job Summary:
We are looking for a detail-oriented Infrastructure Administrator with proven expertise in managing enterprise IT tools and cloud platforms. You will be responsible for setting up, configuring, and maintaining key systems such as Tableau, AWS resources, SAP BO, Collibra, SAS, and Alteryx.

Key Responsibilities:
- Manage administration of Tableau Server, SAP BO, Alteryx Server, and AWS services.
- Handle access provisioning, license management, and user governance for Collibra and PowerDesigner.
- Oversee system stability, performance optimization, and backup strategies.
- Monitor tool usage, patch versions, and ensure software compliance.
- Collaborate with IT and Data Governance teams on secure infrastructure practices.

Must-Have Skills:
- Experience administering Tableau Server, including user permissions and dashboards
- Proficiency in managing AWS resources (IAM roles, EC2, S3, monitoring tools)
- Strong working knowledge of Collibra and Information Steward for data governance
- Experience managing SAS, Alteryx Server, and other enterprise data tools

Nice-to-Have Skills:
- Familiarity with PowerDesigner, DataGrip, and metadata repository tools
- Knowledge of enterprise security policies, auditing, and patch management
- Experience in regulated sectors such as insurance and banking
We are hiring skilled Backend Developers to join our technology team supporting a top-tier client in the Retirement Pension Planning and Insurance domain. You'll work on large-scale enterprise data warehouse systems and develop robust, scalable data pipelines across real-time and batch environments.

Roles & Responsibilities:
- Design, develop, and maintain scalable backend data pipelines using AWS Glue, PySpark, Lambda, and Kinesis.
- Implement both batch and real-time data ingestion and transformation flows using Alteryx.
- Collaborate with solution architects, analysts, and business stakeholders on data modeling and integration.
- Optimize data workflow performance, storage, and processing across multiple datasets.
- Troubleshoot data pipeline issues, maintain documentation, and ensure adherence to best practices.
- Work in agile teams and participate in sprint planning and code reviews.

Technical Skills Required

Must-Have:
- 3+ years of experience with AWS Glue, PySpark, and AWS Lambda
- Hands-on experience with AWS Kinesis or Amazon MSK
- Proficiency in scripting with Python
- Experience working with data warehouses and ETL frameworks
- Knowledge of batch and real-time data processing with Alteryx

Good-to-Have:
- Understanding of data lake architectures and S3-based pipelines
- Familiarity with CI/CD tools for cloud deployment
- Basic knowledge of data governance tools or BI platforms (Tableau, Snowflake)
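To give a flavor of the real-time ingestion work described above, here is a minimal sketch of an AWS Lambda handler consuming a Kinesis trigger. The event shape (base64-encoded payloads under `record["kinesis"]["data"]`) is the standard Kinesis-to-Lambda event format; the payload field names are hypothetical, and a production handler would add error handling and write the results downstream rather than return them.

```python
import base64
import json

def handler(event, context):
    """Minimal Lambda handler for a Kinesis trigger.

    Each record's payload arrives base64-encoded under
    record["kinesis"]["data"]; we decode it, parse it as JSON,
    and collect the parsed events for downstream processing.
    """
    parsed = []
    for record in event.get("Records", []):
        raw = base64.b64decode(record["kinesis"]["data"])
        parsed.append(json.loads(raw))
    # A real pipeline would write these to S3, a warehouse
    # table, or another stream instead of returning them.
    return {"processed": len(parsed), "events": parsed}
```

Locally, the handler can be exercised by constructing a fake event with base64-encoded JSON payloads, which is also how unit tests for such handlers are commonly written.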
Job Summary:
We are looking for a skilled Support Engineer to manage and troubleshoot AWS-based data systems supporting both frontend and backend applications. You will play a vital role in ensuring operational uptime, system availability, and incident resolution for enterprise-grade platforms.

Key Responsibilities:
- Monitor AWS-hosted environments (S3, Lambda, RDS, EC2) for performance and health.
- Provide L2/L3 production support for real-time and batch data workflows.
- Perform root cause analysis and incident reporting.
- Create monitoring scripts, automate routine checks, and participate in release deployments.
- Collaborate with development, DevOps, and cloud infrastructure teams.

Must-Have Skills:
- Proficiency in AWS infrastructure monitoring (CloudWatch, CloudTrail, Lambda logs)
- Strong knowledge of CI/CD pipelines (Jenkins, GitHub Actions, CodePipeline)
- Experience writing automation scripts in Python, Shell, or PowerShell
- Prior work supporting enterprise data warehouse environments

Nice-to-Have Skills:
- Familiarity with Terraform or CloudFormation for infrastructure as code
- Experience working under ITIL or similar service management frameworks
- Basic knowledge of frontend monitoring tools such as Datadog or New Relic
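A common building block of the "monitoring scripts and routine checks" mentioned above is a retry wrapper with exponential backoff around a health check. The sketch below is illustrative only: the actual check (an HTTP ping, a CloudWatch query, a log scan) is stubbed out with a hypothetical `flaky_check` that fails twice before succeeding.

```python
import time

def retry(fn, attempts=3, base_delay=0.0):
    """Run fn, retrying on exception with exponential backoff.

    Returns fn()'s result on the first success; re-raises the
    last exception if every attempt fails.
    """
    for i in range(attempts):
        try:
            return fn()
        except Exception:
            if i == attempts - 1:
                raise
            time.sleep(base_delay * (2 ** i))

# Hypothetical health check that fails twice, then recovers.
calls = {"n": 0}

def flaky_check():
    calls["n"] += 1
    if calls["n"] < 3:
        raise RuntimeError("endpoint not ready")
    return "healthy"

result = retry(flaky_check, attempts=3)  # succeeds on the third attempt
```

In a real runbook script, `base_delay` would be non-zero and the final failure would page an on-call engineer rather than raise uncaught.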
About the Role
We are seeking a Data Scientist with strong expertise in Python and SQL to join our growing data team. You'll work on data modeling, predictive analytics, and business intelligence projects that directly impact strategic decision-making.

Key Responsibilities
- Extract, clean, and transform data from multiple sources using SQL and Python.
- Build predictive models and machine learning algorithms for business use cases.
- Perform exploratory data analysis (EDA) and generate actionable insights.
- Create interactive dashboards and reports (Power BI/Tableau).
- Optimize and maintain data pipelines, ETL processes, and workflows.
- Collaborate with cross-functional teams to translate business needs into analytical solutions.
- Present analytical results in a clear, visual, and business-friendly format.

Required Skills
- 1–5 years of experience in Data Science or Data Analytics roles.
- Strong proficiency in Python (NumPy, Pandas, Scikit-learn, Matplotlib, Seaborn).
- Advanced SQL skills (complex queries, joins, window functions, optimization).
- Understanding of machine learning algorithms and statistical methods.
- Experience with data visualization tools (Power BI, Tableau, or similar).
- Strong problem-solving and analytical skills.
- Bachelor's/Master's in Computer Science, Data Science, Statistics, Mathematics, or a related field.

Nice to Have
- Familiarity with cloud platforms (AWS, Azure, GCP).
- Exposure to Big Data tools (Spark, Hadoop).
- Basic understanding of APIs and automation scripts.

What We Offer
- Competitive industry-standard salary plus performance incentives.
- Flexible working hours and hybrid/remote opportunities.
- Exposure to real-world, high-impact projects.
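The "window functions" skill listed above can be illustrated with a small, self-contained example using Python's built-in `sqlite3` module (SQLite supports window functions from version 3.25, bundled with modern Python). The table and values are made up for the sketch; the same `SUM(...) OVER (PARTITION BY ... ORDER BY ...)` pattern applies in warehouse SQL dialects.

```python
import sqlite3

# In-memory table of hypothetical daily sales per region.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, day INTEGER, amount REAL)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?, ?)",
    [("east", 1, 100.0), ("east", 2, 200.0), ("east", 3, 300.0),
     ("west", 1, 50.0), ("west", 2, 150.0)],
)

# Window function: running total per region, ordered by day.
rows = conn.execute(
    """
    SELECT region, day, amount,
           SUM(amount) OVER (
               PARTITION BY region ORDER BY day
           ) AS running_total
    FROM sales
    ORDER BY region, day
    """
).fetchall()
```

Each output row carries its own running total, so the east region accumulates 100, 300, 600 across days 1–3 while the west partition restarts at 50.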