Latest Jobs at Sama


Enterprise System Administrator and Developer

Key Responsibilities

  • Serve as the Workforce Management system administrator
  • Configure the system and implement custom modules, extensions, and capabilities on the Workforce Management solution
  • Create and maintain system documentation and technical process flows
  • Own data lifecycle management for the Workforce Management tool and the overall data architecture of the Workforce Management system
  • Oversee Workforce Management data governance, which includes:
    • Data standards
    • Data audit
    • Data pipelines
    • API connections and integrations
    • User access rights
    • Reporting and analytics
  • Collaborate with IT on defining the capabilities and driving the implementation of the required Workforce Management infrastructure
  • Ensure optimal and consistent availability and data quality of the WFM tool.
  • Be the go-to person for integrations between the Workforce Management tool and other enterprise tools.
  • Liaise with vendor teams on escalations and other situations that require technical support from the vendor.
  • Serve as the point of escalation for system gaps, access problems, and other issues affecting Global Service Delivery functions.
  • Liaise with the Director of Information Security and other Information Security stakeholders to ensure proper security setup for the WFM platform.

Minimum Technical Qualifications

  • 3+ years of overall IT or programming experience
  • Bachelor’s Degree in Computer Science, Information Technology, or a relevant field of study
  • Technical expertise in system administration, configuration, and custom feature development
  • Experience in working with Enterprise SaaS platforms (Zoho preferred)
  • Expertise in configuring and customizing PaaS platforms (Zoho Creator+Deluge preferred)
  • Experience working with relational and non-relational databases; experience with BigQuery, AWS, and PostgreSQL is an added advantage
  • Working knowledge of data pipelining tools such as Hevo, Stitch, and Pentaho
  • Experience with transformation tools such as Dataform and dbt (data build tool)
  • Experience with object-oriented and functional scripting languages: Python, JavaScript, Java
  • Experience with CI/CD processes and source control tools such as GitHub
  • A successful history of manipulating, processing, and extracting value from large, disconnected datasets

Preferred Qualifications

  • Outstanding communication and collaboration skills.
  • Ability to stay self-motivated and work with little or no supervision.
  • Excellent time management and organizational abilities.

Data Engineer

Key Responsibilities: 

  • Create and maintain optimal data pipeline architectures that serve key business stakeholders
  • Assemble large, complex data sets that meet business requirements for different stakeholders and teams.
  • Build and maintain the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources.
  • Develop and maintain a data catalog of data sets, scripts, tools and pipelines as part of documentation.
  • Work with stakeholders to identify their data needs and provide consistent data availability and quality to meet those needs.
  • Work with business analytics teams to build ETL pipelines that serve various areas of the business.
  • Identify any bottlenecks or challenges in the current data pipelining approaches and suggest areas of improvement.
  • Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
  • Build analytics tools that utilize the data pipeline to provide actionable insights on key business metrics
  • Maintain day-to-day relationships with stakeholders to understand their data needs and communicate results clearly.

Minimum Qualifications:

  • Advanced working knowledge of SQL.
  • Experience with Google Cloud Platform and its services.
  • Experience working with relational and non-relational databases such as BigQuery, AWS, and PostgreSQL.
  • Working knowledge of data pipelining tools such as Hevo.
  • Experience with transformation tools such as Dataform and dbt (data build tool).
  • Experience with object-oriented and functional scripting languages: Python, JavaScript, Java, C++.
  • Experience working on CI/CD processes and source control tools such as GitHub.
  • A successful history of manipulating, processing and extracting value from large disconnected datasets.

Preferred Qualifications:

  • Outstanding communication and collaboration skills.
  • Ability to stay self-motivated and work with little or no supervision.
  • Excellent time management and organizational abilities.

APPLY HERE