Greater Boston Area
We are expanding our consulting team and are looking for our first US-based Implementation Engineer. You’ll be part of a growing American HQ and will work with solution architects and delivery leads to understand customers’ existing and emerging data privacy needs and to design solutions.
Privitar is a rapidly growing VC-backed company based in London, building software to enable the safe and ethical use of valuable data for analytics and machine learning. We work with large organisations worldwide in financial services, telecommunications, pharma and government, enabling them to get the most out of data without compromising on privacy and security.
Privitar is pioneering the new enterprise software category of Privacy Engineering to serve this emerging business need and address a social issue of growing importance. Our technology enables organisations to safely analyse and mine sensitive datasets while protecting an individual’s privacy.
- Provide expert knowledge of Privitar products covering all aspects of installation, configuration, maintenance and operation
- Provide expertise in data privacy definition and risk mitigation
- Deliver solutions to customers’ data privacy issues using Privitar products
- Drive definition of customer requirements: size, scope, risk issues, exposure impact, data volumes, anticipated masking complexity, performance goals and the existing operational Hadoop or Data Flow environment
- Define and execute test strategies and acceptance criteria
- Collaborate with UK teams to leverage and develop existing product and implementation architecture knowledge base, best practices and defined implementation approaches
- Be comfortable working on multiple implementation projects in parallel
- Be the primary technical contact during projects in order to address customer issues efficiently and quickly
- Identify opportunities to add value to existing customers beyond current scope
- Provide feedback to other Privitar teams - Engineering, Support, Sales - as needed in order to improve the overall customer experience
- Build strong, enduring and profitable relationships with customers
- Expect travel to customer sites within North America, and possibly further afield
- Bachelor’s degree in Computer Science or another science or engineering discipline
- Proven track record of delivering sophisticated product integrations in customer enterprise environments, leveraging components such as Hadoop, Data Flow, enterprise security modules, RDBMS, workflow automation, ETL tools and other related technologies
- Experience in gathering, reviewing and validating business and data requirements
- Strong Linux command line skills
- Experience with scripting languages (e.g. shell, python, perl)
- Experience with database schemas and SQL
- Proven ability to deliver results under pressure with rapidly evolving propositions, client demands and business needs
- You care deeply about customer success
- You enjoy the variety and fast pace of a dynamic start-up; you’re flexible in your approach and comfortable with ambiguity
- You have a good sense of humour and think work should be fun as well as intellectually satisfying
- Operational experience of customer Hadoop deployments (Hortonworks, Cloudera) including primary operational components and tools (YARN, Spark, HDFS, Kerberos, KMS, Impala/Hive, Ambari/HUE, etc)
- Experience in HDFS usage and troubleshooting
- Experience integrating with customer RDBMS infrastructures
- Familiarity with Hadoop and Linux security infrastructure and Kerberos specifically
- Experience of common Data Flow / Streaming environments and technologies (e.g. Apache NiFi, Kafka, Confluent, StreamSets)
- Experience of workflow automation and Hadoop orchestration tools (Azkaban, Control-M, Oozie)
- Familiarity with configuration, maintenance and usage of Apache Tomcat and web applications
- Experience with performance tuning of Hadoop clusters (Spark an advantage)
- Experience working with typical Hadoop file formats (CSV, Avro, Parquet, ORC, SequenceFile) and compression codecs (Gzip, Bzip2, LZO, Snappy, Deflate)
- Experience with Amazon AWS and other cloud platforms
- Broad knowledge of Hadoop and Linux security infrastructure
- Experience of integrating to LDAP-based directory services for authentication and authorisation
- Programming experience in Java, Python or similar
Please note that any offer will be subject to satisfactory completion of a background check.
Privitar does not accept unsolicited referrals or CVs from any source other than directly from candidates or approved agencies with written agreements in place and instructed on specified roles.
Unsolicited CVs received from any agency not engaged as outlined above will be considered a "free gift", and there will be no fees due should we choose to contact the candidate directly. Receipt of unsolicited CVs will in no way establish any prior claim to the candidate should they also be submitted by another agency. We consider this type of activity an attempt to lay claim to a given candidate and therefore entirely inappropriate. Any submission of unsolicited CVs to us will be deemed as full acceptance of these terms.
We only engage with agencies who are respectful of candidates, businesses and other agencies. We abide by our agreements with them and maintain genuine, straightforward and lasting relationships which generate the highest calibre candidates for our business.