The Salesforce ETL Developer manages end-to-end data pipelines that import monthly employer payroll files into Salesforce, and supports billing and invoicing, data migrations, integrations, data warehouse ETL, and BI reporting tools.
The data is primarily hosted on Salesforce and AWS Aurora.
Responsibilities
Manage the end-to-end pipeline, including:
• Meet with clients / payroll employers to define data pipeline strategies
• Enforce “principle of least privilege” security policy on all infrastructure
• Manage FTP/Portal file upload environments
• Parse CSV and fixed-width files
• Map flat-file data to relational database schemas
• Build custom, configurable batch processing
• Develop declarative and programmatic transform rules
• Implement data cleansing and quality management
• Enrich data through various API integrations
• Enforce data backup, retention, and recovery policies
• Follow an iterative, incremental continuous integration lifecycle
Requirements
2+ years of experience with the following:
• SQL / SOQL query languages
• DML (data manipulation language)
• Apex or Java object-oriented programming
• Linux command line interface (Bash shell and SSH)
• Message Queues
Familiarity with any of the following preferred:
• Salesforce.com
• Amazon Web Services (AWS)
• AWS Aurora / MySQL RDS
• Star schema data modeling
• AWS Lambda functions / Step functions
• Mulesoft
• Tableau
SOS celebrates diversity and is eager to talk to all qualified candidates.
Send cover letter and resume to: sosjobs@googlegroups.com
U.S. based candidates only please.