C2FO
Remote-US

At C2FO, our mission is to deliver a future where every company around the world has the capital needed to grow.

C2FO is working to deliver a future where every company in the world has the capital it needs to grow. Our technology provides an easy, low-cost way for businesses of all sizes to increase cash flow by receiving early invoice payments. Since 2008, C2FO’s online marketplace and innovative financial products have accelerated payments by more than one billion days for companies in over 180 countries.

Named one of Forbes’ “Fintech 50,” C2FO provides more than $1 billion in working capital each week for hundreds of thousands of businesses. C2FO has more than 400 employees worldwide, with headquarters in Kansas City and locations throughout Europe, Asia Pacific and Australia. For more information, visit www.C2FO.com/.

Commitment to Diversity and Inclusion

Pollen, Inc. (C2FO) believes that unique backgrounds and individual voices strengthen our team, leading to the best ideas and discoveries for our innovative and growing company. At C2FO, we seek, encourage, and nurture diverse perspectives, and we welcome those of all backgrounds to help us change the way global businesses of all sizes gain access to working capital.

As an organization, we not only value diversity and equality but also cultivate teams that feel empowered to bring their authentic selves to work every day. We strive to create a workplace that reflects the communities we serve and our global, multicultural clients. We recognize the power of inclusion, emphasizing that each team member was chosen for their unique ability to contribute to the overall success of our mission.

Data is the lifeblood of any technology company. At C2FO, we know the best way to ensure our success is to provide our decision-makers with the most accurate, most relevant data possible.

The Data DevOps Engineer will work with our Data Engineering team to deploy cutting-edge, open-source technologies across multiple clouds to collect, process, and store the company’s data. This engineer will work to automate and increase the efficiency of the CI/CD pipelines used by the Data Engineering team and develop comprehensive testing frameworks for end-to-end testing of big data ETL pipelines. The Data DevOps Engineer will also work closely with our Data Science and Machine Learning teams to provide infrastructure and automation for the building, testing, and deployment of Machine Learning models.

Responsibilities

  • Work closely with Data Engineers, Data Scientists, and Business Analysts to provision appropriate cloud infrastructure to meet the company's data storage and processing needs.

  • Collaborate with the Data Engineering and DevOps teams to deploy and maintain clustered computing across multi-cloud environments.

  • Design and build large-scale, automated ingestion pipelines that integrate across multi-cloud platforms and disparate data sources.

  • Optimize existing testing and deployment processes to improve the reliability and efficiency of code deployed by Data Engineering and Machine Learning teams.

Qualifications

Qualities we're looking for

  • Values infrastructure as code

  • Values team collaboration and success

  • Continuously learns new tools, technologies, and techniques

  • Mentors others and learns from them, encouraging positive attributes and attitudes

  • Comfortable with ambiguity, with excellent written and verbal communication skills

  • Ability to troubleshoot across multiple applications and infrastructure layers

Preferred Skills

  • Experience writing and deploying code in a scripting language (preferably Python).

  • Experience deploying and maintaining technologies in the Hadoop/big data ecosystem, such as Apache Spark and Apache Kafka.

  • Experience with relational database administration (bonus points for MPP databases such as Amazon Redshift).

  • Experience with Linux/Unix operating systems.

  • Experience with containers and container orchestration.

  • Experience with cloud-hosted computing.

  • Experience using a version control system and maintaining CI/CD pipelines.

Requirements

  • Bachelor’s degree in Computer Science or a related field.

  • 3 years of experience in the job offered or in a related position.

  • Applicants must have the legal authority to work in the United States.

Apply Now
When you apply, please mention that you found the posting on Diversify Tech!

