CK.

Hi, I'm Calvin Kinateder.

Software Developer

I'm a computer science student and software engineer focused on building applications that are well designed from the inside out. Whatever I'm working on, I aim to serve the developer and the user equally. I received my BS and MS in Computer Science from the University of Cincinnati in 2025.

Experience

Siemens

Software Developer Co-op

Jan 2022 – Aug 2024

Researched cutting-edge technologies in the academic literature to evaluate their commercial applications

  • Python
  • PyTorch
  • TensorFlow
  • Docker
  • Bash

Skyward LTD

Computer Engineer Co-op

Jun 2021 – Aug 2021

Implemented real-time inference for anomaly detection on incoming aircraft ADS-B transmissions

  • C++
  • Golang
  • TensorFlow
  • TensorRT
  • Docker

CoverMyMeds

Software Development Intern

Jun 2019 – Aug 2019

Created monitoring applications to observe production and development environments

  • Python
  • HTML
  • JavaScript
  • Ruby on Rails
  • Agile

Projects

flower-dp

Fully functional boilerplate for differentially private federated learning

  • Python
  • PyTorch
  • TensorFlow
  • Docker

blackswan-mvp

Semi-HFT stock trading bot built around the FinnHub and Alpaca trading APIs

  • Python
  • scikit-learn
  • Docker

stegosaurus-midi

Embedded MIDI controller built on the RP2040 + Arduino platform for audio device manipulation

  • C++
  • JavaScript
  • WebMIDI
  • UNIX

Master's Thesis

A Novel Approach To Implementing Knowledge Distillation In Tsetlin Machines

Abstract

The Tsetlin Machine (TM) is a propositional-logic-based model that uses conjunctive clauses to learn patterns from data. As with typical neural networks, the performance of a Tsetlin Machine depends largely on its parameter count, with a larger number of parameters producing higher accuracy but slower execution. Knowledge distillation in neural networks transfers information from an already-trained teacher model to a smaller student model to increase accuracy in the student without increasing execution time. We propose a novel approach to implementing knowledge distillation in Tsetlin Machines by utilizing the probability distributions of each output sample in the teacher to provide additional context to the student. Additionally, we propose a novel clause-transfer algorithm that weighs the importance of each clause in the teacher and initializes the student with only the most essential data. We find that our algorithm can significantly improve performance in the student model without negatively impacting latency in the tested domains of image recognition and text classification.
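For context, the soft-target idea the abstract builds on comes from classic knowledge distillation in neural networks: the teacher's per-sample output distribution, softened with a temperature, supervises the student alongside the hard labels. Below is a minimal NumPy sketch of that standard formulation, not the thesis's TM-specific algorithm; all function names and the choice of temperature and alpha are illustrative.

```python
import numpy as np

def softmax(z, temperature=1.0):
    """Temperature-scaled softmax; higher T yields a softer distribution."""
    z = np.asarray(z, dtype=float) / temperature
    z = z - z.max(axis=-1, keepdims=True)  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, hard_labels,
                      temperature=2.0, alpha=0.5):
    """Blend hard-label cross-entropy with KL divergence toward the
    teacher's temperature-softened output distribution (Hinton-style KD)."""
    p_teacher = softmax(teacher_logits, temperature)
    p_student = softmax(student_logits, temperature)
    # KL(teacher || student) on the softened distributions
    kl = np.sum(p_teacher * (np.log(p_teacher + 1e-12)
                             - np.log(p_student + 1e-12)), axis=-1)
    # ordinary cross-entropy on the hard labels (temperature = 1)
    p_hard = softmax(student_logits)
    ce = -np.log(p_hard[np.arange(len(hard_labels)), hard_labels] + 1e-12)
    # T^2 rescaling keeps the soft-target gradient magnitude comparable
    return alpha * ce.mean() + (1 - alpha) * (temperature ** 2) * kl.mean()
```

A student whose outputs match the teacher's pays no KL penalty, so minimizing this loss pulls the student toward the teacher's full distribution rather than only its argmax, which is the extra "context" the abstract refers to.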

Skills

  • Python
  • C++
  • C
  • Java
  • JavaScript
  • HTML
  • CSS
  • Golang
  • Git
  • Bash
  • Linux
  • MySQL
  • MATLAB
  • PyTorch
  • CUDA
  • TensorRT
  • TensorFlow
  • API Development
  • Computer Vision
  • Docker
  • Algorithms
  • Data Structures
  • Federated Learning
  • Transformers
  • Differential Privacy
  • Circuitry
  • CSG
  • LLMs
  • Mathematical Logic

Contact

Email Me · Schedule a Meeting