This page lists the projects and tools I've built.
NameSweep is a Python-based OSINT tool I built to search the web for information tied to a specific name or phrase. It aggregates results from multiple search engines and compiles them into a structured dataset for analysis. The system is designed to run efficiently, providing live progress updates and clean JSON output. I later expanded the project into an AI‑enhanced version that doesn’t just collect URLs but analyzes full webpage text for meaningful hits. This upgraded pipeline uses lightweight local AI models to classify relevance and filter out noise. It automatically summarizes useful pages, extracts key entities, and organizes findings into clear categories. The goal was to create a scalable, privacy‑friendly research assistant that runs entirely on my own machine. The project emphasizes modular design, fast iteration, and practical OSINT workflows. It also reflects my broader interest in automation, information retrieval, and intelligent data processing.
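The core aggregation step can be sketched as follows. This is a minimal illustration, not NameSweep's actual code: the engine names, the `Hit` record, and the input shape are all hypothetical stand-ins for the real per-engine fetchers, showing only how results from multiple engines might be merged, deduplicated by URL, and emitted as structured JSON.

```python
import json
from dataclasses import dataclass, asdict

@dataclass
class Hit:
    engine: str   # which search engine first returned this URL
    url: str
    title: str

def aggregate(results_per_engine):
    """Merge per-engine result lists into one dataset, dropping duplicate URLs."""
    seen = set()
    merged = []
    for engine, hits in results_per_engine.items():
        for url, title in hits:
            if url not in seen:      # keep the first engine that found it
                seen.add(url)
                merged.append(Hit(engine=engine, url=url, title=title))
    return merged

# Hypothetical raw results from two engines, with one overlapping URL.
results = {
    "engine_a": [("https://example.com/a", "Result A")],
    "engine_b": [("https://example.com/a", "Result A (dup)"),
                 ("https://example.com/b", "Result B")],
}
dataset = aggregate(results)
print(json.dumps([asdict(h) for h in dataset], indent=2))
```

Keeping the merge step separate from fetching is what makes the later AI-enhanced pipeline easy to bolt on: the classifier and summarizer can consume the same structured records.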
I built an automated gameplay system for the game Ball X Pit that combines virtual controller input with scripted logic to achieve extremely deep runs. The project started as an experiment in simulating player movement, but evolved into a fully autonomous loop that handles movement, timing, and reactor‑skip interactions. I engineered a custom Python script that uses virtual gamepad signals to continuously traverse the map while triggering precise inputs at the right moments. The system is designed to run indefinitely without user intervention, maintaining stable behavior even over multi‑hour sessions. Through iterative tuning, I optimized the timing, deadzones, and input cadence to maximize depth progression. The automation proved highly effective, achieving a recorded depth of 24,542—far beyond any publicly documented run. This project demonstrates my ability to reverse‑engineer game mechanics, design reliable automation loops, and build tools that interact with real‑time systems. It also reflects my interest in experimentation, performance tuning, and pushing technical boundaries for fun. The end result is a robust, hands‑off gameplay engine capable of running the game far longer and more consistently than a human player.
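The shape of such an automation loop can be sketched like this. Everything here is illustrative: `FakePad` stands in for a real virtual-gamepad backend (the actual project sends controller signals to the game), and the period, deadzone, and skip interval are made-up values, not the tuned ones from the real script.

```python
DEADZONE = 0.15   # stick magnitudes below this are treated as no input

class FakePad:
    """Stand-in for a virtual gamepad; send() just records events."""
    def __init__(self):
        self.events = []
    def send(self, event):
        self.events.append(event)

def traversal_value(tick, period=20):
    """Square-wave stick value: hold right, then left, each half-period."""
    x = 1.0 if (tick // period) % 2 == 0 else -1.0
    return x if abs(x) > DEADZONE else 0.0

def run(pad, ticks, skip_every=50):
    """Main automation loop: continuous traversal plus timed button presses."""
    for t in range(ticks):
        pad.send(("stick_x", traversal_value(t)))
        if t % skip_every == 0:
            pad.send(("press", "A"))   # timed interaction, e.g. a skip input

pad = FakePad()
run(pad, ticks=100)
```

In a real session the loop body would also sleep between ticks to hold a steady input cadence; the tuning work described above amounts to picking that cadence, the traversal period, and the press timing so the run stays stable for hours.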
I built a geolocation lookup tool that automatically identifies properties within the Hays County Appraisal District (Hays CAD) using only GPS coordinates. The system takes raw latitude and longitude data and converts it into parcel information, ownership records, and property metadata. I designed the workflow to handle noisy or imprecise GPS inputs by snapping coordinates to the nearest valid parcel boundary. The backend integrates multiple public datasets and normalizes them into a consistent, queryable format. To keep the tool fast, I implemented spatial indexing and caching so repeated lookups resolve instantly. The project also includes a clean API layer that other scripts or devices can call, making it easy to integrate into field tools or automation pipelines. It was originally built to support location‑based tasks where manual address lookup would be too slow or error‑prone. The system proved highly reliable in real‑world testing, even in rural areas with inconsistent mapping data. This project highlights my ability to work with geospatial data, public records, and coordinate‑based search logic.
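The snapping-plus-caching idea can be illustrated with a toy version. The parcel IDs and centroids below are invented placeholders, not Hays CAD data, and the real system uses proper spatial indexing over parcel boundaries rather than this brute-force nearest-centroid search; the sketch only shows how a noisy fix resolves to a parcel and how repeated lookups hit a cache.

```python
import math
from functools import lru_cache

# Toy parcel table standing in for the real normalized dataset:
# parcel ID -> (lat, lon) centroid. Values are hypothetical.
PARCELS = {
    "R10001": (29.88, -97.94),
    "R10002": (29.90, -97.90),
}

@lru_cache(maxsize=4096)   # repeated lookups for the same fix resolve instantly
def snap_to_parcel(lat, lon):
    """Return the parcel whose centroid is nearest the (possibly noisy) fix."""
    def dist(pid):
        plat, plon = PARCELS[pid]
        return math.hypot(plat - lat, plon - lon)
    return min(PARCELS, key=dist)

print(snap_to_parcel(29.881, -97.939))  # snaps the imprecise fix to R10001
```

The same interface (coordinates in, parcel record out) is what the API layer exposes, so field tools never need to know how the snapping or caching works internally.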
I built a Python-based Canvas Checker that automatically retrieves my course data, grades, and missing assignments from the Canvas LMS API. The tool authenticates using a personal access token and pulls structured information for every enrolled course. It parses assignments, due dates, submission status, and grading details into a clean, human‑readable summary. I designed the system to run quickly and handle API rate limits gracefully, even when fetching large datasets. The script also highlights overdue or missing assignments so they’re impossible to overlook. To keep the workflow flexible, I built modular functions for courses, assignments, submissions, and grade calculations. The tool can run on-demand or be integrated into other automation pipelines, including wearable devices or dashboards. It was originally created to eliminate the need to manually check multiple Canvas pages every day. The project demonstrates my ability to work with REST APIs, authentication, and structured data processing.
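The access pattern looks roughly like this. The base URL is a placeholder and `get` is illustrative (the token stays elided); the `missing` and `workflow_state` fields checked below do exist on Canvas Submission objects, but the filtering logic here is a simplified sketch of the real script, not its exact code.

```python
import json
import urllib.request

BASE_URL = "https://canvas.example.edu/api/v1"   # hypothetical Canvas instance
TOKEN = "..."                                    # personal access token

def get(path):
    """GET a Canvas endpoint with bearer-token auth (not invoked in this sketch)."""
    req = urllib.request.Request(
        f"{BASE_URL}{path}",
        headers={"Authorization": f"Bearer {TOKEN}"})
    with urllib.request.urlopen(req, timeout=30) as resp:
        return json.load(resp)

def missing_assignments(submissions):
    """Flag submissions Canvas marks missing, or still unsubmitted."""
    return [s for s in submissions
            if s.get("missing") or s.get("workflow_state") == "unsubmitted"]

# Sample submission records in the shape the Canvas API returns.
sample = [
    {"assignment_id": 1, "missing": True},
    {"assignment_id": 2, "missing": False, "workflow_state": "submitted"},
]
flagged = missing_assignments(sample)
print(flagged)
```

Splitting the fetch layer from the filtering functions is what lets the same logic run on-demand, in a dashboard, or on a wearable: only the transport changes.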