Familiar with Figma and UI/UX wireframe design concepts
MERN Development
Next.js framework and templates for UI development
React.js for local state management, building reusable components, and integration w/ the Virtual DOM
React Router library for handling routing using <Link> & <Route>
Angular.js for Front-End Development
Familiar with Framer Motion, Three.js and Anime.js
Redux for global state management
Passport.js, JWT and bcrypt.js for authentication and security
TypeScript
API development and backend architecture w/ Node.js, Express & Mongoose
Version Control with Git
Scrum & Agile development; problem solving
Testing with Mocha, Chai and Jasmine
Postman and cURL for testing APIs, debugging requests, and troubleshooting connectivity issues
Tailwind CSS; component-based UI development + HTML5 + Bootstrap + Chakra UI
AJAX and HTTP requests, including POST, PUT & DELETE methods
CORS: enabling a CORS policy to allow specific domains to communicate with one another
Familiar w/ charting libraries such as Chart.js, D3.js and Plotly for visualizing data in pie charts, line charts and gauges
Databases
MongoDB + Mongoose, MongoDB Compass & Atlas
Vector Database Development
MySQL Database
Seeding data using Mongoose
Sharding, cleaning, and integrating data into pipelines
Designed and implemented MongoDB schemas for a product catalog and user management.
Integrated MongoDB with Express.js for handling API requests (CRUD operations).
Used Mongoose ODM for schema validation and query optimization.
Implemented indexing and aggregation pipelines to improve query performance (see the sketch at the end of this section).
Worked with MongoDB Compass for database inspection and debugging.
Seeded databases using scripts to populate test data.
Data propagation: the process of transferring, updating, and synchronizing data across different systems, databases, or components in a distributed environment.
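A minimal sketch of the seeding, indexing, and aggregation workflow described above, written with Python's pymongo driver rather than Mongoose (the connection string, collection, and fields are illustrative assumptions):

```python
from pymongo import MongoClient, ASCENDING

# Connection string and database/collection names are illustrative assumptions.
client = MongoClient("mongodb://localhost:27017")
products = client["catalog"]["products"]

# Seed a few test documents (analogous to a Mongoose seeding script).
products.insert_many([
    {"name": "Keyboard", "category": "peripherals", "price": 49.99},
    {"name": "Mouse", "category": "peripherals", "price": 19.99},
    {"name": "Monitor", "category": "displays", "price": 149.99},
])

# Index the field used in lookups so queries avoid full collection scans.
products.create_index([("category", ASCENDING)])

# Aggregation pipeline: average price per category, highest first.
pipeline = [
    {"$group": {"_id": "$category", "avg_price": {"$avg": "$price"}}},
    {"$sort": {"avg_price": -1}},
]
for doc in products.aggregate(pipeline):
    print(doc["_id"], round(doc["avg_price"], 2))
```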
Python
Getting familiar with:
~ Object-Oriented Programming and fundamentals of syntax, data structures (lists, dictionaries, tuples, sets) and functions
~ Concurrency/Parallelism: familiarity with threading, multiprocessing, and asyncio for handling parallel computation tasks, especially for high-performance computing in neural networks
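A minimal sketch of the multiprocessing approach named above, fanning a CPU-bound function out across worker processes (the simulate_neuron workload is an illustrative stand-in, not a real model):

```python
from multiprocessing import Pool

def simulate_neuron(weight: float) -> float:
    # Illustrative CPU-bound stand-in for a heavier numerical task.
    return sum(weight * i for i in range(100_000))

if __name__ == "__main__":
    weights = [0.1, 0.2, 0.3, 0.4]
    # Pool distributes the calls across worker processes in parallel.
    with Pool(processes=4) as pool:
        results = pool.map(simulate_neuron, weights)
    print(results)
```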
Machine Learning Frameworks
~ TensorFlow and Keras for deep learning models; neural networks like CNNs, RNNs and GANs
~ PyTorch: deep learning framework
~ Scikit-learn for regression, clustering, and classification
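A minimal scikit-learn classification sketch along the lines of the item above (the built-in iris dataset and logistic regression model are illustrative choices):

```python
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

# Load a small built-in dataset and split it for training/evaluation.
X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=42
)

# Fit a classifier and report held-out accuracy.
model = LogisticRegression(max_iter=1000)
model.fit(X_train, y_train)
print("accuracy:", accuracy_score(y_test, model.predict(X_test)))
```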
Data Handling and Analysis
~ NumPy: for numerical operations, manipulating arrays and matrices for large-scale neural data and image processing
~ Pandas for structured datasets (CSV, Excel and JSON) and data manipulation tasks like filtering, grouping and joining datasets (see the sketch after this list)
~ SciPy: for scientific computing, optimization, linear algebra, statistics and numerical routines
~ Matplotlib/Seaborn: for creating static, animated and interactive plots; used for visualizing neural data and model performance
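A minimal Pandas/NumPy sketch of the filtering, grouping, and joining tasks named in the Pandas item (column names and values are illustrative):

```python
import numpy as np
import pandas as pd

# Illustrative structured data; in practice this might come from CSV/Excel/JSON.
trials = pd.DataFrame({
    "subject": ["s1", "s1", "s2", "s2"],
    "condition": ["rest", "task", "rest", "task"],
    "signal": np.array([1.2, 3.4, 0.9, 2.8]),
})
subjects = pd.DataFrame({"subject": ["s1", "s2"], "age": [24, 31]})

# Filtering, grouping, and joining.
task_only = trials[trials["condition"] == "task"]
mean_by_subject = trials.groupby("subject")["signal"].mean().reset_index()
joined = mean_by_subject.merge(subjects, on="subject")
print(joined)
```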
Deep Learning Specialization
~ Keras: building and training deep learning models
~ Fast.ai: for training deep learning models
~ OpenCV: used for computer vision tasks, such as processing and analyzing brain imaging data like MRI and CT scans
~ Reinforcement Learning (RL): RL algorithms, e.g., Q-learning, policy gradient methods and Deep Q-Networks (DQN), as RL is often used for training agents (see the Q-learning sketch after this list)
~ OpenAI Gym: for designing and training reinforcement learning algorithms in a simulated environment
~ TensorFlow Agents or RLlib: for reinforcement learning specifically
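A minimal tabular Q-learning sketch illustrating the update rule named in the RL item, on a toy one-dimensional corridor (the environment and hyperparameters are illustrative assumptions, not from any particular library):

```python
import random

N_STATES, GOAL = 5, 4          # 1-D corridor; reaching state 4 ends the episode
ACTIONS = [-1, +1]             # move left or move right
alpha, gamma, eps = 0.1, 0.9, 0.1

Q = {(s, a): 0.0 for s in range(N_STATES) for a in ACTIONS}

for _ in range(2000):
    s = 0
    while s != GOAL:
        # Epsilon-greedy action selection.
        if random.random() < eps:
            a = random.choice(ACTIONS)
        else:
            a = max(ACTIONS, key=lambda act: Q[(s, act)])
        s_next = min(max(s + a, 0), N_STATES - 1)
        r = 1.0 if s_next == GOAL else 0.0
        # Q-learning update: Q(s,a) += alpha * (r + gamma * max_a' Q(s',a') - Q(s,a))
        Q[(s, a)] += alpha * (r + gamma * max(Q[(s_next, b)] for b in ACTIONS) - Q[(s, a)])
        s = s_next

# Greedy policy per state after training (should point toward the goal).
print({s: max(ACTIONS, key=lambda act: Q[(s, act)]) for s in range(N_STATES)})
```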
Neuroimaging and Signal Processing
~ MNE-Python: for analyzing brain signals and working with brain data; time-frequency analysis, source localization and visualization of brain activity
~ NIfTI: familiarity with the NIfTI image format for storing neuroimaging data, and tools like NiBabel for reading and writing NIfTI files
~ PyDICOM: working with DICOM files to extract and manipulate medical imaging data
~ SciPy and signal processing: for processing signals and performing tasks like filtering, Fourier transforms and wavelet analysis
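A minimal SciPy signal-processing sketch of the filtering and Fourier-transform tasks named above, run on a synthetic signal (the sampling rate and band edges are illustrative):

```python
import numpy as np
from scipy import signal
from scipy.fft import rfft, rfftfreq

fs = 250.0                                   # sampling rate in Hz (assumption)
t = np.arange(0, 2.0, 1 / fs)
# Synthetic signal: a 10 Hz component plus 60 Hz interference.
x = np.sin(2 * np.pi * 10 * t) + 0.5 * np.sin(2 * np.pi * 60 * t)

# Band-pass Butterworth filter (8-12 Hz), applied with zero-phase filtfilt.
b, a = signal.butter(4, [8, 12], btype="bandpass", fs=fs)
x_filt = signal.filtfilt(b, a, x)

# Fourier transform to inspect the remaining spectral content.
spectrum = np.abs(rfft(x_filt))
freqs = rfftfreq(len(x_filt), 1 / fs)
print("peak frequency:", freqs[spectrum.argmax()])
```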
Deployments & Cloud Computing
Cloud and Infrastructure
AWS/GCP/Azure: familiarity with cloud platforms for managing resources, scaling, and deploying machine learning models and data pipelines
Docker: Containerization for running applications in isolated environments, useful for deploying models and maintaining a consistent development environment.
Kubernetes: For managing containerized applications at scale, especially useful for machine learning model deployment and handling large-scale computational workloads.
Apache Kafka: For real-time data streaming and processing of large volumes of data, especially in high-frequency neural data analysis.
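A minimal sketch of the Kafka streaming pattern described above, assuming the kafka-python client and a broker at localhost:9092 (the topic name and payload are illustrative):

```python
import json
from kafka import KafkaProducer, KafkaConsumer  # assumes the kafka-python client

# Producer: stream sample readings to a topic as JSON.
producer = KafkaProducer(
    bootstrap_servers="localhost:9092",
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)
producer.send("neural-samples", {"channel": 3, "value": 0.42})
producer.flush()

# Consumer: read the same topic from the beginning.
consumer = KafkaConsumer(
    "neural-samples",
    bootstrap_servers="localhost:9092",
    auto_offset_reset="earliest",
    value_deserializer=lambda b: json.loads(b.decode("utf-8")),
)
for message in consumer:
    print(message.value)
    break
```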
Experimentation & Collaboration Tools
Jupyter Notebooks: for interactive data analysis, experimentation and prototyping, especially for research and data exploration.
Weights & Biases: a tool for tracking machine learning experiments and visualizing training runs and performance metrics.
MLflow: tracking machine learning experiments, models and results (see the sketch below).
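A minimal MLflow tracking sketch of the experiment logging described above (the parameter and metric names are illustrative; by default runs land in a local ./mlruns directory):

```python
import mlflow

# Log one training run: a hyperparameter plus a per-epoch metric.
with mlflow.start_run(run_name="demo"):
    mlflow.log_param("learning_rate", 0.01)    # illustrative hyperparameter
    for epoch, loss in enumerate([0.9, 0.6, 0.4]):
        mlflow.log_metric("loss", loss, step=epoch)
```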