Are you learning Neural Nets from scratch too?

I am considering a 2026 project where a neuron in a neural network doesn’t use floating point. My understanding is that floating-point arithmetic is where a lot of the compute cost lives.

We would be working with discrete dynamical systems.

I’m retired and love these interesting ideas.

I can construct a dynamic neuron.

The thing for me is to learn how to build a full neural network out of it.
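To give a flavor of what I mean (every name and constant below is a placeholder I invented for illustration, not a finished design), here is a tiny integer-only neuron that leaks by bit-shifting and fires against a threshold:

```python
# Hypothetical integer-only "dynamic neuron": all names and constants
# are placeholders, not a reference design.

def step(state, inputs, weights, threshold=1024, leak_shift=4):
    """One update of an integer leaky integrate-and-fire neuron.

    state, inputs, weights, and threshold are plain Python ints;
    the leak is a right-shift, so no floating point is involved.
    """
    state += sum(w * x for w, x in zip(weights, inputs))  # integer MAC
    state -= state >> leak_shift                          # leak via bit shift
    fired = 1 if state >= threshold else 0
    if fired:
        state -= threshold                                # reset after spiking
    return state, fired

# Drive the neuron with a constant input and watch it eventually spike.
state = 0
for t in range(10):
    state, spike = step(state, inputs=[3, 5], weights=[40, 25])
    print(t, state, spike)
```

The whole update is shifts and adds, so the neuron's state evolves as a discrete dynamical system with no floats anywhere.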

If this sounds like a friendship to you, PM me.

-Ernst


Hi Ernst, that sounds like a fascinating deep-dive into efficiency. I’m currently fully committed to engineering a reliability layer for existing LLM architectures (Security Orchestration), so my focus is strictly on SFT and deploying Transformer-based models.

However, looking at papers like Microsoft’s BitNet b1.58 (1-bit LLMs), you are definitely onto something the industry cares about. Moving away from heavy floating-point math is the future. I’ll be cheering for you from the sidelines… Good luck! And thank you for reading my LLM Security posts. 🙂


Yeah, BitNet (https://github.com/microsoft/BitNet) is already astonishing for actually operating with 1.58-bit quantization.
It still isn’t easy to convert pre-existing model weights freely, but it’s fascinating…
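For anyone curious what the trick looks like, here is my rough paraphrase of the absmean weight quantization described in the BitNet b1.58 paper (the function name and epsilon handling are mine; see the paper and repo for the real implementation):

```python
import numpy as np

def absmean_ternary(W, eps=1e-8):
    """Quantize a float weight matrix to {-1, 0, +1} with a per-tensor scale."""
    gamma = np.mean(np.abs(W)) + eps            # absmean scaling factor
    W_q = np.clip(np.round(W / gamma), -1, 1)   # round-and-clip to ternary
    return W_q.astype(np.int8), gamma           # keep gamma to rescale outputs

W = np.random.randn(4, 4).astype(np.float32)
W_q, gamma = absmean_ternary(W)
print(W_q)
# At inference, y ≈ (x @ W_q) * gamma, so the matmul itself needs no
# multiplications at all, only additions and subtractions.
```

Since the quantized weights are only {-1, 0, +1}, the expensive floating-point multiplies disappear from the matmul, which is exactly why the efficiency numbers in the paper look the way they do.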