• 0 Posts
  • 26 Comments
Joined 10 months ago
Cake day: June 7th, 2024






  • Years ago. A client on-site meeting had run long past lunch. I was in a hurry to drive back home and beat the traffic. The drive was normally 1 hour, but 2-3x that during rush hour.

    Saw a sign for a ‘natural’ market. Pulled in. They had an open-face cooler with prepackaged foods and drinks. Sandwiches looked a bit stale. Grabbed a ‘Fresh Vietnamese Shrimp Spring Roll’ and a drink. Hopped on the freeway. Ate in the car.

    Never again.

    PS: Still got stuck in traffic.















  • So… if you own an inexpensive Alexa device, it just doesn’t have the horsepower to process your requests on-device. Your basic $35 device is little more than a microphone and a wifi streamer (ok, it also handles buttons and fun LED light effects). The Alexa device SDK can run on a $5 ESP-32. That’s how little it needs to work on-device.

    Everything you say is sent to the cloud, where it is NLP-processed, parsed, and turned into command intents that are matched against the devices and services you’ve installed. The text is matched against phrase ‘slots’ (the variable parts of each command template), and the results are turned back into voice and played on the speaker.
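    That intent/slot matching step can be sketched roughly like this. This is a toy illustration, not Amazon’s actual pipeline; the intent names and phrase templates are made up:

```python
import re

# Hypothetical intent definitions: each template names its 'slots'
# (the variable parts of the phrase), loosely modeled on how a
# voice-assistant backend matches transcribed text to a command.
INTENTS = {
    "TurnOnDevice": re.compile(r"turn on (?:the )?(?P<device>.+)"),
    "SetTemperature": re.compile(r"set (?:the )?(?P<device>.+) to (?P<value>\d+)"),
}

def match_intent(utterance: str):
    """Return (intent_name, slot_values) for the first matching template, or None."""
    text = utterance.lower().strip()
    for name, pattern in INTENTS.items():
        m = pattern.fullmatch(text)
        if m:
            return name, m.groupdict()
    return None

# The cloud side would route the matched intent to the right skill/service,
# then render the result back to speech for the device to play.
print(match_intent("Turn on the kitchen lights"))
# -> ('TurnOnDevice', {'device': 'kitchen lights'})
```

    Real systems use statistical NLU rather than plain regexes, but the shape is the same: utterance in, intent plus filled slots out.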

    With the new LLM-based Alexa+ services, it’s all in the cloud. Very little of the processing can happen on-device. If you want to use the service, don’t be surprised that your voice commands end up in the cloud. In most cases, they already did.

    If you don’t like it, look into Home Assistant. But last I checked, to keep everything local and not too laggy, you’ll need a super beefy (expensive) local home server. Otherwise, it’s shipping your audio bits out to the cloud as well. There’s no free lunch.