"10k+ users" doesn't mean much on its own. What's the actual expected number of submissions, how much data, how much processing time per calculation? Given how fast computers are, there's a fair chance a single server could handle the whole thing.
I don't see the use case for edge functions in most projects anyway. Aren't you better off processing requests in one data center, close to the data and the main business logic? Then it's also easier to put something in front to handle rate limiting and such.
Those concepts are handled by the backend, and there are tons of established frameworks/solutions, some of which have been around for 20 years. I guess there's just not much hype or marketing around them since they solved all of this ages ago and people just use them.
I generally recommend sticking to a single server for a long time. Most services see barely a hundred concurrent requests, likely not even that, and are basically idling 99% of the time. Of course you can use a CDN and such, since they are essentially free for static files. The important thing is to have proper architecture from the start so it's possible to refactor later without affecting the rest of the codebase. Nothing fancy, the usual interface/implementation separation often does the job.
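A minimal TypeScript sketch of what that interface/implementation separation means in practice; all names here are made up for illustration:

```typescript
// Business logic depends only on this interface, so the storage
// backend can be swapped later without touching any callers.
interface User {
  id: number;
  name: string;
}

interface UserRepository {
  findById(id: number): User | undefined;
}

// Today: a trivial in-memory implementation behind the interface.
class InMemoryUserRepository implements UserRepository {
  private users = new Map<number, User>([[1, { id: 1, name: "alice" }]]);
  findById(id: number): User | undefined {
    return this.users.get(id);
  }
}

// Business logic never references the concrete class, only the interface.
function greet(repo: UserRepository, id: number): string {
  const user = repo.findById(id);
  return user ? `hello, ${user.name}` : "unknown user";
}
```

Later you replace `InMemoryUserRepository` with e.g. a Postgres-backed one, and `greet` and everything like it stays untouched.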
It's a bit strange yes since many backend frameworks come with auth, db integrations etc. built-in or easily pluggable.
The obvious question is why anyone would run a CLI tool made by a stranger that executes arbitrary code on their machine. If you can provide the source files directly so they can be copied into the project, it's worth considering; otherwise it's a hard no.
Any backend will do and they have these things ready. Laravel, django etc. are fine.
What's wrong with a boring old-fashioned setup: frontend on a CDN/nginx or something, then run backend instance(s) as you wish. It's also cheap and there's no need for external services.
Avoid building around any third-party solutions, be it code or services. Authentication is a separate, isolated step in request processing that produces a result, e.g. an authentication error or "this is user 12345 with role foo"; the rest of the processing just continues from there. How the authentication is implemented doesn't matter, use whatever as long as it gives a standardised result. Often this is implemented by saving the data, e.g. the user object, to the request context (or similar), which is then passed to the business logic. Authorisation checks there are essentially just conditionals, e.g. ensuring a role or resource ownership.
I wouldn't involve nextjs in it since the architecture makes it unnecessarily complex. Using any normal backend framework has you covered and the solutions have been in use for 15 years, there's nothing new or interesting in those. Also defaulting to sessions is fine for most apps, they will never get past 500 requests per second to worry about **massive** scaling etc.
Separate stats for static files, then analytics at the API level. Maybe Supabase is what makes it harder; it's definitely easier to handle with a "traditional" backend.
Running a DB and some backend framework should do it just fine. Not sure why people go for these external services when e.g. firing up a Laravel project gives all essentials already as local plain code.
Stick to fully client-side rendering for such a use case. How I'd approach it: create a robust API client that manages data loading, network errors and token renewals behind the scenes. No need for auth providers and such; just initialise and store the user status/role in e.g. localStorage or a JS-accessible cookie and read it from there when needed, e.g. for UI rendering. You can obviously use TanStack Query and such and let it handle caching.
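A rough TypeScript sketch of the token-renewal part of such a client. The storage interface stands in for `localStorage` and the fetch/renew functions are injected, so the sketch runs anywhere; every name here is hypothetical:

```typescript
// Minimal key-value interface compatible with window.localStorage.
interface KeyValueStore {
  getItem(key: string): string | null;
  setItem(key: string, value: string): void;
}

class ApiClient {
  constructor(
    private store: KeyValueStore,
    // In a browser this would wrap fetch(); injected here for testability.
    private doFetch: (path: string, token: string) => Promise<{ status: number; body: string }>,
    private renewToken: () => Promise<string>,
  ) {}

  async get(path: string): Promise<string> {
    let token = this.store.getItem("token") ?? "";
    let res = await this.doFetch(path, token);
    if (res.status === 401) {
      // Renew the token behind the scenes and retry once;
      // calling code never sees the 401.
      token = await this.renewToken();
      this.store.setItem("token", token);
      res = await this.doFetch(path, token);
    }
    return res.body;
  }
}
```

UI code just calls `client.get("/api/foo")` and never deals with tokens directly.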
Well, you need some profiling to see where the time is actually spent... If it's just a profile page, I'd assume there's only e.g. token validation, pulling the id from it, and a DB query. All of those should happen within milliseconds, so there's definitely something wrong, but I can't say what...
Next.js doesn't need access to anything sensitive; just use it for the frontend/BFF. Then run an actual backend and DB in whatever hosting environment suits the requirements best, e.g. a Canadian provider. No reason to use cloud services.
One point I agree with: often it's easier to simply block, get the data and render, instead of going into the suspense/skeleton/spinner stuff. You don't even need SQLite; regular SQL server response times can be as little as a few milliseconds if it's close. Sometimes it feels like a lot of these hyped "performance optimizations" are more about hiding bad performance than making the actual work that needs to be done fast.
AWS or any standard backend works fine. Just fire up e.g. Laravel or Django project and you got most of the thing already set up.
postgres
I would have started with a custom backend server and db schema for this. WP and its plugins' db models are terribly inefficient. Obviously it means more development time but long term it's surely worth it.
Aren't you checking the user's credentials and determining whether they have the right to call some feature X? To be honest I didn't quite understand. Assume you have an endpoint /api/foo. There you do an authorisation check to determine whether this user has the right to, e.g., get the order details for order id 12345 or whatever. How could anyone bypass that unless your implementation is flawed...
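To make it concrete, a hypothetical TypeScript handler for that /api/foo example; the order data and ids are invented:

```typescript
interface Order {
  id: number;
  ownerId: number;
  details: string;
}

// Fake data store; order 12345 belongs to user 7.
const orders = new Map<number, Order>([
  [12345, { id: 12345, ownerId: 7, details: "2x widgets" }],
]);

// userId comes from the already-authenticated request,
// orderId from the URL. The ownership check is a plain conditional.
function getOrderDetails(
  userId: number,
  orderId: number,
): { status: number; body: string } {
  const order = orders.get(orderId);
  if (!order) return { status: 404, body: "not found" };
  if (order.ownerId !== userId) return { status: 403, body: "forbidden" };
  return { status: 200, body: order.details };
}
```

There's nothing to "bypass": any request for someone else's order hits the conditional and gets a 403.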
I just use an external backend; any mature framework comes with pretty much everything necessary built-in.
Which benefits are those in your use case? Storing user data in the browser isn't necessarily bad; stuff like login status or username is known to the user anyway. Persisting it in the browser allows rendering the correct UI immediately, without network requests, e.g. in case of a reload.
What about just making it a SPA... on load, initialise the user state, persist it e.g. in localStorage, and just read the data when rendering.
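A minimal TypeScript sketch of that flow; the storage interface stands in for `window.localStorage` so the sketch also runs outside a browser, and all names are invented:

```typescript
// Minimal key-value interface compatible with window.localStorage.
interface KeyValueStore {
  getItem(key: string): string | null;
  setItem(key: string, value: string): void;
}

interface UserState {
  loggedIn: boolean;
  username: string;
  role: string;
}

// On app load: initialise the user state (normally the result of one
// API call) and persist it.
function initUserState(store: KeyValueStore, state: UserState): void {
  store.setItem("userState", JSON.stringify(state));
}

// When rendering: read the persisted state synchronously,
// no network request required.
function renderGreeting(store: KeyValueStore): string {
  const raw = store.getItem("userState");
  if (!raw) return "Please log in";
  const state: UserState = JSON.parse(raw);
  return state.loggedIn ? `Welcome back, ${state.username}` : "Please log in";
}
```

On reload the UI renders correctly straight from storage, and a background refresh can update the state afterwards if needed.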
Use what makes sense based on **your** actual requirements. What functionality is required, what's the load profile, what are the performance requirements etc. Then start thinking about languages and stacks. For the majority (?) of apps the load is so small they can run on a $10 instance, and it will be idling 99% of the time anyway. Personally I'd go with Django, Laravel or Go for the backend; the first two especially have basically everything necessary as a ready template, all in local code. You get users, auth, a DB layer, admin dashboards etc. Then run whatever for the frontend/BFF.
I've been doing this crap for a long time and I don't see any reason to stray from a simple working system: let your backend handle users and auth. It's where the business logic happens, and it's close to the data as well. Any mature backend framework comes with auth, some even built-in, so when you fire up a new project with e.g. Django you can just toggle auth on and it will create login/register routes, forgot-password features etc. Obviously you can add whatever providers you want. It's so weird to see people talking about auth all the time when it has been a solved problem for ages.
Better to separate the backend side from Next entirely. The frontend is basically free since most of it can be just static file hosting. Storage shouldn't be a problem; just run e.g. Postgres and a CMS of your choice.