CLI configuration depends on a text file on the server, and enjoys user permissions as configured by an admin. The same goes for the web server.
Yes, and as I wrote before, they are different, and that is what counts here. There is no need to maintain two environments; just stick to one.
There are plenty of ways of messing up the security of a web server config (just like the ones you mentioned for messing up CLI configs).
How is this relevant?
Cron can run perfectly well in a container; it's lightweight, standard Unix stuff. It's a basic tool, really.
Yes, for rookies who do not understand the container/task concept.
The idea of a task running in a container is that the orchestrator monitors it; when it is down, the orchestrator spawns a new task in the container environment.
It monitors the main process of your entrypoint.sh. Rookies just start multiple tasks in a single container, and those obviously are not being monitored. Thus if one of those (your cron) crashes, nothing is noticed, because your web application still appears to be running. That is why in OC they have designed e.g. the pod architecture, where multiple processes that belong together are deployed and monitored together.
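As a rough sketch of what I mean (the names and images here are made up, not an official SuiteCRM deployment), a pod runs the web app and the scheduler as sibling containers, each with its own main process that the orchestrator restarts individually when it dies:

```yaml
apiVersion: v1
kind: Pod
metadata:
  name: suitecrm
spec:
  containers:
    # Container 1: the web application; the web server is the main
    # process and is monitored by the kubelet.
    - name: web
      image: suitecrm-web:latest   # hypothetical image name
      ports:
        - containerPort: 80
    # Container 2: the scheduler. Instead of an unmonitored cron daemon
    # running next to the web server, the job loop is this container's
    # main process, so a crash is noticed and only this container restarts.
    - name: scheduler
      image: suitecrm-web:latest   # same hypothetical image
      command: ["sh", "-c", "while true; do php -f /var/www/html/cron.php; sleep 60; done"]
```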
So running a local cron properly requires you to deploy a separate SuiteCRM instance in which you monitor the cron, which is ridiculous if you already have a cron task running for other services.
In the future your SuiteCRM container image should not even include a full OS.
Typically I look at things from the point of view of administering our own Linux machines, not shared hostings.
That is the wrong starting point; your personal preference is of no relevance. You should check what most clients are using, or what goal has been defined for this. I think I even read somewhere on the SuiteCRM website that they are proud everything is in PHP and Symfony so it is available to a large audience.
Currently you would have to add to the website: "SuiteCRM does not run in secured shared environments; if it does run in a shared environment, it is probably not secure."
It’s a no-brainer to run local jobs from a local service, instead of asking a computer on the Internet to call me up to tell me what I should do in my own server. Why create dependencies on external servers and on network connectivity?
This is not really an argument. You can also have local cron run a wget against a site that is running locally. That is the most secure option.
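For example, a minimal crontab sketch (the URL path is an assumption about a typical install; adjust to yours):

```shell
# crontab -e
# Hit the locally running site every minute; traffic never leaves the
# machine, and there is no dependency on an external cron service.
* * * * * wget -q -O /dev/null http://localhost/suitecrm/cron.php
```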
And even on shared hostings, the admins will lock things down (sometimes excessively, but ok, it’s their choice) and you work from within that. They typically provide ways for you to do some basic crontab entries; and the level of user access they give you on the command-line is quite low, you can’t really do much.
So this is an argument for keeping the cron in the web application.
We don't use command-lines and CLI PHP engines because "it's taking the easy way". We use them because local things should be done locally, and a web server is not designed for local things.
What nonsense; sorry, this totally does not make sense. Give me one example.
Opening up SuiteCRM cron jobs to the Internet is actually exposing yourself to really easy denial-of-service attacks, and other mischiefs.
Also not an argument; you can easily secure calls to cron.php.
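For instance (Apache 2.4 syntax; adjust for your own server), restricting cron.php to loopback addresses is a one-liner:

```apache
# Only requests from the local machine may reach cron.php.
<Files "cron.php">
    Require ip 127.0.0.1 ::1
</Files>
```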
Every time this issue has come up here in the forums, I always recommend people just get their crontab configurations right, and forget about external cron services.
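For reference, the kind of crontab entry I mean (the install path is just an example; adjust to your setup, and run it as the web server user):

```shell
# Run the SuiteCRM scheduler every minute, discarding output.
* * * * * cd /var/www/html/suitecrm && php -f cron.php > /dev/null 2>&1
```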
You should not be advising on this. I do not think you have a broad understanding of system administration. I constantly run into web developers who think they need to instruct sysadmins. You should stick to your own environment, creating the application. If you want advice on security, go ask a security expert. Only advise on the area you are an expert in.
And if your shared hosting doesn’t let you access a command-line (only the cheapest ones have that limitation), that’s a great reason to drop it and choose another one.
Do you not start to wonder whether you are wrong when you get such signals?
I’m not against shared hostings, or even against limited shared hostings, they can be great for many things, and even free ones can be valuable tools. But for SuiteCRM - I don’t advise them at all, and never if they don’t give you SSH access.
How many SuiteCRM users are logging in with SSH to check things? 1%?
SuiteCRM is too complex, it’s a full enterprise app with over a million lines of code.
So why make it more complex by having to secure two environments instead of just sticking with one? You still have not given a single argument why the CLI is necessary for SuiteCRM.
It’s also temperamental and buggy, and I need to see things happening under the hood, look for files, grep logs, etc.
Even more reason to keep things simple: stick to one environment, not two. SuiteCRM is just a web application; keep things simple, then it is easier to maintain a higher level of security.
A year ago I looked at openCRX; those guys really seem to be experts. At least their database looks the best designed of what I have seen so far.
The GUI looks like shit, but their business model is, I think, selling customization of the interface.
I really do not get why all these open-source projects just start from scratch and do not look to cooperate with others.
If those openCRX guys are good at database design, just cooperate and use that as the backend.