Obviously, that molecular program makes countless selective state-machine, algorithmic decisions — Sir Philo Sophia

No. Decisions they do not make. Informally, the distinction does not really matter because most folks will suppose they know what you mean. But you want a scientific definition. And that's not going to happen with the language you're using and how you use it. But here I leave you to it. — tim wood
Amen, and a small point. Anything can be defined by anyone, any way they choose to define it; whether the definition is any good is a different topic. Insofar as the definition is a text intended to convey a definite meaning, with respect to that text the language matters; it is in fact the first and arguably the only thing that matters.
In literature there is the concern for le mot juste, the right word. I imagine in the sciences as well, perhaps as the correct word. And do the sciences have their own phrase for that? — tim wood
This paper proposes a theory for understanding perceptual learning processes within the general framework of laws of nature. Neural networks are regarded as systems whose connections are Lagrangian variables, namely functions depending on time. They are used to minimize the cognitive action, an appropriate functional index that measures the agent's interactions with the environment. The cognitive action contains a potential and a kinetic term that nicely resemble the classic formulation of regularization in machine learning. A special choice of the functional index, which leads to fourth-order differential equations---Cognitive Action Laws (CAL)---exhibits a structure that mirrors the classic formulation of machine learning. In particular, unlike the action of mechanics, the stationarity condition corresponds to the global minimum. Moreover, it is proven that typical asymptotic learning conditions on the weights can coexist with the initialization, provided that the system dynamics is driven under a policy referred to as information overloading control. Finally, the theory is tested on the problem of feature extraction in computer vision. — Cognitive Action Laws: The Case of Visual Features
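To make the abstract's analogy concrete, here is a minimal numerical sketch of an action functional for learning. It is not the paper's actual formulation (the function name, scalar weights, and constants are all my own simplifications): a weight trajectory w(t) over discrete time steps accumulates a potential term (loss plus weight decay, the classic regularizer) and a kinetic term penalizing fast weight changes.

```python
def cognitive_action(weights, losses, dt=1.0, decay=0.01, mass=1.0):
    """Discretized action for a scalar weight trajectory.

    weights: list of floats w(t_k); losses: list of loss values U(t_k).
    Potential = loss + weight decay (standard regularization);
    kinetic = (1/2) m * (finite-difference velocity)^2.
    """
    potential = sum(u + decay * w ** 2 for w, u in zip(weights, losses)) * dt
    kinetic = sum(
        0.5 * mass * ((w1 - w0) / dt) ** 2
        for w0, w1 in zip(weights, weights[1:])
    ) * dt
    return kinetic + potential

# A constant trajectory has zero kinetic cost; a jumpy one is penalized.
flat = cognitive_action([1.0, 1.0, 1.0], [0.5, 0.5, 0.5])    # 1.53
jumpy = cognitive_action([1.0, 3.0, 1.0], [0.5, 0.5, 0.5])   # 5.61
```

The point of the analogy is only that minimizing such a functional trades off fitting the data (potential) against smooth temporal evolution of the weights (kinetic), which is what the quoted abstract claims the cognitive action formalizes.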
I quibbled a bit and suggested this: anything which results in a change — Metaphysician Undercover
Outlander, this: that which causes change — tim wood
And @SophistiCat instructs us all.
Forward action (negentropy?):
Stagnation (plateau, static positioning):
Backward action (entropy): — Outlander
Once you define your Lagrangian (a mathematical object), then the definition of action follows straightforwardly from that. But how the Lagrangian is cashed out in physical terms is going to vary from one theory to another. It is one thing in non-relativistic classical mechanics, another in relativistic classical mechanics, yet another in quantum mechanics, etc. — SophistiCat
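For readers following along, the point can be made explicit with the standard definitions from physics: the action is always the same functional of the Lagrangian,

```latex
S[q] \;=\; \int_{t_1}^{t_2} L\bigl(q, \dot{q}, t\bigr)\, dt ,
```

while the Lagrangian itself differs by theory. For a non-relativistic particle it is kinetic minus potential energy,

```latex
L = T - V = \tfrac{1}{2} m \dot{q}^{2} - V(q) ,
```

whereas a free relativistic particle has

```latex
L = -\, m c^{2} \sqrt{1 - v^{2}/c^{2}} ,
```

which is what SophistiCat means by the Lagrangian being "cashed out" differently in each theory even though the definition of action is uniform.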
Here is a random example from the literature:
This paper proposes a theory for understanding perceptual learning processes within the general framework of laws of nature — SophistiCat
Clearly, there are no holonomic constraint equations possible for particles under "intelligent control" — Sir Philo Sophia
Seems obvious to me — Sir Philo Sophia
I am curious, is this your personal theory about animate matter, or did you read it somewhere? — SophistiCat