Draft 1. 04/28/18. Updated twice as of 08/17/18. Updated again 02/04/19.
Rather than a Chomskyan approach to language, I find the use of relations, or ordered pairs, more pleasing.
Actually, this notion came to mind while reading Russell (one of his two mid-century books on epistemology; citation to come), where he talked about meaningless sentences. His position was that sentences tokenize facts, and those that don't are meaningless. I.e., there is a correspondence between a sentence and some event in spacetime if the sentence (proposition) is true; if there is a proposed correspondence to a fact that turns out not to be a spacetime event, then the sentence is false. Russell went into some subtleties about the logic of the "not" coupler, but those will not detain us.
Though I am sympathetic to the notion that sentences -- or, anyway, propositions -- represent or tokenize facts, it seems we must then cope with pseudo-facts, as in the pseudo-events of Hamlet and the pseudo-person Hamlet. How many ways must we slice "to be" and "not to be"? Russell found quite a few, but be that as it may, I highly recommend Russell's analysis, even though I prefer a different approach from the one he advised.
What is my approach? I consider a propositional sentence in English (and I suggest in any earthly human language) to be decomposable into sets of relations. (I have yet to read Carnap on this topic. Once I do, I may modify this essay accordingly.)
Let's begin with a complete sentence that lacks an object or other non-verb predicate.
Sally ran.
In this case we have the relation ran which pairs the subject, Sally, with nothing, or the null set.
So this gives sR∅ or ran< s, ∅ >. Because s refers to a unique individual (Sally), not a variable, the matrix for this ordered pair contains only that pair. (In the interests of completeness, we note that the empty matrix is a subset of every relation, as in R <∅,∅>. Though one considers two elements to be "related" by some verb, a relation here means a set which includes the elements under the relation-word. So the set of all declarative sentence relations, for example, must have a null set as a subset, which would be ∪ Ri <∅,∅>.)
Note that in our method, the principal verb (which can include a "composite verb" -- see below) serves as the relation.
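To make this bookkeeping concrete, here is a minimal Python sketch, under the assumption that a verb-relation can be stored as a plain set of (subject, predicate) pairs; the names ran and holds are my own illustrative inventions, and None stands in for the null set.

```python
# A minimal sketch: the verb "ran" as a relation, i.e. a set of ordered pairs.
# None stands in for the null set ∅, so ("Sally", None) encodes "Sally ran."

ran = {("Sally", None)}            # ran<s, ∅>: the matrix holds only this pair

def holds(relation, subject, predicate=None):
    """True if the ordered pair <subject, predicate> belongs to the relation."""
    return (subject, predicate) in relation

print(holds(ran, "Sally"))         # True  -- "Sally ran."
print(holds(ran, "Hamlet"))        # False -- pair not in this relation's matrix
```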
A more common situation is given by sentences such as:
Sally ran home.
Sally ran fast.
Sally ran yesterday.
Sally ran for her life.
The object answers the question, "Where did she run?"
The adverb (non-verb predicate word) answers the question, "How did she run?"
The time element (non-verb predicate word) answers the question, "When did she run?"
The explainer answers the question, "Why did she run?"

So then we have for the relation ran a set A composed of all words suitable for a subject and a set B composed of all words that are formally usable as a non-verb predicate word. The set A X B is then the set of ordered pairs of subjects and non-verb predicates under the relation ran.
A X B contains any such pair and many of those pairs may not tokenize facts. The problem here is that suppose the "meaningless" sentence has some ineffable meaning in some poem somewhere? It seems more useful to say that there is a C ⊂ A X B which contains pairs that have a relatively high probability of tokenizing some idea, concept or fact familiar to many people. I have not mathematically quantified the terms "relatively high probability" or "many people" because a high degree of complexity is implied, though I suggest these areas can be got at via information theory concepts.
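A rough sketch of this idea follows, with toy sets A and B and a stand-in scoring function; the likelihood function and its 0.5 cutoff are assumptions for illustration, since the essay leaves "relatively high probability" unquantified.

```python
from itertools import product

# Toy subject set A and non-verb-predicate set B for the relation "ran".
A = {"Sally", "the horse", "the idea"}
B = {"home", "fast", "yesterday", "for her life"}

AxB = set(product(A, B))   # the full matrix A X B of ordered pairs under "ran"

# Stand-in for the unquantified "relatively high probability of tokenizing a
# familiar idea"; in practice this might come from corpus statistics.
def likelihood(pair):
    subject = pair[0]
    return 0.9 if subject != "the idea" else 0.1

C = {pair for pair in AxB if likelihood(pair) > 0.5}   # C ⊂ A X B
print(len(AxB), len(C))    # C keeps only the "routine" pairs
```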
A sentence such as Sally ran home for her life yesterday can be handled as the relation R < a, x > where a is the constant Sally and x is a member of the set {home, for her life, yesterday}. To be precise, we should write ∪ Ri < a,x >.
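As a sketch, such a sentence can be stored as the union of one pair per non-verb predicate; the splitting of the sentence into these three predicates is assumed rather than derived.

```python
# "Sally ran home for her life yesterday" as a union of ran<a, x> pairs,
# one pair per non-verb predicate: ∪ Ri<a, x>.
predicates = ["home", "for her life", "yesterday"]
ran_union = {("Sally", x) for x in predicates}
print(ran_union)
```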
With this approach in mind we look at Chomsky's classic example:
Colorless green ideas sleep furiously.
This is the sort of proposition Russell would have termed meaningless because it represents no fact in our actual world. Yet, others would say that for the typical English speaker it resonates as formally correct even if silly. And what if it is part of a poem?
Colorless green ideas sleep furiously
when the sun goes down in Argotha,
a timeless, though industrious hamlet
which straddles the borderline
between Earth and Limbo
Umm, well, let's not stretch this too far, and get back to business.
We have the subject, Colorless green ideas. Here we can decompose the structure into ordered pairs thus:
The subject has the property, or attribute, colorless green. But we notice that the adjective colorless is inconsistent with the subject modifier green. That is, we are unlikely to accept an ordered pair used as a subject modifier that is logically inconsistent. But we might. For example, the writer might be trying to convey that the green was so dull as to metaphorically qualify as colorless.
So for the subject's modifiers we use the relation "has the property or attribute of," as in:
(cPg)Pi
The section in parentheses is the relation green has the property of colorlessness, which is part of the relation idea has the property of colorless green.
The full sentence is then {(cPg)Pi}Sf, where S is the relation sleep and furiously is the adverb.
The ordered pair notation gives < < < c, g >, i >, f >, where I have left the relation symbols implicit.
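One possible encoding of that nesting, with the relation labels carried along explicitly rather than left implicit; the tuple layout here is an illustrative choice, not a canonical one.

```python
# Nested ordered pairs for "Colorless green ideas sleep furiously",
# keeping the relation labels explicit: P = "has the property of", S = "sleep".
cPg = ("P", ("colorless", "green"))      # green has the property of colorlessness
cPgPi = ("P", (cPg, "ideas"))            # ideas have the property colorless green
sentence = ("S", (cPgPi, "furiously"))   # {(cPg)Pi} S furiously
print(sentence)
```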
We note here that green can serve as either a noun or secondary modifier (adjective), but not so for colorless. So the form green colorless ideas requires an implied comma between green and colorless so as to indicate that each modifier modifies the subject as a primary modifier and not as a modifier that modifies a modifier. Of course, by altering the morphology of colorless so as to project a noun, colorlessness, we are able to show that the relation is, if so desired, reflexive, as in green colorlessness.
It will be objected that ideas don't sleep; people sleep. Yet, ideas do, it seems, percolate. An idea that is ahead of its time might very well be said to be slumbering in the collective consciousness of the intelligentsia. If the idea is being suppressed for ideological reasons, it could be said to be persevering in an agitated, even furious sleep. So what we say of Chomsky's sentence is that it is structurable as a relation in a full matrix of ordered pairs, but that in the subset of routine ordered pairs it is not to be found. I have not defined "routine" -- again because this would require some spadework in information theory, which I have not done.
In any event, a relation for, say, a simple declarative sentence can be styled xRy, or R< x, y >. This compares to such notation as P(x,y), in which P represents a predicate and, in this case, x and y two terms. So the proposition 2 + 1 = 3 requires that "2 + 1 =" be the predicate, with 3 a term, or that "3 =" be the predicate with "2 + 1" a term. I.e., P tokenizes "2 + 1 =" and "3" is a term. We abandon this standard notation in favor of the more compact and logically succinct relation notation.
So then we are able to write R< x, y > in which the only required constant is the principal verb, which is the relation.
Of course, many sentences use specific or particular descriptives for subjects. Even the word "they" is usually implicitly particular. Generally, we have some idea of who is meant by the words They laughed. In other words, the subject term of a relation pair will often be a constant. Yet that set would be a subset of the set of ordered pairs in which variables are used for both subject and non-verb predicate.
A point of which to be aware: symmetry obtains in the general matrix even if some of the pairs are deemed meaningless in Russell's sense. But symmetry may very well not be acceptable in a "probabilistic" subset. Horses eat hay tokenizes a Russellian fact, but Hay eats horses will be cast into outer darkness.
This seems a good place to reflect on the quantifier all. In honor of Russell, we shall write "all x" as (x). If we want to say that the proposition P holds for all x we may write (x)P or (x)Px. Now how does this quantifier work out in our system of relations? I suppose we have to be aware of levels. We write R < x, y >, where the use of the letters x and y implies arbitrary elements of the entire cross product. So we may apply a truth value to the whole cross product, to a cross product subset or to a single ordered pair, in which each element is constant. This line of thought corresponds to the "all," "some" or "none" quantifiers.
So for Russell, Horses eat hay is a proper proposition with a truth value T, and Hay eats horses is also a proper proposition that carries a truth value of F. But, Ideas sleep furiously correlates with no fact and so is meaningless and so carries no truth value. But, to controvert Russell, we note that language entwines the art of metaphor, invention and novelty, and so we cannot be sure that a "meaningless" construct won't say something meaningful to someone sometime.
Let us note something further on quantification, while we're at it. Horses eat hay is generally accepted as true. But does this mean all horses eat hay? or perhaps all horses are inclined to eat hay? or maybe most horses will thrive on hay? I cast my vote for the last option. This sort of ambiguity prompted Russell to argue in favor of strict symbolic notation, which, it was hoped, would remove it. From my point of view, the relation E< a,b > expresses a subset of ordered pairs of things that eat and things that are eaten, to wit the subset of horses and the set of hay (where elements might be individual shoots, or bales, or packets). So we may be claiming that T holds for this entire subset. Or, we may wish to apply a truth value only after inserting the standard quantifier "some," which is a noun or subject modifier. One may write ∃E< a,b > if one can endure the abuse of notation. The notion of most is rather convenient and seems to warrant a quantifier-like symbol. We will use ⥽. Hence ⥽E means most elements of this set have the truth value T. Or, better, there is a subset of E< a,b > such that its complement contains at least one fewer member than it contains. By the way, use of the "exactly one" quantifier ∃! permits us to justify the "most" symbol, as the reader can easily work out for himself.
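Here is a small sketch of how the "all," "some," and "most" readings could be applied to a relation's pairs; the toy pairs and the eats_hay test stand in for a genuine check against facts, and are assumptions for illustration only.

```python
# Toy relation E<a, b>: things that eat paired with things eaten.
E = {("horse-1", "hay"), ("horse-2", "hay"), ("horse-3", "oats")}

def eats_hay(pair):
    return pair[1] == "hay"           # stand-in truth predicate

def all_q(rel, pred):                 # (x)P  -- "all"
    return all(pred(p) for p in rel)

def some_q(rel, pred):                # ∃     -- "some"
    return any(pred(p) for p in rel)

def most_q(rel, pred):                # ⥽     -- "most": truths outnumber the rest
    truths = sum(1 for p in rel if pred(p))
    return truths > len(rel) - truths

print(all_q(E, eats_hay), some_q(E, eats_hay), most_q(E, eats_hay))
# False True True -- "Horses eat hay" read as "most" rather than "all"
```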
We should also account for transitivity.
Case 1) Different relations
Example: Horses eat hay, hay feeds some animals
or E< a,b > • F < b,c >
Case 2) Same relation
Example: Men marry women, women marry for security
or M < a,b > • M < b,c >
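A minimal sketch of both cases as relation composition on the shared middle term; the pair sets are invented toy data.

```python
# Case 1: different relations. E = "eat", F = "feeds": compose on the middle term.
E = {("horses", "hay")}
F = {("hay", "some animals")}

def compose(R, S):
    """Pairs <a, c> such that <a, b> is in R and <b, c> is in S for some b."""
    return {(a, c) for (a, b1) in R for (b2, c) in S if b1 == b2}

print(compose(E, F))      # {('horses', 'some animals')}

# Case 2: the same relation M = "marry" composed with itself.
M = {("men", "women"), ("women", "for security")}
print(compose(M, M))      # {('men', 'for security')}
```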
We also have symmetry.
Example: Women marry men, men marry women
M < a,b > <--> M < b,a >
Other examples of symmetry:
u = v
Al was forced to face Al
In this last example, notice that we permit the action element to be a composite, where exist, is and be are regarded as actions. I.e., the relation is "was forced to face."
I have not carefully analyzed such composites, as I am mostly interested in the fact that the action element relates a subject to something that is acted upon. It is the function that seems important to me, as opposed to the details of the interior of the function.
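And a sketch of the symmetry test on a relation's pairs; whether a given relation really is symmetric over some corpus of pairs is of course an empirical matter, and the pairs below are toy data.

```python
# Symmetry: M<a, b> holds exactly when M<b, a> holds.
M = {("women", "men"), ("men", "women")}

def is_symmetric(R):
    return all((b, a) in R for (a, b) in R)

print(is_symmetric(M))                       # True
print(is_symmetric({("Al", "Al")}))          # True: "Al was forced to face Al"
print(is_symmetric({("horses", "hay")}))     # False: "Hay eats horses" is absent
```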
A quick look at the is or existential relation:
The king of France is bald,
which may be taken as equivalent to
The king of France has the property or attribute of baldness.
Associated with the is relation is a complete set A X B such that A contains all English nouns and B contains all English words used for properties. Besides a property word or adverb, one might have an object in the non-verb predicate, as in a horse or the concierge.
Russell in 1905 argued that, because the sentence implicitly asserts the proposition, "There at present exists a king of France and that king is bald," the sentence is false because the first clause in the rewritten proposition is false. Our view is that the subset of relations to which truth values are attached is somewhat malleable. I.e., as discussed above, something might be true in a very limited context while in general not being taken as true.
For example, suppose the sentence is part of a limerick: The king of France is bald, and also quite the cuckold... etc. One would not apply a truth value to the sentence in a general way, but for the case of "suspended disbelief" that we humans deploy in order to enjoy fictions, the sentence is held to be true in a very narrow sense. At any rate, we find that, though it has no general truth value, we cannot consign it to the set of "meaningless" sentences. We have managed to put it into a context that gives it meaning, if we mean by that word something beyond gibberish.
What of such counterfactual objects as "the gold mountain" or "the round square"?
We add the implicit verb exists so as to obtain the relation:
gE∅ and rE∅
It is claimed that these objects do not exist and so fall under the subset of false propositions. That would be the conventional judgment. In the case of a gold mountain, we are talking about something that has never been observed, but there is always the faint possibility that such an object may be encountered. So in that case the ordered pair < g, ∅ > would perhaps be placed in the complement set of the truth value set.
As for "the round square," we are faced with a contradiction. We can make this absurdly plain by writing "A square object contains four right angles on its perimeter" and "A round (or circular) object has no finite angles on its perimeter, or it has an infinitude of what some call infinitesimal angles."
And that gives: "A perimeter with four right angles, which are represented by a finite number, is a perimeter with no finite number of angles." So there is no issue with placing the "round square exists" relation in the subset of relations deemed false.
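A sketch of the truth-value bookkeeping for the existence relation, with an "undecided" set holding pairs like the gold mountain; the partition into three sets is my own illustrative assumption.

```python
# Existence relation E<x, ∅>, partitioned into truth-valued and undecided subsets.
E_true = {("horses", None)}
E_false = {("the round square", None)}          # internally contradictory
E_undecided = {("the gold mountain", None)}     # never observed, not ruled out

def truth_value(pair):
    if pair in E_true:
        return True
    if pair in E_false:
        return False
    return None                                  # no truth value assigned yet

print(truth_value(("the gold mountain", None)))  # None
```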
Russell, in a lecture published in 1918, objects to Meinong's way of treating the round square as an object.
Meinong maintains that there is such an object as the round square only it does not exist, and it does not even subsist, but nevertheless there is such an object, and when you say “The round square is a fiction,” he takes it that there is an object “the round square” and there is a predicate “fiction.” No one with a sense of reality would so analyze that proposition. He would see that the proposition wants analyzing in such a way that you won’t have to regard the round square as a constituent of that proposition. To suppose that in the actual world of nature there is a whole set of false propositions going about is to my mind monstrous. I cannot bring myself to suppose it. I cannot believe that they are there in the sense in which facts are there. There seems to me something about the fact that “Today is Tuesday” on a different level of reality from the supposition “That today is Wednesday.” When I speak of the proposition “That today is Wednesday” I do not mean the occurrence in future of a state of mind in which you think it is Wednesday, but I am talking about the theory that there is something quite logical, something not involving mind in any way; and such a thing as that I do not think you can take a false proposition to be. I think a false proposition must, wherever it occurs, be subject to analysis, be taken to pieces, pulled to bits, and shown to be simply separate pieces of one fact in which the false proposition has been analyzed away. I say that simply on the ground of what I should call an instinct of reality. I ought to say a word or two about “reality.” It is a vague word, and most of its uses are improper. When I talk about reality as I am now doing, I can explain best what I mean by saying that I mean everything you would have to mention in a complete description of the world; that will convey to you what I mean. Now I do not think that false propositions would have to be mentioned in a complete description of the world. False beliefs would, of course, false suppositions would, and desires for what does not come to pass, but not false propositions all alone, and therefore when you, as one says, believe a false proposition, that cannot be an accurate account of what occurs.
I suppose Russell preferred a different definition of "object" over Meinong's.
But our way of treating the proposed object shows that no such object is "acceptable" because of the internal contradiction, which makes the relation "A/the round square exists" an element of the falsehood subset of R X ∅.
Point to Russell.
Russell saw the value of the use of relations for linguistic purposes as far back as 1918, as we can see from this excerpt from The Philosophy of Logical Atomism, Lecture V:
Now I want to come to the subject of completely general propositions and propositional functions. By those I mean propositions and propositional functions that contain only variables and nothing else at all. This covers the whole of logic. Every logical proposition consists wholly and solely of variables, though it is not true that every proposition consisting wholly and solely of variables is logical. You can consider stages of generalizations as, e.g.,
“Socrates loves Plato” “x loves Plato” “x loves y” “x R y.”
There you have been going through a process of successive generalization. When you have got to xRy, you have got a schema consisting only of variables, containing no constants at all, the pure schema of dual relations, and it is clear that any proposition which expresses a dual relation can be derived from xRy by assigning values to x and R and y. So that that is, as you might say, the pure form of all those propositions. I mean by the form of a proposition that which you get when for every single one of its constituents you substitute a variable. If you want a different definition of the form of a proposition, you might be inclined to define it as the class of all those propositions that you can obtain from a given one by substituting other constituents for one or more of the constituents the proposition contains. E.g., in “Socrates loves Plato,” you can substitute somebody else for Socrates, somebody else for Plato, and some other verb for “loves.” In that way there are a certain number of propositions which you can derive from the proposition “Socrates loves Plato,” by replacing the constituents of that proposition by other constituents, so that you have there a certain class of propositions, and those propositions all have a certain form, and one can, if one likes, say that the form they all have is the class consisting of all of them. That is rather a provisional definition, because as a matter of fact, the idea of form is more fundamental than the idea of class. I should not suggest that as a really good definition, but it will do provisionally to explain the sort of thing one means by the form of a proposition. The form of a proposition is that which is in common between any two propositions of which the one can be obtained from the other by substituting other constituents for the original ones. When you have got down to those formulas that contain only variables, like xRy, you are on the way to the sort of thing that you can assert in logic.
To give an illustration, you know what I mean by the domain of a relation: I mean all the terms that have that relation to something. Suppose I say: “xRy implies that x belongs to the domain of R,” that would be a proposition of logic and is one that contains only variables. You might think it contains such words as “belong” and “domain,” but that is an error. It is only the habit of using ordinary language that makes those words appear. They are not really there. That is a proposition of pure logic. It does not mention any particular thing at all. This is to be understood as being asserted whatever x and R and y may be. All the statements of logic are of that sort.
It is not a very easy thing to see what are the constituents of a logical proposition. When one takes “Socrates loves Plato,” “Socrates” is a constituent, “loves” is a constituent, and “Plato” is a constituent. Then you turn “Socrates” into x, “loves” into R, and “Plato” into y. x and R and y are nothing, and they are not constituents, so it seems as though all the propositions of logic were entirely devoid of constituents. I do not think that can quite be true. But then the only other thing you can seem to say is that the form is a constituent, that propositions of a certain form are always true: that may be the right analysis, though I very much doubt whether it is.
There is, however, just this to observe, viz., that the form of a proposition is never a constituent of that proposition itself. If you assert that “Socrates loves Plato,” the form of that proposition is the form of the dual relation, but this is not a constituent of the proposition. If it were you would have to have that constituent related to the other constituents. You will make the form much too substantial if you think of it as really one of the things that have that form, so that the form of a proposition is certainly not a constituent of the proposition itself. Nevertheless it may possibly be a constituent of general statements about propositions that have that form, so I think it is possible that logical propositions might be interpreted as being about forms.
Russell gave a discussion of the philosophy of relations in The Principles of Mathematics (1903), in which he contrasted the "monadistic" outlook with the "monistic." The monistic view is that a relation is all of a piece with the relata (the objects of the relation). For example, if we regard "x is greater than y" as a relation, we find that there is a relation between the string within quotation marks and the constituents x and y, which, says Russell, leads to contradiction. He favors a string written "x is greater than" (from what I can gather), but this form has its own problems.
My form requires that nouns or gerunds, or noun or gerund phrases, are paired by a relation, which, admittedly, we have not defined very well. I would argue that the relation is a verb-like (action or pseudo-action, as in existence) word or phrase. For example, the relation "greater than" may be finely elucidated in NBG set theory, or it may be left as a primitive concept.