Number - the brutal facts

 
 
Reply Sat 16 Jul, 2005 03:26 pm
A number is not a number if it falls outside a mathematical application. Numbers are not transferable between applications. We cannot, then, strictly speak of 'a' number: numbers require an application, and the application will state a specific number, not any number.

Numbers are never equal to each other. When we employ equations we use the equal sign to show us various forms or patterns of a single application. This is not equality. 5 = 5 does not tell us that five is equal to five. It shows us a view of a single application. An equation is, therefore, a pattern recognition tool. An equation will not give results; it will not tell us when to finish. Instead, we use the equals sign to distinguish between patterns made from a mathematical application.

 
ebrown p
Reply Sat 16 Jul, 2005 04:06 pm
I disagree. There is truth to numbers outside of any application.

The identity 5=5 is not that interesting... let's make it a bit more interesting with the equation 3 + 2 = 5.

This equation is true regardless of the application. I can get this result in any number of ways... in any number of applications. I will come up with the same answer no matter what the application. This implies that there is some underlying mathematical truth.

1) I can start at 3 on a number line and move two units in the positive direction.

2) I can buy 2 apples when I already have 3 in my possession.

3) I can send two quanta of energy to an electron cloud.

In all of these very different "applications" the math is the same, the number is the same and the operation is the same.
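The sameness across applications can even be checked mechanically. A toy Python sketch of the three applications listed above (the representations are hypothetical stand-ins, chosen only for illustration):

```python
# Three "applications" of the same sum, 3 + 2.

# 1) Number line: start at 3 and move two units in the positive direction.
position = 3
for _ in range(2):
    position += 1

# 2) Apples: combine a bag of 3 apples with a bag of 2 and count.
apples = ["apple"] * 3 + ["apple"] * 2

# 3) Energy quanta: raise a level-3 state by two quanta.
level = 3 + 2

# Very different applications, same number, same operation, same answer.
assert position == len(apples) == level == 5
```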

The number 5 has meaning as its own mathematical entity. It is useful when it is given context in an "application", but this is unnecessary. Everything you can do to a five as part of an application, you can do to it as an "abstract" number.

Likewise the operation "+" has a mathematical meaning outside of any meaning thrust on it by an external context. 3 + 2 works the same in any application you give it. And, if you don't mess with the operation (i.e. I am talking about the plus operation itself, not the symbol we use for it), the result will be the same.

The equals sign is a bit tricky, but only because there are several different mathematical operations that we have discovered, and stupidly we named them all "equals" (and use the same sign).

But each number, and each of the basic mathematical operations stand alone. They are useful in applications, but the applications don't matter.

Mathematics stands alone as intrinsic truth.

Thalion
Reply Sat 16 Jul, 2005 06:28 pm
I'm going to have to disagree too. Math is both purely theoretical and practical. I personally love the theory and understanding how it works, while I dislike doing problems. I'm one of those few people who actually enjoy proofs (of general formulas, not the "proofs" they call geometry problems, which are only particular cases that apply formulas/rules).

John Jones
Reply Sun 17 Jul, 2005 03:07 am
Thalion wrote:
I'm going to have to disagree too. Math is both purely theoretical and practical. I personally love the theory and understanding how it works, while I dislike doing problems. I'm one of those few people who actually enjoy proofs (of general formulas, not the "proofs" they call geometry problems, which are only particular cases that apply formulas/rules).


Your English is a little hard to follow. I could not understand your last sentence, and I missed your general point altogether.

John Jones
Reply Sun 17 Jul, 2005 03:21 am
ebrown_p wrote:
I disagree. There is truth to numbers outside of any application.

The identity 5=5 is not that interesting... let's make it a bit more interesting with the equation 3 + 2 = 5.

This equation is true regardless of the application. I can get this result in any number of ways... in any number of applications. I will come up with the same answer no matter what the application. This implies that there is some underlying mathematical truth.

1) I can start at 3 on a number line and move two units in the positive direction.

2) I can buy 2 apples when I already have 3 in my possession.

3) I can send two quanta of energy to an electron cloud.

In all of these very different "applications" the math is the same, the number is the same and the operation is the same.

The number 5 has meaning as its own mathematical entity. It is useful when it is given context in an "application", but this is unnecessary. Everything you can do to a five as part of an application, you can do to it as an "abstract" number.

Likewise the operation "+" has a mathematical meaning outside of any meaning thrust on it by an external context. 3 + 2 works the same in any application you give it. And, if you don't mess with the operation (i.e. I am talking about the plus operation itself, not the symbol we use for it), the result will be the same.

The equals sign is a bit tricky, but only because there are several different mathematical operations that we have discovered, and stupidly we named them all "equals" (and use the same sign).

But each number, and each of the basic mathematical operations stand alone. They are useful in applications, but the applications don't matter.

Mathematics stands alone as intrinsic truth.


If a number is an integral part of an application, made when the application is applied, then we need not imagine that a number is pulled out for service from a pre-conceived group of numbers. Besides which, we have no means of organizing a pre-conceived group of numbers without executing another application.

We must consider that the 'pre-conceived group' of numbers is not a group of numbers, but a group of numerals. I would define these as signs without an application. As soon as we put them into an application then they become numbers, but not before.

Finally, even if we have a pre-conceived group of numbers, it is difficult to see how this can allow us to claim that mathematics 'stands alone as intrinsic truth'. For this claim to be made, numbers must be a priori and discoverable.

Cyracuz
Reply Sun 17 Jul, 2005 05:18 am
Numbers are part of a system devised entirely by humans, for humans. Therefore every aspect of it is entirely within our grasp and understanding. The truth of it does not rely on the system itself. It is faith that sustains it. We believe it to be true, nothing more. So numbers are digits that we have given names and value. They signify something, thus forming a rich language for exploring further.

ebrown p
Reply Sun 17 Jul, 2005 06:58 am
Quote:

Numbers are part of a system devised entirely by humans, for humans.


I think this is clearly untrue.

JJ makes the correct distinction between numerals (the symbols) and numbers (the idea behind the symbols). The numeral 5, the "khamsa" (an Arabic numeral that looks like a circle and means five), and the Roman numeral "V" are all different numerals. But they all refer to the same number, which was discovered independently by countless very different cultures.

Quote:
We must consider that the 'pre-conceived group' of numbers is not a group of numbers, but a group of numerals. I would define these as signs without an application. As soon as we put them into an application then they become numbers, but not before.
Finally, even if we have a pre-conceived group of numbers, it is difficult to see how this can allow us to claim that mathematics 'stands alone as intrinsic truth'. For this claim to be made, numbers must be a priori and discoverable.


You have this backwards.

Numbers are "a priori and discoverable". Intelligent beings invent "numerals" in order to express the numbers they discover, and humans have invented many different types of numerals. Countless cultures have discovered the number 5 and invented different ways to represent it.

I can also give you a number that has no practical purpose in any application. Let's take 2^1721443678341-63.

This string of characters is a "numeral" in that it is a way to express a number that is understandable by educated 21st century humans. But this represents a number, an idea that could be expressed by intelligent beings in any set of numerals they have devised.

This number is not part of an application... I believe it is a number bigger than the number of atoms in the Universe. But this does not make it any less of a number.

The fact is, I can do anything with this number that I can do with the number 5. I can add it to another number, I can subtract it, I can check whether it is equal to another number... I can even find its prime factors (which turns out to be pretty important).

Most importantly, there are many ways to express these numbers, but regardless of what system of numerals any strange alien culture might invent, the mathematical answers will be the same.

These numbers are the objective ideas expressed by the numerals.
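The numeral/number distinction here is easy to see in any language with arbitrary-precision integers. A minimal Python sketch (using 2**607 - 1 as a stand-in, since the number quoted in the post is far too large to write out in full): the numerals differ, but the number behind them, and any arithmetic done on it, does not.

```python
# One number, several numerals.
# 2**607 - 1 is a stand-in: large enough to have no everyday
# "application", small enough to hold in memory.
n = 2**607 - 1

decimal_numeral = str(n)   # base-10 digit string
binary_numeral = bin(n)    # base-2 digit string
hex_numeral = hex(n)       # base-16 digit string

# Reading any of the numerals back yields the same number:
assert int(decimal_numeral) == n
assert int(binary_numeral, 2) == n
assert int(hex_numeral, 16) == n

# And arithmetic gives the same answer no matter which numeral
# we started from:
assert int(binary_numeral, 2) + 5 == int(hex_numeral, 16) + 5
```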

BoGoWo
Reply Sun 17 Jul, 2005 08:56 am
here's an application for you:

how many numbers can dance on the head of a pin?

[and other important discriminating criteria]

Thalion
Reply Sun 17 Jul, 2005 09:43 am
I'm saying that mathematics exists both in the realm of pure reason as a formal system, and also isomorphically in the "real world", where 5 meters is the 2-dimensional "distance" of 5 times the length of a meterstick, although according to the pure math it's only a number and has no "meaning." For example, the Pythagorean theorem describes a triangle but is not itself a triangle. It is only a specific relation. In physics, particles can be seen as mathematical probabilities that are isomorphically "real" when they are observed, but are still only probabilities that fall within pure reason, described by the Schrödinger equation.

John Jones
Reply Sun 17 Jul, 2005 12:03 pm
Cyracuz wrote:
Numbers are part of a system devised entirely by humans, for humans. Therefore every aspect of it is entirely within our grasp and understanding. The truth of it does not rely on the system itself. It is faith that sustains it. We believe it to be true, nothing more. So numbers are digits that we have given names and value. They signify something, thus forming a rich language for exploring further.


If mathematics is a set of rules, then faith would not be necessary for it to work but following the rules would.

Thalion
Reply Sun 17 Jul, 2005 12:31 pm
Yes, but the rules do not prove themselves. As Gödel's theorem indicates, no formal system can prove itself. We accept certain axioms and the logic we use on them, but this must always be done on the faith that it is correct.

John Jones
Reply Sun 17 Jul, 2005 03:06 pm
ebrown_p wrote:
Quote:

Numbers are part of a system devised entirely by humans, for humans.


I think this is clearly untrue.

JJ makes the correct distinction between numerals (the symbols) and numbers (the idea behind the symbols). The numeral 5, the "khamsa" (an Arabic numeral that looks like a circle and means five), and the Roman numeral "V" are all different numerals. But they all refer to the same number, which was discovered independently by countless very different cultures.

Quote:
We must consider that the 'pre-conceived group' of numbers is not a group of numbers, but a group of numerals. I would define these as signs without an application. As soon as we put them into an application then they become numbers, but not before.
Finally, even if we have a pre-conceived group of numbers, it is difficult to see how this can allow us to claim that mathematics 'stands alone as intrinsic truth'. For this claim to be made, numbers must be a priori and discoverable.


You have this backwards.

Numbers are "a priori and discoverable". Intelligent beings invent "numerals" in order to express the numbers they discover, and humans have invented many different types of numerals. Countless cultures have discovered the number 5 and invented different ways to represent it.

I can also give you a number that has no practical purpose in any application. Let's take 2^1721443678341-63.

This string of characters is a "numeral" in that it is a way to express a number that is understandable by educated 21st century humans. But this represents a number, an idea that could be expressed by intelligent beings in any set of numerals they have devised.

This number is not part of an application... I believe it is a number bigger than the number of atoms in the Universe. But this does not make it any less of a number.

The fact is, I can do anything with this number that I can do with the number 5. I can add it to another number, I can subtract it, I can check whether it is equal to another number... I can even find its prime factors (which turns out to be pretty important).

Most importantly, there are many ways to express these numbers, but regardless of what system of numerals any strange alien culture might invent, the mathematical answers will be the same.

These numbers are the objective ideas expressed by the numerals.


I must actually count the number of atoms in the universe, either physically or by calculation and extrapolation. So the signs that present your number are already in an application. For one thing, the signs are ordered in a sequence.

But more importantly, I don't find that number; I make it. I can count the atoms as far as I want to and regard the rest as a heap. If you now argue 'No, count the atoms as far as you can go, and that is your number', then I would say that that is not a number. The phrase 'as far as you can go' is not a method for making, or even defining, a number.

John Jones
Reply Sun 17 Jul, 2005 04:02 pm
Thalion wrote:
I'm saying that mathematics exists both in the realm of pure reason as a formal system, and also isomorphically in the "real world", where 5 meters is the 2-dimensional "distance" of 5 times the length of a meterstick, although according to the pure math it's only a number and has no "meaning." For example, the Pythagorean theorem describes a triangle but is not itself a triangle. It is only a specific relation. In physics, particles can be seen as mathematical probabilities that are isomorphically "real" when they are observed, but are still only probabilities that fall within pure reason, described by the Schrödinger equation.


I think the situation is this: mathematics employs signs, and it has rules for using these signs. There is no property or concept of the 'real world' that mathematics reveals through its signs. All of the metaphysical ideas such as velocity, dimensions, etc. are merely mapped to the signs (numerals) of mathematics. Mapping does not signify a relationship, so mathematics can tell us nothing about 'our' world.

But now the structure of mathematics itself dissolves away, for the way in which the signs are arranged in mathematics has nothing of mathematics about it. For example, the sequence of numbers is based on reading from left to right. Another example: the uncalculated smaller digits of pi are no less significant to it than the larger calculated digits (3). Smallness and largeness are empirically based, yet are considered significant in mathematics if mathematics is said to present our world.

John Jones
Reply Sun 17 Jul, 2005 04:07 pm
Thalion wrote:
Yes, but the rules do not prove themselves. As Gödel's theorem indicates, no formal system can prove itself. We accept certain axioms and the logic we use on them, but this must always be done on the faith that it is correct.


There is something not right with 'rules do not prove themselves'. What could possibly count as proof for a rule? Only that we follow it. If a situation arises where we cannot follow the rule, that is not the same as saying that the rule does not prove itself. Wasn't Gödel's focus misplaced?

Thalion
Reply Sun 17 Jul, 2005 04:30 pm
I don't understand what you're trying to say at all.

"Mapping does not signify a relationship" - So it's done completely indiscriminately?

"Mathematics can tell us nothing about 'our' world." - It just helps us create new technology, build things, and understand physics, the study of how our universe operates?

"if mathematics is said to present our world" - You just said that it didn't.

"What could possibly count as proof for a rule?" - A derivation from the axioms.

"Only that we follow it." - So anything we follow is a rule? What?

John Jones
Reply Sun 17 Jul, 2005 04:45 pm
Thalion wrote:
I don't understand what you're trying to say at all.

"Mapping does not signify a relationship" - So it's done completely indiscriminately?

"Mathematics can tell us nothing about 'our' world." - It just helps us create new technology, build things, and understand physics, the study of how our universe operates?

"if mathematics is said to present our world" - You just said that it didn't.

"What could possibly count as proof for a rule?" - A derivation from the axioms.

"Only that we follow it." - So anything we follow is a rule? What?


Mapping is always arbitrary. We arbitrarily assign the concept 'mass' for example, to a set of signs used in mathematics. Mathematics cannot suggest concepts; it only regurgitates what we put into it.

Yes, mathematics tells us nothing essentially new about our world, but it may emphasise a different view of what we already know.

'If mathematics is said to present our world', which it does for those who claim that mathematics deals in intrinsic truths, a view I do not hold.

All that is required of me in a mathematical pursuit is that I follow the rules. If I have a number of rules and I find that I cannot follow one rule without breaking another, then I can make a new rule or drop one. There is no arena for 'proof' here.

If I apply a rule I am said to 'follow the rule'.

Thalion
Reply Sun 17 Jul, 2005 05:31 pm
Of course math tells us new things about our world. The first example that comes to mind is how the Maxwell equations proved that light moves at a constant speed, a completely revolutionary concept that contradicted the physics up to that point. The same thing occurred with the Uncertainty Principle.
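For reference, the constancy in question drops straight out of Maxwell's equations: in vacuum they combine into a wave equation whose propagation speed is fixed by two measured constants, before any particular "application". In standard notation:

```latex
\nabla^2 \mathbf{E} \;=\; \mu_0 \varepsilon_0 \, \frac{\partial^2 \mathbf{E}}{\partial t^2},
\qquad
c \;=\; \frac{1}{\sqrt{\mu_0 \varepsilon_0}} \;\approx\; 3 \times 10^{8}\ \mathrm{m/s}
```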

Thalion
Reply Sun 17 Jul, 2005 05:49 pm
As Immanuel Kant believed, the universe is logical because it has to be.

satt fs
Reply Sun 17 Jul, 2005 07:54 pm
There must be many would-be theorems in math which have not been proven, or not even formulated yet. The system of mathematics tells us new things as those new theorems are found and proven.

John Jones
Reply Mon 18 Jul, 2005 04:57 am
satt_fs wrote:
There must be many would-be theorems in math which have not been proven, or not even formulated yet. The system of mathematics tells us new things as those new theorems are found and proven.


To say that there are new theorems in mathematics is like saying that because you have a pile of bricks, you have a house.