
Tuesday, February 24, 2009

Hosting more than a .NET remoting singleton server in the same process

I needed to host three different singleton remoting objects in the same process.
This post explains how I solved the problem.

I have a Windows application exposing three singleton objects via remoting; let's call them "S_a", "S_b" and "S_Main".

"S_a" and "S_b" are instantiated by remote client applications.
"S_Main" object is instantiated by "S_a" and "S_b".
"S_a", "S_b" and "S_Main" resides on the same process, as I told before.
"S_a" and "S_b" are WellKnownServiceTypes.
"S_Main" is to be both a WellKnownServiceType (this is inside the server-application code where a port is open in listening mode) and a WellKnownClientType when configuring "S_a" and "S_b" for its instantiation (i.e inside "S_a" and "S_b" class definition there will be a point in which we say that "S_Main" is not a local object and is to be instantiated via remoting at a certain URL).

Trying to configure remoting this way leads to the following exception:
"Remoting configuration failed with the exception System.Runtime.Remoting.RemotingException"

"Attempt to redirect activation for type" [...]
"This is not allowed since either a well-known service type has already
been registered with that type or that type has been registered as an
activated service type"

The solution is to host the three singletons in three different appdomains within the same Windows process.

The first thing to set up is three assemblies, each exposing a method that configures the server side of remoting for a single singleton. The classes inside these assemblies must inherit MarshalByRefObject.
This way, instantiating these three classes and calling the method on each of them opens three ports in listening mode.
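A minimal sketch of one of these activation classes could look like this (the class name, port and URIs are illustrative assumptions, not my actual code):

```vbnet
Imports System.Runtime.Remoting
Imports System.Runtime.Remoting.Channels
Imports System.Runtime.Remoting.Channels.Tcp

' Compiled into its own assembly; one such class per singleton.
Public Class S_A_Activator
    Inherits MarshalByRefObject

    ' Called from the main appdomain through a proxy; everything it
    ' registers here belongs to the appdomain it was created in.
    Public Sub ActivateServer()
        ' Open a port in listening mode
        ChannelServices.RegisterChannel(New TcpChannel(9001), False)
        ' Expose S_A as a wellknown singleton
        RemotingConfiguration.RegisterWellKnownServiceType( _
            GetType(S_A), "S_A.rem", WellKnownObjectMode.Singleton)
        ' In this appdomain, S_Main is only a client type resolved at a URL
        RemotingConfiguration.RegisterWellKnownClientType( _
            GetType(S_Main), "tcp://localhost:9000/S_Main.rem")
    End Sub
End Class
```

Because each activator runs in its own appdomain, the service-type and client-type registrations for "S_Main" never meet in the same appdomain.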

Each assembly is loaded into a different appdomain. This way, inside "S_A_AppDomain" (and likewise inside "S_B_AppDomain") the "S_Main" object is configured as a WellKnownClientType, while in "S_Main_AppDomain" it is configured as a WellKnownServiceType.
AppDomains provide proper isolation.

An Appdomain is created in this way:

Dim domaininfo As New AppDomainSetup
domaininfo.ApplicationBase = AppDomain.CurrentDomain.BaseDirectory

dim S_A_Domain As AppDomain = _
AppDomain.CreateDomain("S_A_RemotingServer", Nothing, domaininfo)
Here is the code used to load an assembly (e.g. "Assembly.dll") into the appdomain, instantiate the class (e.g. "namespace.classname", which will reside in the newly created appdomain) and invoke the method that configures remoting server-side (e.g. "ActivateServer"):

'(requires Imports System.Reflection)
Dim ann As [Assembly] = [Assembly].LoadFrom("Assembly.dll")

'Load the assembly into the new appdomain
Dim a As [Assembly] = S_A_Domain.Load(ann.FullName)

'Create an instance (in the new appdomain) and get back a proxy to it
Dim obj As Object = S_A_Domain.CreateInstanceAndUnwrap(ann.FullName, "namespace.classname")

'Get the type to use
Dim myType As Type = a.GetType("namespace.classname")

'Get the method to call
Dim mymethod As MethodInfo = myType.GetMethod("ActivateServer")

'Execute the method; the call is marshalled into the other appdomain
mymethod.Invoke(obj, Nothing)

The method is invoked on the instance of the class residing in the separate appdomain, configuring remoting there.

Events between .NET Remoting wellknown objects

Suppose we have two wellknown singleton remoting objects; let's call them S_A and S_B.
These objects reside in two different appdomains. S_B is instantiated by S_A.
S_B is the source of events, and S_A has to receive them.
Note that S_A is in turn a wellknown object itself.

After S_B is instantiated in S_A, an exception is thrown:

Wellknown objects cannot marshal themselves in their constructor, or perform
any action that would cause themselves to be marshalled (such as passing the
this pointer as a parameter to a remote method)

The error happens at runtime, either when a handler is added for an S_B event or as soon as S_B is instantiated.

To understand why this happens, a brief review of how events work is useful.
Events are simple to use. What happens behind the curtains is that the sender of the event (S_B) calls a function on the target (the object handling the event, S_A).

Since we are talking about wellknown server objects, when S_B sends an event to S_A, the remoting server (S_B) becomes a client of S_A (in fact, S_B is calling a function on S_A).

The exception is thrown because the wellknown object S_B is a SERVER for S_A, so remoting does not allow it to also act as a CLIENT of S_A. For objects that are not wellknown, this is allowed.

The solution is to create a new class (C_C). Inside this class, instantiate the wellknown S_B and receive its events there. S_A will then instantiate C_C and receive the events from it. C_C can become a client of S_A without problems, since it is not a wellknown object.
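A minimal sketch of this intermediary pattern could look like the following (the event names and signatures are illustrative assumptions; this shows the pattern, not my actual code):

```vbnet
' C_C is a MarshalByRefObject but is NOT registered as a wellknown
' service type, so remoting lets it freely act as a client.
Public Class C_C
    Inherits MarshalByRefObject

    ' Re-raised locally so that S_A can subscribe to C_C instead of S_B
    Public Event Forwarded(ByVal message As String)

    Private WithEvents mSource As S_B

    Public Sub New()
        ' S_B is configured as a WellKnownClientType in this appdomain,
        ' so "New" is transparently redirected to the remote singleton
        mSource = New S_B()
    End Sub

    ' Receives the event from S_B and forwards it on to S_A
    Private Sub OnSourceEvent(ByVal message As String) Handles mSource.SomeEvent
        RaiseEvent Forwarded(message)
    End Sub
End Class
```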

Wednesday, February 4, 2009

History of Computer with rare photos

Computer Science Lab

An Illustrated History of Computers

Part 1


John Kopplin © 2002

The first computers were people! That is, electronic computers
(and the earlier mechanical computers) were given this name because they
performed the work that had previously been assigned to people.
"Computer" was originally a job title: it was used to describe
those human beings (predominantly women) whose job it was to perform the
repetitive calculations required to
compute such things as navigational tables, tide charts, and planetary
positions for astronomical almanacs. Imagine you had a job where hour after
hour, day after day, you were to do nothing but compute multiplications.
Boredom would quickly set in, leading to carelessness, leading to mistakes. And
even on your best days you wouldn't be producing answers very fast. Therefore,
inventors have been searching for hundreds of years for a way to mechanize
(that is, find a mechanism that can perform) this task.

This picture shows what were known as "counting tables" [photo courtesy IBM]

A typical computer operation back when computers were people.

The abacus was an early aid for mathematical computations. Its only
value is that it aids the memory of the human performing the calculation. A skilled
abacus operator can work on addition and subtraction problems at the speed of a
person equipped with a hand calculator (multiplication and division are
slower). The abacus is often wrongly attributed to China. In fact, the oldest
surviving abacus was used in 300 B.C. by the Babylonians. The abacus is still
in use today, principally in the Far East. A modern abacus consists of rings that
slide over rods, but the older one pictured below dates from the time when
pebbles were used for counting (the word "calculus" comes from the
Latin word for pebble).

A very old abacus

A more modern abacus. Note how the abacus is really just a representation
of the human fingers: the 5 lower rings on each rod represent the 5 fingers
and the 2 upper rings represent the 2 hands.

In 1617 an eccentric (some say mad) Scotsman named John Napier invented
logarithms, which are a technology that allows multiplication
to be performed via addition. The magic ingredient is the logarithm of each
operand, which was originally obtained from a printed table. But Napier also
invented an alternative to tables, where the logarithm values were carved on
ivory sticks which are now called Napier's Bones.

An original set of Napier's Bones [photo courtesy IBM]

A more modern set of Napier's Bones

Napier's invention led directly to the slide rule, first built
in England in 1632 and still in use in the 1960's by the NASA engineers of
the Mercury, Gemini, and Apollo programs which landed men on the moon.

A slide rule

Leonardo da Vinci (1452-1519) made drawings of gear-driven calculating machines
but apparently never built any.

A Leonardo da Vinci drawing showing gears arranged for computing

The first gear-driven calculating machine to actually be built was
probably the calculating clock, so named by its inventor, the
German professor Wilhelm Schickard in 1623. This device got little publicity
because Schickard died soon afterward in the bubonic plague.

Schickard's Calculating Clock

In 1642 Blaise Pascal, at age 19, invented the Pascaline as an
aid for his father who was a tax collector. Pascal built 50 of this gear-driven
one-function calculator (it could only add) but couldn't sell many because of their
exorbitant cost and because they really weren't that accurate (at that time it
was not possible to fabricate gears with the required precision). Up until the
present age when car dashboards went digital, the odometer portion of a car's
speedometer used the very same mechanism as the Pascaline to increment the next
wheel after each full revolution of the prior wheel. Pascal was a child
prodigy. At the age of 12, he was discovered doing his version of Euclid's
thirty-second proposition on the kitchen floor. Pascal went on to invent
probability theory, the hydraulic press, and the syringe. Shown below is an
8 digit version of the Pascaline, and two views of a 6 digit version:

Pascal's Pascaline [photo © 2002 IEEE]

A 6 digit model for those who couldn't afford the 8 digit model

A Pascaline opened up so you can observe the gears and
cylinders which rotated to display the numerical result

Click on the "Next" hyperlink below to read about the punched card system
that was developed for looms and later applied to the U.S. census
and then to computers...

Computer Science Lab Next

History of Computer

8-Bit Operating Systems (Armory.Com)

Classic 8-bit Computers


A Brief History of Computers and Networks (GoldenInk.Com)

A Chronology of Computer History (Cyberstreet.Com)

A Chronology of Personal Computers (kpolsson -

A Few Quotes from Silicon Valley History

A Journey Through the History of Information Technology

Photos: History of Computing Information

Artificial Intelligence, History of

A Timeline of Computer and Internet History


The Abacus  ( Good pictures, overview, brief history, definition, counting boards, bibliography, and links for further study)



Howard Hathaway Aiken and the Mark I

Howard Aiken: Makin' A Computer Wonder

Howard Hathaway Aiken, Computer Pioneer

Howard Aiken's Harvard Mark I

ALTAIR Computer

Altair 8800 (Wikipedia)

Introduction: The Revolution Begins (David Bunnell)

The Altair 8800 and Ed Roberts

Altair History (

Early History of the Personal Computer (by Thayer Watkins at San Jose State University)

Brief History of the Altair (

The Virtual Altair Museum

Ed Roberts Interview (

What Good is a Computer Without Software? (

How the Personal Computer Was Born (

Looking Back on Nearly Three Decades of Personal Computing (by Forrest M. Mims III)

Ed Roberts and the MITS Altair

How the Altair Began (by Stan Veit, the Computer Editor of Popular Electronics Magazine)

Chronology of Personal Computers 1975

Open Letter to Hobbyists from Bill Gates, February 3, 1976

PDF copy of An Open Letter To Hobbyists

Ramblings From Ed Roberts, March 1976



The Analytical Engine, Table of Contents

Analytical Engine/Babbage


Apple Computer history (

Steve Jobs Information Page

Making the Macintosh (

The Apple Museum (

Books on Apple History

MacPicks (Mac News, Reviews, E-zines)


ARPA, Forerunner of ARPANET and the Internet

ARPA: The Early Days of ARPA, Forerunner of the Internet


The John Vincent Atanasoff Virtual Archive

Iowa State University Department of Computer Science, birthplace of the electronic digital computer.

Atanasoff Biography (Hien Chris Do)

Washington Post Obituary

Secret of a Genius: Drive Fast and Don't Look Back
(This is an article about John Atanasoff.)

Inventors of the Modern Computer
This article from the Mining Company discusses the Atanasoff-Berry Computer,
John Vincent Atanasoff, and Clifford Berry.


Reconstruction of the Atanasoff-Berry Computer
(Article from Ames Lab)

ABC Public Showings: November 1996 and October 1997



Pioneers: Charles Babbage
Charles Babbage Biographical Notes

The Babbage Pages: Biography

Babbage's Analytical Engine

Charles Babbage Institute (U. of Minn.)

Charles Babbage - Bibliographical notes

Picture of Charles Babbage (


Biography of John Backus  Leader of the team that
developed Fortran programming language

John Backus, Recipient of Alan M. Turing Award



Gordon Bell Microsoft Bay Area Research Center; worked on design of PDP-6 at DEC, an antecedent of the PDP-10, one of the first mPs and the first time-sharing computer.

MyLifeBits Project: Vannevar Bush's 1945 Memex Vision Fulfilled


George Boole - Genius of Lincoln (Roger Parsons)

The Calculus of Logic by George Boole (

Papers by George Boole ( - University Library County Cork, Ireland)

George Boole Invents Boolean Algebra

George Boole, Biographical Notes (

BRICKLIN, DANIEL (Co-Creator of VisiCalc)

Daniel Bricklin, First Spreadsheet,VisiCalc

Daniel Bricklin's Web Page

Daniel Bricklin Biography, Long Form


BUNNELL, DAVID (Altair, MITS, PC Mag., PC World, MacWorld)

The Third Culture: David Bunnell (

Prosumer Media

Upside Tech Magazine's Downside (S.F. Chronicle)



As We May Think (by permission of
Article of 1945 by Vannevar Bush

Vannevar Bush Biographical Notes (

Vannevar Bush (


Ada Byron (, contributed by Dr. Betty Toole)

Ada's Notes Contributed by Dr. Betty Toole, Yale.Edu

The Birth of the Computer Revolution

Pictures of Lady Augusta Ada Byron, Countess of Lovelace


Calculator Reference, Research Page

Chronology of Computer History (

Chronology of Personal Computers (Ken Polsson)

Computing Science, History of

Computers: From the Past to the Present

C++ Programming Language

History of the C++ Programming Language (

History of C++ (


Timeline History of Calculators


CERF, VINTON G. Co-inventor of TCP/IP Protocol for the Internet

ICANN Biographical Notes about Vinton G. Cerf

Vint's Personal Home Page Read about the first 35 years of the "Internet"

How the Internet Came to Be as told by Vinton Cerf

Vinton G. Cerf (

Vinton G. Cerf (Wikipedia page)

Vinton Cerf on the future of e-mail (ComputerWorld: 2001 article)

From Inventing the Enterprise: Vinton G. Cerf (


CODD, EDGAR F. Database Pioneer, Key Theorist of Databases

Edgar Codd, Database Pioneer, Dead at 79 (

Edgar Codd, Key Theorist of Databases Dies at 79 in Florida


The Colossus Rebuild Project (

The Colossus Rebuild Project (

Lorenz Ciphers and the Colossus

1940-1944 The Colossus

Colossus Electronic Programmable (to a limited extent) Computer

Colossus Computer


Computer Conservation Society

Computer Conservation Society


DIJKSTRA, EDSGER WYBE Mathematician, Computer Scientist,
  Shortest Path Algorithm, Computer Programmer, Educator

Biographical Notes: Edsger W. Dijkstra (

Edsger Dijkstra, The shortest-path algorithm

Dijkstra's Algorithm (

E. W. Dijkstra Archive (

Dijkstra's Algorithm (

GoTo Statement Considered Harmful by Edsger Dijkstra, 1968;
   Article began the structured programming movement.

How Do We Tell Truths that Might Hurt? by Edsger Dijkstra, 1975:
  On old problems in programming

Edsger Dijkstra on universities

Edsger Dijkstra:RIP (death notice)



Mona Lisa, Digital Image from 1965

A Brief History of Digital Imaging



J. Presper Eckert Interview

The Eckert/ENIAC Collection



AUGMENTING HUMAN INTELLECT: A Conceptual Framework A summary report prepared by Douglas C. Engelbart for the Air Force Office of Scientific Research, October 1962.

Original Computer Mouse Patent

Computer Mouse Demo (video info.)

Facts About the Invention of the Computer Mouse

The Computer Mouse

The Mousesite - a resource for exploring the history of human computer interaction beginning with the pioneering work of Douglas Engelbart and his colleagues at Stanford Research Institute in the 1960s.


The ENIAC Story (

A Report on the ENIAC (

ENIAC - The Army Sponsored Revolution (

John W. Mauchly and the Development of the ENIAC Computer (

The Eniac (

ENIAC Computer Invention (

Jean Bartik, The First ENIAC Programmer



Networking With Ethernet (



Fibonacci Numbers

Fibonacci Numbers Definition

Fibonacci Numbers and the Golden Section

-F, G-

First, Second, Third, and Fourth Generation Computers
Features of each generation are discussed.


Google.Com's Computer History Links

Great Microprocessors of the Past and Present

Greatest Engineering Achievements of the 20th Century


An Open Letter To Hobbyists, by Bill Gates

Bill Gate's Web Page at Microsoft

Bill Gates: Biographical notes and picture

The "Unofficial" Bill Gates page

Focus Magazine's Interview with Bill Gates

Bill and Melinda Gates Foundation

An Open Letter to Hobbyists - by Bill Gates, 1976 (

William H. Gates III Before Microsoft

Bill Gates Wealth Index


Roanoke Times Article about Jack Good, a member of the team that broke the ENIGMA code during WWII

Irving John (Jack) Good Cryptologist, statistician


Al Gore's Support of the Internet
    by Robert E. Kahn and Vinton G. Cerf

Gore to Get Lifetime Award for Internet

Gore Never Did Claim That He Invented the Internet

GOSLING, JAMES wrote the Java programming language
From Inventing the Enterprise: James Gosling (

James Gosling, On the Java Road

Java - James Gosling (


Graphical User Interface

Graphical User Interface

History of the Graphical User Interface

Graphical User Interface, GUI (

Mac OS X

Aqua Human Interface Guidelines (

GUI Gallery

The History of Graphical User Interfaces


Historic Computer Images (

History of Computers Article by

History of Computer Viruses and Attacks (

History of Computing (
A compiled directory of categorical links.

History of Computing: Virtual Museum of Computing (VMoC)
This site is a virtual museum which includes an eclectic collection of WWW hyperlinks connected with the history of computing and on-line computer-based exhibits.

History of the Internet: Roads and Crossroads of Internet History (Gregory Gromov's site)

History of the Web Beginning at CERN


Grace Hopper

The Wit and Wisdom of Grace Hopper

Inventing the Enterprise: Grace Murray Hopper (


Hewlett-Packard, History -- Founding Fathers

The Making of Hewlett-Packard

Dave Packard's 11 Simple Rules


IEEE Annals of the History of Computing

Internet History: Roads and Crossroads of Internet History (Gregory Gromov)


IBM Vintage Personal Computers


IEEE/ISTO Industry Standards and Technology Organization

IEEE Computer Society Home Portal

IEEE Computer Society History of Computing

IEEE Annals of Computer History

IEEE 802.11b - Wireless Ethernet Article by Al Petrick, Vice Chairman of the IEEE 802.11 Standards Committee

Internet Explorer

The History of Internet Explorer



Jacquard's Web: How a Hand Loom Led to the Birth of the Information Age
 by James Essinger. Read about this book at

Jacquard Loom

The Jacquard Loom (

Jacquard's Punched Card

Industrial Revolution: Timeline of Textile Machinery

Biographical Note on Joseph-Marie Jacquard

Biography of Joseph-Marie Jacquard


Steve Paul Jobs Bio.


KAHN, ROBERT E (co-inventor of TCP/IP Technology for Internet)

Robert E. Kahn Biographical Notes

Robert E. Kahn Awarded National Medal of Technology




Leonard Kleinrock Biography: The Birth of the Internet (
Leonard Kleinrock was the inventor of the Internet technology known as packet-switching.

Leonard Kleinrock's home page



Gottfried Wilhelm Leibnitz (1646-1716)


Man-Computer Symbiosis by J.C.R. Licklider (PDF file,

The Computer as a Communication Device (PDF file,

by J.C.R. Licklider and Robert Taylor (of ARPA)

JCR Licklider (

J.C.R. Licklider Biography (

J.C.R. Licklider (

Six degrees of J.C.R. Licklider traces computer evolution (St.Louis Post Dispatch/

J.C.R. Licklider in Memoriam (

The Dream Machine: J.C.R. Licklider and the Revolution That Made Computing Personal (book, by M. Mitchell Waldrop -- see book site)


Mind Machine Museum (



John W. Mauchly and the Development of the ENIAC Computer


From Inventing the Enterprise: Robert M. Metcalf (

Ethernet (

Ethernet PDF Paper: IEEE 802.3



(See also: Engelbart, Douglas)

The Mouse Site



Project Xanadu
The original hypertext project...

Ted Nelson home page


Notes About Kristen Nygaard





Past Notable Women of Computing and Mathematics

PC Computers, Pre-IBM

Pioneers (Wired Magazine's List of Pioneers of Computing)

Pioneers of Computing


Blaise Pascal Quotes to Inspire You

Blaise Pascal Biography

PDP Computers

What is a PDP? (Ken Olson/DEC)

PDP-8 (

PDP-11 History (




Dennis Ritchie Home Page

Interview with Dennis M. Ritchie

Dennis M. Ritchie timeline

Dennis M. Ritchie: From Inventing the Enterprise


Shareware, History of

Silicon Valley History (

Smithsonian Institute's Computer History Collection


Shortest Path Algorithm

See Dijkstra, Edsger


George R. Stibitz, Computer Inventor (1904-1995)

George R. Stibitz in Inventors Hall of Fame

George R. Stibitz Curriculum Vitae

George R. Stibitz, Biography

George R. Stibitz, 1904-1995



System R (

System R led to the introduction of the SQL language


Tech. History (

The Charles Babbage Institute
This is located at the University of Minnesota
Many computer subjects including the history of computers...

The Evolution of the Computer

The History of Shareware

The IEEE Annals of the History of Computing

The Revolutionaries

The Smithsonian Computer History and Information Technology

Timeline: The History of Computers (

Well done! Events influencing the history and development of computing.

Tools for Thought (Howard Rheingold)



The Alan Turing Home Page

Alan Mathison Turing (1912-1954)

Alan Turing and COLOSSUS



Univac Memories (

Book: The Unsung Heroes of the PC Age


Virtual Museum of Computing

Viruses: History of Computer Viruses

Wearable Computing at

What was the first computer and who built it?

Wired Magazine:Browse by Person
This is a list not to be missed. Enjoy.


John Louis von Neumann (1903-1957)

John von Neumann

(Click on hardware in the background of this next reference)

von Neumann Digital Computer



Dr. An Wang (Magnetic Core Memory) (

An Wang, Hall of Fame Inventor Profile

Book: Dr. An Wang Computer Pioneer

The Doctor and His Calculators

About Wang

An Wang: The Core of the Computer Era

Wang system photos

Photo 1968 Wang Calculator

Wang 363E Calculator

Wang BASIC vs. Microsoft BASIC



Maurice Vincent Wilkes (

Maurice Vincent Wilkes, Brief Biography


Niklaus Wirth (

Niklaus Wirth was the inventor of Pascal, and other languages.


Steven Wozniak Bio. (

Official Website of the "WOZ"

Apple Computer history



Project Xanadu
The original hypertext project...

Xanadu Australia: Problem Definition

Xanadu Secrets Become Udanax Open-Source (1999)

Marc Steigler's 1989 Article Published in the February 1990 Edition of Unix Review is about Hypertext Publishing





Konrad Zuse
(Use the search engine at this next site to put in the name of Konrad Zuse, in order to navigate to this topic.)

A photograph of Konrad Zuse

Konrad Zuse's Z3 Computer

Konrad Zuse And His Computers

History of science


Science is a body of empirical, theoretical, and practical knowledge about the natural world, produced by a global community of researchers making use of scientific methods, which emphasize the observation, experimentation and explanation of real world phenomena. Given the dual status of science as objective knowledge and as a human construct, good historiography of science draws on the historical methods of both intellectual history and social history.

Tracing the exact origins of modern science is possible through the many important texts which have survived from the classical world. However, the word scientist is relatively recent—first coined by William Whewell in the 19th century. Previously, people investigating nature called themselves natural philosophers.

While empirical investigations of the natural world have been described since Ancient Greece (for example, by Thales, Aristotle, and others), and scientific methods have been employed since the Middle Ages (for example, by Ibn al-Haytham, Abū Rayhān al-Bīrūnī and Roger Bacon), the dawn of modern science is generally traced back to the early modern period, during what is known as the Scientific Revolution that took place in 16th and 17th century Europe.

Scientific methods are considered to be so fundamental to modern science that some — especially philosophers of science and practicing scientists — consider earlier inquiries into nature to be pre-scientific. Traditionally, historians of science have defined science sufficiently broadly to include those inquiries.

The Tree of Life (Disney)

The Tree of Life is a massive fourteen-story (145-foot (44 m)) tall artificial tree that has been the icon of Disney's Animal Kingdom since the park opened on April 22, 1998. Engineered from a refitted oil platform, it is located in the center of the park. Its leaves are made out of Kynar, and images of 325 animals are carved on its exterior. Inside the Tree of Life is It's Tough to be a Bug!, a 3-D film hosted by Flik from A Bug's Life. It is similar to Rafiki's tree from The Lion King, though on a much larger scale. There is a hidden Mickey on this tree.

Monday, February 2, 2009

How to Behave in the Workplace

How to Behave in the Workplace

· Men and women will commonly shake hands, especially in a business setting. Handshakes are firm. A limp handshake signifies a weak personality or character.

  • Many companies have a probationary period, typically lasting 90 days. After 90 days, the employer can decide whether the person is right for the job. Some companies offer benefits after the probationary period.
  • Confrontation is a common part of communication. At work, a supervisor may discuss a problem directly with a worker and expect the worker to do the same.
  • While confronting another person, people are polite. Confrontations are not meant to hurt feelings or provoke a fight.
  • Many companies give employee reviews or evaluations. The purpose of these evaluations is to let employees know what they are doing well and how they can improve.
  • In the workplace, people don’t usually talk about divorce, family problems, or financial problems.
  • It is not customary to ask a person how much money they or their family make. This is especially true in the workplace.
  • Men and women work as equals in all working environments. There is no division between gender in the workplace.
  • In the U.S., if a family member works for an organization, it doesn’t necessarily mean that you will get a job or have advantages over other employees.
  • Most companies and organizations have a missions statement that lists the overall values, goals, and purpose of the organization.
  • It is common for all employees to attend staff meetings. These meetings may be weekly, bi-weekly, or even monthly. At staff meetings, it is a time of sharing of information, brainstorming, or catching up on company business.
  • Some companies, depending on their size and authority structure, may have separate departmental meetings in addition to or instead of staff meetings.
  • Punctuality is very important in the workplace.
  • It is not considered disloyal to quit your job. However, it is common practice and expected that you give your employer two weeks notice before leaving. Sometimes, the employer may even want more time. By following this rule, an individual will usually leave the company or organization with good relations.
  • Though it is normal for people to switch jobs, an employer can get frustrated if an employee leaves right after they get hired or right after they’ve completed training for the job. Depending on the professional, different amounts of time are expected for an employee to work there. For example, a fast food restaurant may not expect its workers to stay any longer than a couple of months. Employers in the medical profession, however, often ask employees to commit to at least a year of work when they are hired.

Safety in the Workplace

· Workplace safety is very important in the United States. Employers and employees must follow strict safety laws established by the government.

· The U.S. has an agency called OSHA (Occupational Safety and Health Administration). The mission of this agency is to save lives, prevent injuries and protect the health of America’s workers. OSHA employees work to make sure employers and employees are following strict safety standards.

· According to OSHA guidelines, an employer is responsible for the safety of the employee. She/he must train all employees and provide a safe working environment and can be held accountable for work-related accidents and injuries.

· According to OSHA guidelines, an employee is responsible for following all rules and regulations and can be held accountable for work-related accidents and injuries if they don’t follow the rules.

· If an employee feels they are working in an unsafe working environment and the employer isn’t following OSHA guidelines, he or she can file a complaint with OSHA.

· If an accident happens in the workplace, it is standard procedure to complete an accident report form. If the accident is severe, there may even be an investigation.

· Many companies post safety procedures around the workplace. If there is a rule posted, you can assume that it must be followed.

The Seven Habits of Highly Effective People

The Seven Habits of Highly Effective People

Learning to Manage and Live Life in an Effective and People-Focused Way
By Stephen R. Covey

This is one of the best-known leadership books of recent years, and the key phrase in the title is "people-focused". Rather than tackling specific problems or making external changes to processes, systems and so on, Covey's approach helps you focus on developing yourself personally and your relationships with others.

First published in 1989, the Seven Habits of Highly Effective People explains a useful set of guiding principles that help you change personally as well as professionally, and so become more effective.

Covey describes three distinct stages of personal growth that we move through as we develop these habits:

  • Dependence: This is where we start, dependent on other people. And without personal development, we would stay stuck at this stage.

  • Independence: Through personal development, we become more independent and take responsibility for our actions. Still, however, we are not fully effective.

  • Interdependence: At this stage, we develop the understanding that, although we are self-reliant, we still need other people to accomplish our goals. At the interdependence stage we embrace the idea of working together for better results.

The Seven Habits Explained

The Seven Habits help us move through these three stages of personal development. The first three take you from dependence to independence. The next three usher you along to interdependence, and the seventh is needed to reinforce the others.

1. Be Proactive
2. Begin with the End in Mind
3. Put First Things First
4. Think Win-Win
5. Seek First to Understand, Then to be Understood
6. Synergize
7. "Sharpen the Saw"

How To Build The Seven Habits

The Seven Habits need to be developed over time. Remember that these are “habits” – that means you have to pursue them consciously for a while before they become part of who you are and how you interact with other people.

To develop these Seven Habits, we strongly recommend that you study Dr Stephen Covey’s book in detail, and that you make the effort needed to make them part of your life.

And as you read it and do this, bear in mind the following Mind Tools articles, tools and courses that support and reinforce the Seven Habits:

Habits 1 & 2: Be Proactive and Begin with the End in Mind
Supporting these habits, see our articles on:

By developing a personal mission statement and proactively setting and managing your goals, you will have a clear view of where you are heading. And to take this to its logical extent, see our "Design Your Life" self-study course, which helps you think through what you want to do with your life in detail.

Habit 3: Put First Things First
Managing your time is key to developing this habit and becoming more effective. Among the tools at Mind Tools that help with this are:

However, we also have a full-blown time management section, and our Make Time for Success course teaches the 39 essential skills you need to take full control of your time and maximize your effectiveness.

Habit 5: Seek First to Understand, Then to be Understood
A key skill to develop is Active Listening. This will help you deepen understanding of others and so grow into the interdependence stage of personal development.

Habit 4 & 6: Think Win-Win & Synergize
Win-Win Negotiation and using synergy are some of the skills of a good leader. There is more about leadership in the Mind Tools leadership section, and you can learn the 48 skills needed to be a truly effective leader in our "How to Lead" self-study course.

Habit 7: Sharpen the Saw
As you work on developing the Seven Habits, it’s good to keep evaluating where you are going and how you are progressing through the stages of personal development. It takes time to develop new habits, and developing new skills will help embed your habits, so you become ever more effective. Subscribe to our free newsletter to receive new career skills every two weeks.

Key Points

Stephen Covey's Seven Habits help you develop personally and so become more effective in how you work and relate with other people. Developing these habits can help you tackle your work and life challenges with new confidence. At the core of these habits are a deeper understanding of yourself and an appreciation of the fact that you need others in order to achieve your goals. Developing them will take time and effort. But it is worthwhile and will have a lasting effect on your personal effectiveness.

Modern Science

The Beginning of Modern Science

E pur si muove -- And yet it moves.

Galileo Galilei, sotto voce after his trial.

One thing that happened during the Renaissance that was of great importance for the later character of modern philosophy was the birth of modern science. Even as in the Middle Ages philosophy was often thought of as the "handmaiden of theology," modern philosophers have often thought of their discipline as little more than the "handmaiden of science." Even for those who haven't thought that, the shadow of science, its spectacular success and its influence on modern life and history, has been hard to ignore.

For a long time, philosophers as diverse as David Hume, Karl Marx, and Edmund Husserl have seen the value of their work in the claim that they were making philosophy "scientific." Those claims should have ended with Immanuel Kant (1724-1804), who for the first time clearly provided a distinction between the issues that science could deal with and those that it couldn't, but since Kant's theory could not be demonstrated the same way as a scientific theory, the spell of science, even if it is only through pseudo-science, continues.

The word "science" itself is simply the Latin word for knowledge: scientia. Until the 1840's what we now call science was "natural philosophy," so that even Isaac Newton's great book on motion and gravity, published in 1687, was The Mathematical Principles of Natural Philosophy (Philosophiae Naturalis Principia Mathematica). Newton was, to himself and his contemporaries, a "philosopher." In a letter to the English chemist Joseph Priestley written in 1800, Thomas Jefferson lists the "sciences" that interest him as, "botany, chemistry, zoology, anatomy, surgery, medicine, natural philosophy [this probably means physics], agriculture, mathematics, astronomy, geography, politics, commerce, history, ethics, law, arts, fine arts." The list begins on familiar enough terms, but we hardly think of history, ethics, or the fine arts as "sciences" any more. Jefferson simply uses the term to mean "disciplines of knowledge."

Something new was happening in natural philosophy, however, and it was called the nova scientia, the "new" knowledge. It began with Mikolaj Kopernik (1473-1543), whose Polish name was Latinized to Nicolaus Copernicus. To ancient and mediaeval astronomers the only acceptable theory about the universe came to be that of geocentrism, that the Earth is the center of the universe, with the sun, moon, planets, and stars moving around it. But astronomers needed to explain a couple of things: why Mercury and Venus never moved very far away from the sun -- they are only visible a short time after sunset or before sunrise -- and why Mars, Jupiter, and Saturn sometimes stop and move backwards for a while (retrograde motion) before resuming their forward motion. Believing that the heavens were perfect, everyone wanted motion there to be regular, uniform, and circular. The system of explaining the motion of the heavenly bodies using uniform and circular orbits was perfected by Claudius Ptolemy, who lived in Egypt probably during the reign of the Emperor Marcus Aurelius (161-180). His book, still known by its Arabic title, the Almagest (from Greek Tò Mégiston, "The Greatest"), explains that the planets are fixed to small circular orbits (epicycles) which themselves are fixed to the main orbits. With the epicycles moving one way and the main orbits the other, the right combination of orbits and speeds can reproduce the motion of the planets as we see them. The only problem is that the system is complicated. It takes something like 27 orbits and epicycles to explain the motion of five planets, the sun, and the moon. This is called the Ptolemaic system of astronomy.

Copernicus noticed that it would make things a lot simpler (Ockham's Razor) if the sun were the center of motion rather than the earth. The peculiarities of Mercury and Venus, not explained by Ptolemy, now are explained by the circumstance that the entire orbits of Mercury and Venus are inside the Earth's orbit. They cannot get around behind the Earth to be seen in the night sky. The motion of Mars and the other planets is explained by the circumstance that the inner planets move faster than the outer ones. Mars does not move backwards; it is simply overtaken and passed by the Earth, which makes it look, against the background, as though Mars is moving backwards. Similarly, although it looks like the stars move once around the Earth every day, Copernicus figured that it was just the Earth that was spinning, not the stars. This was the Copernican Revolution.
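The overtaking explanation can be illustrated with a toy calculation. Assuming simple circular, coplanar orbits with modern mean values (1 AU and 365.25 days for Earth, 1.524 AU and 686.98 days for Mars — numbers Copernicus did not have in this form), the apparent longitude of Mars as seen from Earth really does reverse direction around opposition:

```python
import math

# Toy Copernican model: circular, coplanar orbits (modern mean values).
EARTH_R, EARTH_T = 1.0, 365.25      # AU, days
MARS_R,  MARS_T  = 1.524, 686.98

def geocentric_longitude(t):
    """Apparent ecliptic longitude of Mars as seen from Earth at day t."""
    ae = 2 * math.pi * t / EARTH_T
    am = 2 * math.pi * t / MARS_T
    xe, ye = EARTH_R * math.cos(ae), EARTH_R * math.sin(ae)
    xm, ym = MARS_R * math.cos(am), MARS_R * math.sin(am)
    return math.atan2(ym - ye, xm - xe)

def daily_motion(t):
    """Change in apparent longitude from day t to day t+1, wrapped to (-pi, pi]."""
    d = geocentric_longitude(t + 1) - geocentric_longitude(t)
    return (d + math.pi) % (2 * math.pi) - math.pi

# Both planets start aligned with the Sun (opposition), so Mars appears
# retrograde at first, then resumes direct motion as Earth pulls well ahead.
retrograde_days = [t for t in range(780) if daily_motion(t) < 0]
```

Over one synodic period (about 780 days) Mars moves backwards for only a few weeks around each opposition, just as observed — no epicycle required.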

Now this all seems obvious. But in Copernicus's day the weight of the evidence was against him. The only evidence he had was that his system was simpler. Against him was the prevailing theory of motion. Mediaeval physics believed that motion was caused by an "impetus." Things are naturally at rest. An impetus makes something move; but then it runs out, leaving the object to slow down and stop. Something that continues moving therefore has to keep being pushed, and pushing is something you can feel. (This was even an argument for the existence of God, since something very big -- like God -- had to be pushing to keep the heavens going.) So if the Earth is moving, why don't we feel it? Copernicus could not answer that question. Neither was there an obvious way out of what was actually a brilliant prediction: If the stars did not move, then they could be different distances from the earth; and as the earth moved in its orbit, the nearer stars should appear to move back and forth against more distant stars. This is called "stellar parallax," but unfortunately stellar parallax is so small that it was not observed until 1838. So, at the time, supporters of Copernicus could only contend, lamely, that the stars must all be so distant that their parallax could not be detected. Yeah, sure. In fact, the absence of parallax had been used since the Greeks as more evidence that the Earth was not moving.
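Just how small stellar parallax is can be quantified with modern units (unavailable, of course, to Copernicus's defenders). By the definition of the parsec, a star's annual parallax in arcseconds is the reciprocal of its distance in parsecs — the angle that 1 AU subtends at the star:

```python
import math

AU_KM = 1.496e8                               # astronomical unit in km
PARSEC_KM = AU_KM * (180 * 3600 / math.pi)    # distance at which 1 AU subtends 1 arcsec

def parallax_arcsec(distance_parsecs):
    """Annual parallax: the angle subtended by 1 AU at the star's distance."""
    return math.degrees(math.atan(AU_KM / (distance_parsecs * PARSEC_KM))) * 3600

# Even the nearest star system shifts by well under an arcsecond --
# thousands of times finer than the ~60 arcsec resolution of the naked eye.
print(parallax_arcsec(1.3))   # at Proxima Centauri's distance: ~0.77 arcsec
```

Bessel's 1838 detection, for the star 61 Cygni, amounted to roughly a third of an arcsecond — which shows why the effect stayed hidden for three centuries after Copernicus.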

It is common now in many venues for people to say that heliocentric astronomy was rejected by the Greeks and ignored in the Middle Ages just because of the human arrogance that wanted the Earth to be the center of the universe -- we belong in the center of things. There were certainly some people who thought that way, but it is hard to imagine that all Greeks, or all Mediaevals, were so foolish. They weren't. The little morality tale we are given of Mediaeval ignorance and anthropo-centrism overlooks the problem that there was no evidence of heliocentrism in Ancient or Mediaeval science, that Copernicus himself did not supply any evidence, and that it was the Ancient and Mediaeval understanding of the physics that was dead against the Earth moving. Usually these treatments don't even mention the physics. The only evidence that Stephen Hawking mentions against Ptolemaic astronomy (in his A Brief History of Time) at the end of the Middle Ages is that the Moon, moving on an epicycle, would move away from and towards us in a way that would dramatically change its apparent size. Unfortunately, Copernicus retained an epicycle for the motions of the Moon, which means that this problem with Ptolemaic astronomy is equally a problem for Copernican astronomy. Only Johannes Kepler (1571-1630) would fix things by replacing epicycles with elliptical orbits. That Copernicus supplied no compelling evidence for this theory led Thomas Kuhn to think that Copernicanism won out only because of social, not evidentiary, factors. But then Copernicanism did not triumph until Galileo, and the evidentiary situation with Galileo was much different than it had been with Copernicus.

Copernicus was also worried about getting in trouble with the Church. The Protestant Reformation had started in 1517, and the Catholic Church was not in any mood to have any more of its doctrines, even about astronomy, questioned. So Copernicus did not let his book be published until he lay dying.

The answers, the evidence, and the trouble for Copernicus's system came with Galileo Galilei (1564-1642). Galileo is important and famous for three things:

  1. Most importantly he applied mathematics to motion. This was the real beginning of modern science. There is no math in Aristotle's Physics. There is nothing but math in modern physics books. Galileo made the change. It is inconceivable now that science could be done any other way. Aristotle had said, simply based on reason, that if one object is heavier than another, it will fall faster. Galileo tried that out (though it had already been done by John Philoponus in the 6th century) and discovered that Aristotle was wrong. Aerodynamics aside, everything falls at the same rate. But then Galileo determined what that rate was by rolling balls down an inclined plane (not by dropping them off the Leaning Tower of Pisa, which is the legend). This required him to distinguish between velocity (e.g. meters per second) and acceleration (change in velocity, e.g. meters per second per second). Gravity produced an acceleration -- 9.8 meters per second per second. Instantly Galileo had an answer for Copernicus: simple velocity is not felt, only acceleration is. So the earth can be moving without our feeling it. Also, velocity does not change until a force changes it. That is the idea of inertia, which then replaced the old idea of an impetus. All this theory was ultimately perfected by Isaac Newton (1642-1727).

  2. With the objections to Copernicus's theory removed, the case was completed with positive evidence. Around 1609 it was discovered in the Netherlands that putting two lenses (which had been used since the 13th century as eye glasses) together made distant objects look close. Galileo heard about this and himself produced the first astronomical quality telescope. With his telescope he saw several things: a) the Moon had mountains and valleys. This upset the ancient notion that the heavens, the Moon included, were completely unlike the Earth. b) the Planets all showed disks and were not points of light like stars. c) Jupiter had four moons. This upset the argument, which had been used against Copernicus, that there could only be one center of motion in the universe. Now there were three (the Sun, Earth, and Jupiter). d) There were many more stars in the sky than could be seen with the eye; and the Milky Way, which always was just a glow, was itself composed of stars. And finally e) Venus went through phases like the Moon. That vindicated Copernicus, for in the Ptolemaic system Venus, moving back and forth at the same distance between the Earth and the Sun, would only go from crescent to crescent. It would mostly have its dark side turned to us. With Copernicus, however, Venus goes around on the other side of the Sun and so, in the distance, would show us a small full face. As it comes around the Sun towards the Earth (in the evening sky), we would see it turn into a crescent as the disk grows larger. Those are the phases, from small full to large crescent, that Galileo saw. So that he could claim priority to this discovery, before actually announcing it, Galileo concealed his claim in an anagram that unscrambled to Cynthiae figuras aemulatur mater amorum, "The forms of Cynthia [the moon], the mother of loves imitates." The only argument that could be used against Galileo for all these discoveries was that the telescope must be creating illusions. 
In fact it was not well understood why a telescope worked. Some people looked at stars and saw two instead of one. That seemed to prove that the telescope was unreliable. Soon it was simply accepted that many stars are double. They still are.

  3. With his evidence and his arguments, Galileo was ready to prove the case for Copernican astronomy. He had the support of the greatest living astronomer, Johannes Kepler, but not the Catholic Church. He had been warned once to watch it, but then a friend of his (Maffeo Barberini) became Pope Urban VIII (1623-1644). The Pope agreed that Galileo could write about both Ptolemaic and Copernican systems, setting out the arguments for each. Galileo wrote A Dialogue on the Two Principal Systems of the World (1632). Unfortunately, the representative of the Ptolemaic system in the dialogue was made to appear foolish, and the Pope thought it was a caricature of himself. Galileo was led before the Inquisition, "shown the instruments of torture," and invited to recant. He did, but was kept under house arrest for the rest of his life. Nevertheless, it was too late. No serious astronomer could ever be a geocentrist again, and the only discredit fell against the Church. As Galileo left his trial, he is supposed to have muttered, E pur si muove -- "And yet it moves."
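Before leaving Galileo's physics, the inclined-plane result in point 1 is easy to reproduce. With constant acceleration, distance grows as the square of the time (d = ½gt²), so the distances covered in successive equal time intervals stand in the ratios 1 : 3 : 5 : 7 — the "odd-number rule" that Galileo actually measured. A minimal sketch:

```python
G = 9.8   # acceleration of gravity, in meters per second per second

def distance_fallen(t):
    """Distance fallen from rest after t seconds: d = (1/2) g t^2."""
    return 0.5 * G * t * t

def speed(t):
    """Velocity after t seconds: v = g t. Velocity is steady growth;
    it is the acceleration, not the velocity, that can be felt."""
    return G * t

# Distance covered during each successive one-second interval:
intervals = [distance_fallen(n) - distance_fallen(n - 1) for n in range(1, 5)]
ratios = [d / intervals[0] for d in intervals]   # -> 1 : 3 : 5 : 7
```

The same odd-number pattern appears on an inclined plane, where the acceleration is merely reduced (to g times the sine of the incline angle), which is what let Galileo slow the motion down enough to time it.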

Some think less of Galileo because he recanted his beliefs, while Socrates was willing to die for his. Well, there has been no more civilized example of a death penalty than when Socrates got to sit around, talk to his friends, calmly drink the hemlock, and lie down to a peaceful death -- the "sweet shafts," the agana belea, of Apollo's silent arrows. Galileo was threatened with torture. No one can be faulted for saying anything under those circumstances.

Indeed, the history of science subsequently often consists of who gets to claim the status of Galilean martyrdom. The interesting cases in our day concern Global Warming and Evolution. The weight of Official Science -- journals like Nature, the National Science Foundation, or the Royal Society of Britain -- is all for Global Warming and for Evolution. The Right complains that multiple equivalents of Galileo are oppressed because they stand up for the heretical truth, that Global Warming and Evolution are frauds. Unfortunately, this confuses very different issues. Evolution is in no danger from any real science; and the Right wastes a great deal of money and effort (such as Ben Stein's movie Expelled) promoting theology and bad metaphysics as some sort of "science." On the other hand, the Global Warming "consensus" is a product of politics, not science. The Right thus plays right into the hands of Al Gore, who is happy to lump "Intelligent Design" and Global Warming skepticism as equally part of an "assault on reason." This when a great deal of enthusiasm for the Global Warming cause follows from hostility to science itself, in so far as it represents human progress and the betterment of human life on earth. Thus, between the Earth Liberation Front and the Creationists (not to mention Post-Modernist nihilism), there is little real interest in the modern tradition of science begun by Copernicus and Galileo.

The "Sin" of Galileo

Philosophy of Science

History of Philosophy

Home Page

Copyright (c) 1996, 1998, 2006, 2008 Kelley L. Ross, Ph.D. All Rights Reserved

René Descartes (1596-1650) and the Meditations on First Philosophy

Descartes is justly regarded as the Father of Modern Philosophy. This is not because of the positive results of his investigations, which were few, but because of the questions that he raised and problems that he created, problems that have still not been answered to everyone's satisfaction: particularly the Problem of Knowledge and the Mind-Body Problem. And in a day when philosophy and science were not distinguished from each other, Descartes was a famous physicist and mathematician as well as a philosopher. Descartes' physics was completely overthrown by that of Newton, so we do not much remember him for that. But Descartes was a great mathematician of enduring importance. He originated analytic geometry, where all of algebra can be given geometrical expression. Like Galileo combining physics and mathematics, this also combined two things that had previously been apart, arithmetic and geometry. The modern world would not be the same without graphs of equations. Rectangular coordinates for graphing are still called Cartesian coordinates (from Descartes' name: des Cartes). Descartes is also the person who began calling the square root of -1 (i.e. √-1) the "imaginary" number. Descartes lived in an age of great mathematicians, including Marin Mersenne (1588-1648), Pierre Fermat (1601-1665), Blaise Pascal (1623-1662), and Christiaan Huygens (1629-1695). At a time before scientific journals, Mersenne himself mediated a correspondence between all these people (as well as with Galileo, Thomas Hobbes, and many others). All prime numbers that are powers of 2 minus 1 (i.e. of the form 2^n - 1) are still called "Mersenne primes." Huygens then lived long enough to know Isaac Newton (1642-1727).
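The Mersenne primes mentioned above are easy to enumerate for small exponents. Note that not every number of the form 2^n - 1 is prime, even when n itself is prime: 2^11 - 1 = 2047 = 23 × 89 is the classic counterexample. A quick check:

```python
def is_prime(n):
    """Trial-division primality test; adequate for small n."""
    if n < 2:
        return False
    i = 2
    while i * i <= n:
        if n % i == 0:
            return False
        i += 1
    return True

# Which numbers 2^n - 1 are prime, for n up to 19?
mersenne_primes = [2**n - 1 for n in range(2, 20) if is_prime(2**n - 1)]
# Yields 3, 7, 31, 127, 8191, 131071, 524287.
# 2^11 - 1 = 2047 = 23 * 89 is NOT prime, even though 11 is.
```

(Only seven exponents up to 19 work; finding large Mersenne primes remains a serious computational undertaking to this day.)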

Seeing Descartes as a mathematician explains why he was the kind of philosopher that he was. Now it is hard to reconcile Descartes' status as a scientist and the inspiration he derived from Galileo and others with his clear distrust of experience. Isn't science about experience? We might think so. But the paradox of modern science is its dependence on mathematics. Where does mathematics come from? What makes it true? Many mathematicians will still answer that they are "Platonists," but Plato's views certainly have little to do with experience. So Descartes belongs to this puzzling, mathematical side of science, not to the side concerned with experience.

Meditations on First Philosophy is representative of his thought. "First philosophy" simply means what is done first in philosophy. The most important thing about Descartes as a philosopher is that "first philosophy" changed because of what he did. What stood first in philosophy since Aristotle was metaphysics. Thus the first question for philosophy to answer was about what is real. That decided, everything else could be done. With such an arrangement we can say that philosophy functions with Ontological Priority. In the Meditations we find that questions about knowledge come to the fore. If there are problems about what we can know, then we may not even be able to know what is real. But if questions about knowledge must be settled first, then this establishes Epistemological Priority for philosophy. Indeed, this leads to the creation of the Theory of Knowledge, Epistemology, as a separate discipline within philosophy for the first time. Previously, knowledge had been treated as falling in the domain of Aristotle's logical works (called, as a whole, the Organon), especially the Posterior Analytics. Modern philosophy has been driven by questions about knowledge. It begins with two principal traditions, Continental Rationalism and British Empiricism. The Rationalists, including Descartes, believed that reason was the fundamental source of knowledge. The Empiricists believed that experience was. Epistemological priority makes possible what has become a very common phenomenon in modern philosophy: denying that metaphysics is possible at all, or even that metaphysical questions mean anything. That can happen when epistemology draws the limits of knowledge, or the limits of meaning, so tight that metaphysical statements or questions are no longer allowed.

The most important issues get raised in the first three of the six Meditations. In the first meditation Descartes begins to consider what he can know. He applies the special method that he has conceived (about which he had already written the Discourse on Method), known as "methodical doubt." As applied, methodical doubt has two steps: 1) doubt everything that can be doubted, and 2) don't accept anything as known unless it can be established with absolute certainty. Today Descartes is often faulted for requiring certainty of knowledge. But that was no innovation with him: ever since Plato and Aristotle, knowledge was taken to imply certainty. Anything without certainty would just be opinion, not knowledge. The disenchantment with certainty today has occurred just because it turned out to be so difficult to justify certainty to the rigor that Descartes required. Logically the two parts of methodical doubt are very similar, but in the Meditations they are procedurally different. Doubt does its job in the first meditation. Descartes wonders what he can really know about a piece of matter like a lump of wax. He wonders if he might actually be dreaming instead of sitting by the fireplace. Ultimately he wonders if the God he has always believed in might actually be a malevolent Demon capable of using his omnipotence to deceive us even about our own thoughts or our own existence. Thus, there is nothing in all his experience and knowledge that Descartes cannot call into doubt. The junk of history, all the things he ever thought he had known, gets swept away.

Ever since the Meditations, Descartes' Deceiving Demon has tended to strike people as a funny or absurd idea. Nevertheless, something far deeper and more significant is going on in the first meditation than we might think. It is a problem about the relation of causality to knowledge. The relation of cause to effect had been of interest since Aristotle. There was something odd about it. Given knowledge of a cause (and of the laws of nature), we usually can predict what the effect will be. Touch the hot stove, and you'll get burned. Step off a roof, and you'll fall. But given the effect, it is much more difficult to reason backwards to the cause. The arson squad shows up to investigate the cause of a fire, but that is not an easy task: many things could have caused the fire, and it is always possible that they might not be able to figure out at all what the cause was. The problem is that the relation between cause and effect is not symmetrical. Given a cause, there will be one effect. But given an effect, there could have been many causes able to produce the same effect. And even if we can't predict the effect from the cause, we can always wait around to see what it is. But if we can't determine the cause from the effect, time forever conceals it from us. This feature of causality made for some uneasiness in mediaeval Western, and even in Indian, philosophy. Many people tried to argue that the effect was contained in the cause, or the cause in the effect. None of that worked, or even made much sense.

With Descartes, this uneasiness about causality becomes a terror in relation to knowledge: for, in perception, what is the relation of the objects of knowledge to our knowledge of them? Cause to effect. Thus what we possess, our perceptions, are the effects of external causes; and in thinking that we know external objects, we are reasoning backwards from effect to cause. Trouble. Why couldn't our perceptions have been caused by something else? Indeed, in ordinary life we know that they can be. There are hallucinations. Hallucinations can be caused by a lot of things: fever, insanity, sensory deprivation, drugs, trauma, etc. Descartes' Deceiving Demon is more outlandish, but it employs the same principle, and touches the same raw nerve. That raw nerve is now known as the Problem of Knowledge: How can we have knowledge through perception of external objects? There is no consensus on how to solve this even today. The worst thing is not that there haven't been credible solutions proposed, there have been, but that the solutions should explain why perception is so obvious in ordinary life. Philosophical explanations are usually anything but obvious; but no sensible person, not even Descartes, really doubts that external objects are there. This is why modern philosophy became so centered on questions about knowledge: it is the Curse of Descartes.

In his own discussion, Descartes does not identify his problem as resulting from the asymmetry of cause and effect as applied to knowledge. However, this is what underlies his difficulty, and an explicit statement of the matter does not have long to wait. In 1690, Bishop Pierre-Daniel Huet, a member of the French Academy, wrote that any event can have an infinite number of possible causes. Huet was certainly aware of Descartes' work (as any Frenchman by then would have been), and certainly took his epistemological difficulties seriously. Indeed, Huet's book was a celebration of epistemological difficulties, entitled a Philosophical Treatise on the Weaknesses of the Human Mind. We also get an interesting but confused discussion of the asymmetry of cause and effect in the Sherlock Holmes stories.

In the second meditation, Descartes wants to begin building up knowledge from the wreckage of the first meditation. This means starting from nothing. Such an idea of building up knowledge from nothing is called Foundationalism and is one of the mistakes that Descartes makes. Descartes does not and cannot simply start from nothing. Nevertheless, he gets off to a pretty good start: he decides that he cannot be deceived about his own existence, because if he didn't exist, he wouldn't be around to worry about it. If he didn't exist, he wouldn't be thinking; so if he is thinking, he must exist. This is usually stated in Latin: Cogito ergo sum, "I think therefore I am." That might be the most famous statement in the history of philosophy, although it does not seem to occur in that form in the Meditations.

But there is more to it than just Descartes' argument for his own existence. Thinking comes first, and for Descartes that is a real priority. The title of the second meditation actually says, "the mind is better known than the body," and the cogito ergo sum makes Descartes believe, not just that he has proven his existence, but that he has proven his existence as a thinking substance, a mind, leaving the body as some foreign thing to worry about later. That does not really follow, but Descartes clearly thinks that it does and consequently doesn't otherwise provide any special separate proof for the existence of the soul. In the end Descartes will believe that there are two fundamental substances in the world, souls and matter. The essence of soul for him, the attribute that makes a soul what it is, is thinking. The essence of matter for him (given to us in the fifth meditation), the attribute that makes matter what it is, is extension, i.e. that matter takes up space. This is known as Cartesian Dualism, that there are two kinds of things. It is something else that people have thought funny or absurd since Descartes. The great difficulty with it was always how souls and their bodies, made of matter, interact or communicate with one another. In Descartes' own physics, forces are transferred by contact; but the soul, which is unextended and so has no surface (only matter has extension), cannot contact the body because there is no surface to press with. The body cannot even hold the soul within it, since the soul has nothing to press upon to carry it along with the body. Problems like this occur whenever the body and soul are regarded as fundamentally different kinds of realities.

Today it might seem easy to say that the body and soul communicate by passing energy back and forth, which doesn't require contact, or even proximity; but the presence of real energy in the soul would make it detectable in the laboratory: any kind of energy produces some heat (towards which all energy migrates as it becomes more random, i.e. as energy obeys the laws of the conservation of energy and of entropy), and heat or the radiation it produces (all heat produces electromagnetic radiation) can be detected. But, usually, a theory of the soul wants it to be some kind of thing that cannot be detected in a laboratory -- in great measure because souls have not been detected in a laboratory.

Nevertheless, Descartes' problem is not just a confusion or a superstition. Our existence really does seem different from the inside than from the outside. From the inside there is consciousness, experience, colors, music, memories, etc. From the outside there is just the brain: gray goo. How do those two go together? That is the enduring question from Descartes: The Mind-Body Problem. As with the Problem of Knowledge, there is no consensus on a satisfactory answer. To ignore consciousness, as happens in Behaviorism, or to dismiss consciousness as something that is merely a transient state of the material brain, is a kind of reductionism, i.e. to say the one thing is just a state or function of another even though they may seem fundamentally different and there may be no good reason why we should regard that one thing as more real and the other less so. Much of the talk about the Mind-Body Problem in the 20th century has been reductionistic, starting with Gilbert Ryle's Concept of Mind, which said that "mind is to body as kick is to leg." A kick certainly doesn't have much reality apart from a leg, but that really doesn't capture the relationship of consciousness to the body or to the brain. When the leg is kicking, we see the leg. But when the brain is "minding," we don't see the brain, and the body itself is only represented within consciousness. Internally, there is no reason to believe the mind is even in the brain. Aristotle and the Egyptians thought that consciousness was in the heart. In the middle of dreaming or hallucinations, we might not be aware of our bodies at all.

At the end of the second meditation Descartes may reasonably be said to have proven his own existence, but the existence of the body or of any other external objects is left hanging. If nothing further can be proven, then each of us is threatened with the possibility that I am the only thing that exists. This is called solipsism, from Latin solus, "alone" (sole), and ipse, "self." Solipsism is not argued, advocated, or even mentioned by Descartes, but it is associated with him because both he and everyone after him have so much trouble proving that something else does exist.

The third meditation is Descartes' next step in trying to restore the common sense limits of knowledge. Even though he is ultimately aiming to show that external objects and the body exist, he is not able to go at that directly. Instead the third meditation is where Descartes attempts to prove the existence of God. This is surprising, since the existence of objects seems much more obvious than the existence of God; but Descartes, working with his mathematician's frame of mind, thinks that a pure rational proof of something he can't see is better than no proof of something he can.

Descartes' proof for God is not original. It is a kind of argument called the Ontological Argument (named that by Immanuel Kant, 1724-1804). It is called "ontological" because it is based on an idea about the nature of God's existence: that God is a necessary being, i.e. it is impossible for him not to exist. We and everything else in the universe, on the other hand, are contingent beings; it is possible for us not to exist, and in the past (and possibly in the future) we have indeed not existed. But if God is a necessary being, then there must be something about his nature that necessitates his existence. Reflecting on this, a mediaeval Archbishop of Canterbury, St. Anselm (1093-1109), decided that all we needed to prove the existence of God was the proper definition of God. With such a definition we could understand how God's nature necessitates his existence. The definition Anselm proposed was: God is that than which no greater can be conceived. The argument then follows: If we conceive of a non-existing God, we must always ask, "Can something greater than this be conceived?" The answer will clearly be "Yes"; for an existing God would be greater than a non-existing God. Therefore we can only conceive of God as existing; so God exists.

This simple argument has mostly not found general favor. The definitive criticism was given by St. Thomas Aquinas (who otherwise thought that there were many ways to prove the existence of God): things cannot be "conceived" into existence. Defining a concept is one thing, proving that the thing exists is another. The principle involved is that, "Existence is not a predicate," i.e. existence is not like other attributes or qualities that are included in definitions. Existence is not part of the meaning of anything. Most modern philosophers have agreed with this, but every so often there is an oddball who is captivated by Anselm. Descartes was such an oddball.

Descartes' argument for God is not even as good as Anselm's. It runs something like this:

  1. I have in my mind an idea of perfection.
  2. Degrees of perfection correspond to degrees of reality.
  3. Every idea I have must have been caused by something that is at least as real [in objective reality, what Descartes calls "formal reality"] as what it is that the idea represents [in the subjective reality of my mind, what Descartes confusingly calls "objective reality"].
  4. Therefore, every idea I have must have been caused by something that is at least as perfect as what it is that the idea represents.
  5. Therefore, my idea of perfection must have been caused by the perfect thing.
  6. Therefore, the perfect thing exists.
  7. By definition, the perfect thing is God.
  8. Therefore, God exists.

Here Descartes uses "perfection" instead of Anselm's "greatness." The difficulties with the argument are, first, that the second premise is most questionable. Most Greek philosophers starting with Parmenides would have said that either something exists or it doesn't. "Degrees" of reality is a much later, in fact Neoplatonic, idea. The second problem is that the third premise is convoluted and fishy in the extreme. It means that Descartes is forced into arguing that our idea of infinity must have been caused by an infinite thing, since an infinite thing is more real than us or anything in us. But it seems obvious enough that our idea of infinity is simply the negation of finitude: the non-finite. The best that Descartes can ever do in justifying these two premises is argue that he can conceive them "clearly and distinctly" or "by the light of nature." "Clear and distinct ideas" are how Descartes claims something is self-evident, and something is self-evident if we know it to be true just by understanding its meaning. That is very shaky ground in Descartes' system, for we must always be cautious about things that the Deceiving Demon could deceive us into believing. The only guarantee we have that our clear and distinct ideas are in fact true and reliable is that God would not deceive us about them. But then the existence of God is to be proven just in order that we can prove God reliable. Assuming the reliability of clear and distinct ideas so as to prove that God is reliable, so as to prove that clear and distinct ideas are reliable, makes for a logically circular argument: we assume what we wish to prove.

Descartes' argument for God violates both logic and his own method. In sweeping away the junk of history through methodical doubt, Descartes wasn't supposed to use anything from the past without justifying it. He is already violating that in the second meditation just by using concepts like "substance" and "essence," which are technical philosophical terms that Descartes has not made up himself. In the third meditation Descartes' use of the history of philosophy explodes out of control: technical terminology ("formal cause," etc.) flies thick and fast, the argument itself is inspired by Anselm, and the whole process is very far from the foundational program of starting from nothing. All by itself, it looks like a good proof of how philosophy cannot start over from nothing.

With the existence of God, presumably, proven, Descartes wraps things up in the sixth meditation: if God is the perfect thing, then he would not deceive us. That wouldn't be perfect. On the other hand, when it comes to our perceptions, God has set this all up and given us a very strong sense that all these things that we see are there. So, if God is no deceiver, these things really must be there. Therefore, external objects ("corporeal things") exist. Simple enough, but fatally flawed if the argument for the existence of God is itself defective.

In the fourth and fifth meditations Descartes does some tidying up. In the fourth he worries why there can be falsehood if God is reliable. The answer is that if we stuck to our clear and distinct ideas, there would be no falsehood; but our ambitions leap beyond those limits, so falsehood exists and is our own fault. Descartes does come to believe that all our clear and distinct ideas are innate: they are packed into the soul on its creation, like a box lunch. Most important is the idea of perfection, or the idea of God, itself, which is then rather like God's hallmark on the soul. Once we notice that idea, then life, the universe, and everything falls into place. Thus, Descartes eventually decides that the existence of God is better known to him than his own existence, even though he was certain about the latter first.

The fifth meditation says it is about the "essence" of material things. That is especially interesting since Descartes supposedly doesn't know yet whether material things existed. It's like, even if they don't exist, he knows what they are. That is Descartes the mathematician speaking. Through mathematics, especially geometry, he knows what matter is like -- extended, etc. He even knows that a vacuum is impossible: extended space is the same thing as material substance. This is the kind of thing that makes Descartes look very foolish as a scientist. But the important point, again, is not that Descartes is unscientific, but that he chose to rely too heavily on the role of mathematics in the nova scientia that Galileo had recently inaugurated. Others, like Francis Bacon (1561-1626), had relied too heavily on the role of observation in explaining the new knowledge; and Bacon wasn't a scientist, or a mathematician, at all. Descartes was. It really would not be until our own time that some understanding would begin to emerge of the interaction and interdependency between theory and observation, mathematics and experience in modern science. Even now the greatest mathematicians (e.g. Kurt Gödel, 1906-1978) tend to be kinds of Platonists at heart.


Copyright (c) 1996, 1998, 2006, 2007 Kelley L. Ross, Ph.D. All Rights Reserved
