Subaru Enthusiasts Car Club of the Sierras

Subaru Enthusiasts Car Club of the Sierras (https://www.seccs.org/forums/index.php)
-   Off Topic Chat (https://www.seccs.org/forums/forumdisplay.php?f=10)
-   -   A little GA Tech humor/puzzle.. (https://www.seccs.org/forums/showthread.php?t=3738)

JC 2005-10-12 07:14 PM

A little GA Tech humor/puzzle..
 
I got this in an e-mail today, thought I'd share.

Quote:

Your advisor has left you in charge. A pesky grad student comes up to you and states that they have developed a new Neural Net. Each perceptron in the Net has an output that is a summation of its weighted inputs (no thresholds).

You yawn and state that the Net is uninteresting. Why?

(No, it's not because you haven't been getting enough sleep)
lol

dknv 2005-10-12 08:42 PM

Lord, JC, what are you taking? Quantum physics?

Is the answer, because the net is a male model? ( lol.)

Good stuff, keep it coming, our (my!) brains can always use some weighted inputs.

ScottyS 2005-10-13 09:31 AM

Quote:

You yawn and state that the Net is uninteresting. Why?

Because you know that a surplus of grad students creates a scarcity of interesting projects --- as they all vie to promote their own project as critical to the advancement of science, when in reality it's just another useless niche with great buzzwords.

sperry 2005-10-13 09:34 AM

Sounds like that Neural Net can't learn anything. Even if the weights are being modified during the learning procedure, each perceptron in the layer will be outputting the same value to the next layer, thereby removing any learned behavior. Essentially, the network will act as a simple adding machine where the result is scaled by the learned weights.

...I think, I never really learned much about neural networks.

And JC, stop getting other people to do your homework. :P

Nick Koan 2005-10-13 09:38 AM

Now, correct me if I'm wrong, but won't each node only receive and transmit the value 1?

MPREZIV 2005-10-13 10:59 AM

:?: :?: :?: :?:


wow. my head hurts.

ShawnS 2005-10-13 05:58 PM

Quote:

Originally Posted by MPREZIV
:?: :?: :?: :?:


wow. my head hurts.

Alcohol will fix that.

Dean 2005-10-13 06:09 PM

Sounds like a cascading pyramid of nodes resulting in a flat "true" at the edge of the net. Boring....

sperry 2005-10-14 10:03 AM

Quote:

Originally Posted by Dean
Sounds like a cascading pyramid of nodes resulting in a flat "true" at the edge of the net. Boring....

If all inputs are 0, then the output will be zero as well. It's essentially a massive, complicated OR gate.

Dean 2005-10-14 01:12 PM

Quote:

Originally Posted by sperry
If all inputs are 0, then the output will be zero as well. It's essentially a massive, complicated OR gate.

Sorry, I assumed there would be some actual true bits somewhere in the data. But yeah, a giant OR net... Whoopie

cody 2005-10-14 02:37 PM

Um, 9?

SlickNick112 2005-10-17 06:52 AM

Quote:

Originally Posted by Dean
Sounds like a cascading pyramid of nodes resulting in a flat "true" at the edge of the net. Boring....

Ha ha ha, he said nodes!

JC 2005-10-21 09:50 AM

Answer:

The Neural Net is uninteresting because, *no matter* how large and complicated it is, it can be replaced by a single perceptron (node) for each output.

Training a single node is far faster than training 500!

A single perceptron here computes a linear combination of its weighted inputs. Without thresholds, every output of the Net is therefore itself just a linear combination of the inputs.

The perceptron can represent a vector in an N dimensional space, where N is the number of inputs. If you add two/three/a billion vectors together you just get another vector.

So the entire Net only has the power to represent a vector for each of its outputs.

This means that it can be replaced by a single node for each output with no loss of representational ability.
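The collapse JC describes is easy to check numerically. Here's a minimal sketch (layer sizes and NumPy are my own choices, not from the thread): a three-layer net of threshold-free perceptrons is just three matrix multiplies, and multiplying the weight matrices together gives a single equivalent layer with one node per output.

```python
import numpy as np

rng = np.random.default_rng(0)

# A 3-layer "neural net" whose perceptrons just sum weighted inputs
# (no thresholds / activation functions). Sizes are arbitrary for
# illustration: 4 inputs -> 8 hidden -> 8 hidden -> 2 outputs.
W1 = rng.normal(size=(8, 4))
W2 = rng.normal(size=(8, 8))
W3 = rng.normal(size=(2, 8))

def deep_net(x):
    # Each layer is a pure weighted sum, so the whole net is
    # three chained matrix-vector products.
    return W3 @ (W2 @ (W1 @ x))

# Collapse the entire net into one weight matrix: effectively a
# single perceptron per output, exactly as the answer claims.
W_single = W3 @ W2 @ W1   # shape (2, 4)

x = rng.normal(size=4)
print(np.allclose(deep_net(x), W_single @ x))  # True: same outputs
```

However many layers you stack, the product of the weight matrices is still just one matrix, so the deep net can never represent anything the single collapsed layer can't.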

sperry 2005-10-21 09:57 AM

Cool, looks like we were on the right track for the answer, but I like the way it's phrased in JC's solution better.

cody 2005-10-21 09:59 AM

Damn, so close...

sonicsuby 2005-10-21 11:22 AM

Quote:

Originally Posted by ScottyS
Because you know that a surplus of grad students creates a scarcity of interesting projects --- as they all vie to promote their own project as critical to the advancement of science, when in reality it's just another useless niche with great buzzwords.

I think this is the real answer.



Powered by vBulletin® Version 3.8.4
Copyright ©2000 - 2025, Jelsoft Enterprises Ltd.
All Content Copyright Subaru Enthusiasts Car Club of the Sierras unless otherwise noted.