It depends on which textbook you're using and what you're willing to give up in order to define it (hopefully you're gaining more than you lose).
In general, the answer is "you can't divide by zero;" however, if we stop there, then we're ignoring the fact that mathematics is often defined to work how we want it to (within reason).
Now, 0 has no sign. That is, it's neither positive nor negative, so if we do want to define 1/0, we probably don't want to give it a sign either (or we'll lose a lot more than we want). Therefore, I'm going to go ahead and define 1/0 = infinity. This is not the positive infinity or negative infinity you see, for instance, in limits; rather, it can be thought of as a completely different concept altogether: a single, unsigned point at infinity.
If you want me to explain this further, then mention it in your additional details. Otherwise, I'll leave this topic by showing one of the most important settings where mathematicians have already committed this classical taboo, called the Riemann sphere:
http://en.wikipedia.org/wiki/Riemann_sphere
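For concreteness, here is a sketch of the standard arithmetic conventions on the Riemann sphere (the complex numbers plus that single unsigned infinity). Note that even here, a few expressions are deliberately left undefined:

```latex
\frac{z}{0} = \infty \quad (z \neq 0), \qquad
\frac{z}{\infty} = 0 \quad (z \neq \infty),
\]
\[
z + \infty = \infty \quad (z \neq \infty), \qquad
z \cdot \infty = \infty \quad (z \neq 0),
\]
\[
\text{still undefined: } \quad \frac{0}{0}, \quad \frac{\infty}{\infty}, \quad 0 \cdot \infty, \quad \infty + \infty.
```

So even after we "pay" for 1/0, expressions like 0/0 remain undefined; the trade-off buys us some divisions by zero, but not all of them.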
Again, I really want to emphasize that we only want to define 1/0 _if it is useful_ . Without such a definition, the conclusion one of the other answerers reached is entirely true: 1/0 is undefined.
(Does it make sense now, why we call it "undefined" and not just "senseless"?)