Cryptography is, in many respects, as old as communication itself, with people (and even animals) developing mechanisms to shield messages from those who are not trusted. One of the most common schemes to have stood the test of time is the Caesar Cipher. The Caesar Cipher is a basic substitution approach, replacing each letter of the alphabet with the letter n positions away. So with a shift of 3, A would become D, B would become E, and so on. Pretty simple to use, but obviously simple to reverse too.
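The shift described above can be sketched in a few lines of Python. This is a toy illustration of the substitution, not anything resembling a secure cipher:

```python
def caesar(text, shift):
    """Shift each letter by `shift` positions, wrapping around the alphabet."""
    result = []
    for ch in text:
        if ch.isalpha():
            base = ord('A') if ch.isupper() else ord('a')
            result.append(chr((ord(ch) - base + shift) % 26 + base))
        else:
            result.append(ch)  # leave spaces and punctuation untouched
    return ''.join(result)

print(caesar("ABC", 3))   # A -> D, B -> E, C -> F, so prints "DEF"
print(caesar("DEF", -3))  # decrypting is just shifting back: "ABC"
```

Note how reversal is trivial: an attacker only has to try 25 shifts, which is exactly why the scheme is so easy to break.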
Modern-day cryptography is generally broken into two areas: symmetric and asymmetric. Symmetric encryption uses the same key to both encrypt the plaintext and decrypt the ciphertext. Again, this is nice and simple to implement, but no matter how complex the key, if it is stolen, the message can be easily decrypted back into the original plaintext.
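The defining property, one key doing both jobs, can be demonstrated with a deliberately simple XOR cipher. This is purely illustrative; real symmetric systems use vetted algorithms such as AES:

```python
import itertools

def xor_cipher(data: bytes, key: bytes) -> bytes:
    """Toy symmetric cipher: XOR each byte with a repeating key.
    Illustrative only -- real systems use vetted algorithms like AES."""
    return bytes(b ^ k for b, k in zip(data, itertools.cycle(key)))

key = b"shared-secret"
ciphertext = xor_cipher(b"attack at dawn", key)

# The SAME key reverses the operation -- that is what makes it symmetric,
# and why a stolen key compromises every message encrypted with it.
plaintext = xor_cipher(ciphertext, key)
print(plaintext)  # b'attack at dawn'
```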
Over time, asymmetric encryption has become popular, mainly through the implementation of public key infrastructures (PKI). PKI requires two keys: a public key, generally used to encrypt messages, and a private key used to decrypt them. The private key, as the name suggests, is kept secret, generally password-protected and local to the decryptor. The public key is made available to whoever wants to encrypt a message to the recipient.
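The public-encrypts, private-decrypts relationship can be shown with textbook RSA using deliberately tiny numbers. This sketch is hopelessly insecure (real keys are thousands of bits and use padding); it exists only to make the two-key mechanics concrete:

```python
# Textbook RSA with tiny numbers -- insecure, purely to show that the
# public key (e, n) encrypts and only the private key d decrypts.
p, q = 61, 53
n = p * q                 # modulus, part of both keys
phi = (p - 1) * (q - 1)
e = 17                    # public exponent, coprime with phi
d = pow(e, -1, phi)       # private exponent: modular inverse of e (Python 3.8+)

message = 65
ciphertext = pow(message, e, n)    # anyone holding (e, n) can encrypt
decrypted = pow(ciphertext, d, n)  # only the holder of d can decrypt
print(decrypted)  # 65 -- the original message recovered
```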
A common mistake is to use the terms encryption and hashing interchangeably. Hashing is a one-way function that takes a variable-sized piece of plaintext and creates a fixed-size block of data that is unreadable to the human eye. The hashing function should be such that no two pieces of plaintext produce the same hash digest; this property is known as collision resistance. It should be impossible to retrieve the plaintext from a well-designed hash, hence hashing is often used for password storage. To confirm password equality, an entered password is passed through the hash function and compared to the stored hash, as opposed to decrypting an encrypted value and comparing in plaintext. Encryption can be reversed; hashing, in theory, is irreversible.
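That hash-and-compare flow can be sketched with Python's standard `hashlib`. Note that a bare SHA-256 digest, as shown here, is a minimal illustration only; production password storage should use a salted, deliberately slow function:

```python
import hashlib

def hash_password(password: str) -> str:
    """One-way: the digest cannot be reversed back to the password."""
    return hashlib.sha256(password.encode()).hexdigest()

stored = hash_password("hunter2")  # what the database actually keeps

# Verification re-hashes the candidate and compares digests --
# at no point is the stored value "decrypted".
print(hash_password("hunter2") == stored)  # True
print(hash_password("wrong") == stored)    # False
```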
Whilst there are attempts at breaking both PKI and hashing infrastructures (rainbow tables are often seen as the most plausible way of breaking an unsalted hash), encryption infrastructures are often only as strong as their weakest link.
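Salting is the standard defence against rainbow tables: a unique random value mixed into each hash means a table precomputed against unsalted digests is useless. A minimal sketch, using the standard library's `pbkdf2_hmac` (the iteration count here is illustrative):

```python
import hashlib
import os

def hash_with_salt(password: str, salt: bytes = None):
    """Salted, slow hash: a rainbow table built for unsalted SHA-256
    digests is useless against these values."""
    if salt is None:
        salt = os.urandom(16)  # fresh random salt per password
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return salt, digest

salt, stored = hash_with_salt("hunter2")

# The same password hashed again gets a different salt, so a completely
# different digest -- precomputation buys the attacker nothing.
_, other = hash_with_salt("hunter2")
print(stored != other)  # True

# Verification reuses the stored salt alongside the stored digest.
_, check = hash_with_salt("hunter2", salt)
print(check == stored)  # True
```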
There is more to a robust encryption or hashing infrastructure than just the strength of the algorithms and functions being used.
Human factors play a large role in this infrastructure too. How are keys being stored? What happens to decrypted data once it has been read or used? Are any keys or unencrypted data stored in temporary files anywhere?
If SSL/TLS is applied to secure a website, that level of security can be undermined if the underlying database is not secured, or is accessible via Telnet or FTP, for example.
Whilst the encryption of sensitive data, both at rest and in transit, is a key part of information security, the people, process and technology aspects of such an infrastructure mustn't be ignored or deemed less significant.
You are only as strong as your weakest link, as was perfectly exemplified by the breaking of the Enigma machine during World War II, when a huge breakthrough occurred simply due to German operator error.
Don't let that operator exist in your organisation.