What is information? Entropy, both the physical kind and Shannon's, is a quantitative measure of the amount of raw information a system contains or is able to contain. Information here can be read as data: zeroes and ones. Consider a volume of space. Bekenstein showed that the maximum amount of information, of zeroes and ones, that can fit inside is proportional to the bounding surface of the volume. The maximum is actually reached when the mass inside the volume reaches that of a black hole whose Schwarzschild radius equals the radius of a sphere with that volume. The number of Planck areas (about 10^-70 square meter each) on that surface can be compared with the number of zeroes and ones on a memory chip. A gas in a container can carry a fairly large number of ones and zeroes. But compare this with the number of ones and zeroes that could potentially be contained in a spherical container with a surface of one square meter: about 10^70, which can only be reached if there is enough matter inside to form a black hole with a Schwarzschild radius of about 0.3 meter.
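As a rough check of the numbers quoted above, here is a small Python sketch (not part of the original text) that computes the Planck area from standard SI constants, counts how many Planck areas tile a one-square-meter surface, and finds the mass of a black hole whose Schwarzschild radius matches a sphere with that surface. It is an order-of-magnitude estimate only; the exact Bekenstein-Hawking bound carries an extra numerical factor.

import math

# Physical constants (SI units)
hbar = 1.054571817e-34   # reduced Planck constant, J*s
G    = 6.67430e-11       # gravitational constant, m^3 kg^-1 s^-2
c    = 2.99792458e8      # speed of light, m/s

# Planck area: the "pixel size" of the holographic surface
planck_area = hbar * G / c**3                  # ~2.6e-70 m^2

# A spherical container with a bounding surface of one square meter
surface = 1.0                                  # m^2
radius  = math.sqrt(surface / (4 * math.pi))   # ~0.28 m

# Number of Planck areas tiling that surface: order of magnitude of
# the maximum number of zeroes and ones the enclosed volume can hold
n_planck_areas = surface / planck_area         # ~4e69, i.e. of order 10^70

# Mass for which this radius is the Schwarzschild radius (r_s = 2GM/c^2),
# i.e. the enclosed matter just forms a black hole
mass = radius * c**2 / (2 * G)                 # ~1.9e26 kg

print(f"radius of the sphere:      {radius:.3f} m")
print(f"Planck area:               {planck_area:.2e} m^2")
print(f"Planck areas on surface:   {n_planck_areas:.2e}")
print(f"mass of the saturating BH: {mass:.2e} kg")

Running this gives a radius of roughly 0.28 meter and about 4 x 10^69 Planck areas, consistent with the "about 0.3 meter" and "10^70" figures in the text.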
What that information is actually about is a different question. The zeroes and ones in a computer are just carriers; what they represent depends on the load we give them. Likewise, the maximum information on the surface of a black hole (which is temporarily entangled with the matter inside, resolving the information paradox) doesn't mean anything as long as we don't know what the hole was formed from. The surface of a black hole made from a supermassive bike and that of one made from a giant rabbit of the same mass contain the same amount of information, but it is obvious that the two pieces of information point to quite different objects, the bike and the rabbit.