Food spoilage and food contamination are often lumped together under the single label of food “going bad,” but they are actually quite different. Both render food unfit for consumption, yet they differ in cause and detection. Here are some of the key differences.
Food spoilage refers to the process of food becoming unfit to eat due to changes in its chemical or physical properties. For example, bacteria may begin to grow within or on the product. You can see some of the most common instances of food spoilage in dairy products. A gallon of milk turning lumpy and giving off a horrid smell would be an example of food spoilage, as would mold beginning to grow on cheese.
Products at risk of food spoilage are marked with an expiration date or best-by date to indicate when the food is likely to spoil. Determining these dates is one of the many applications of chromatography, which can analyze a product to estimate how long its organic components will remain stable.
The biggest difference between food spoilage and food contamination is that spoilage typically occurs naturally over time, whereas contamination results from something entering the food that should not be there. Food can become contaminated at any point in production: harmful pesticides in the soil, chemicals introduced during manufacturing or packaging, or even environmental and microbial agents unfit for ingestion.
Chromatography methods can also test for food contamination, determining whether the food has come into contact with any substances it should not have.
In short, the difference between food spoilage and food contamination lies in how the food becomes unfit to eat: natural spoilage over time versus contact with harmful contaminants. Either way, both present dangers we should be wary of, such as food poisoning.