The first stars formed out of primordial gas clouds about 300 million years after the Big Bang, when the universe was still dark. They were made of nothing but hydrogen and helium, with zero "metallicity." In astrophysics, a "metal" is any element heavier than helium.
The first stars are believed to have been huge, hundreds of solar masses, because the clumping of matter had only just begun. The early universe was remarkably homogeneous, with only tiny deviations from a smooth distribution of matter. Slowly, these deviations grew and condensed into local pockets of gas. The process took a huge amount of time, because gravity is relatively weak until a lot of matter has already piled up in one place.
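One way to see why metal-free gas should collapse into such massive objects is the Jeans mass, the smallest cloud that gravity can collapse against thermal pressure: hotter gas resists collapse, so it fragments into bigger pieces. Primordial gas could cool only through molecular hydrogen, which stalls around a few hundred kelvin, while today's metal- and dust-enriched clouds cool to about 10 K. The sketch below uses the standard Jeans-mass formula with illustrative temperatures, densities, and molecular weights (all assumed values, not from the text):

```python
import math

# Physical constants (SI units)
k_B   = 1.380649e-23   # Boltzmann constant, J/K
G     = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
m_H   = 1.6735e-27     # hydrogen atom mass, kg
M_sun = 1.989e30       # solar mass, kg

def jeans_mass(T, n, mu):
    """Jeans mass (kg) for gas at temperature T (K), number
    density n (cm^-3), and mean molecular weight mu."""
    rho = mu * m_H * n * 1e6  # convert cm^-3 to a mass density in kg/m^3
    return ((5 * k_B * T / (G * mu * m_H)) ** 1.5
            * (3 / (4 * math.pi * rho)) ** 0.5)

# Primordial cloud: H2 cooling stalls near ~200 K (assumed values)
M_primordial = jeans_mass(T=200, n=1e4, mu=1.22) / M_sun

# Present-day molecular cloud core: metals and dust cool it to ~10 K
M_today = jeans_mass(T=10, n=1e4, mu=2.33) / M_sun

print(f"Primordial Jeans mass: ~{M_primordial:.0f} solar masses")
print(f"Modern Jeans mass:     ~{M_today:.0f} solar masses")
```

With these inputs the primordial cloud's minimum collapsing mass comes out of order a thousand Suns, hundreds of times larger than for a cold modern cloud core, which is the essence of why the first stars could be so massive.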
The first stars are called "Population III" stars, in contrast to the Population II stars that came after them, and Population I stars like our Sun. These later generations have much higher metal content, which influences their dynamics in important ways. Today, a star much more massive than about 150 Suns could not exist: carbon, nitrogen, and oxygen in the core catalyze hydrogen fusion (the CNO cycle), and the resulting radiation pressure would blow the star apart before it had a chance to finish forming.
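The reason CNO catalysis is so disruptive is its extreme temperature sensitivity. Near solar-core temperatures the p-p chain's rate scales roughly as T⁴, while the CNO cycle's scales roughly as T¹⁷ (these exponents are standard approximate values, not from the text), so a modestly hotter core makes the CNO cycle's energy release explode:

```python
# Illustrative power-law scalings near a solar-core temperature of
# ~15 million K (assumed, approximate exponents):
#   p-p chain rate ~ T^4
#   CNO cycle rate ~ T^17
def relative_rate(T_ratio, exponent):
    """Reaction rate relative to the baseline for a core hotter by
    T_ratio, under a simple power-law approximation."""
    return T_ratio ** exponent

hotter = 2.0  # a core twice as hot
pp_boost = relative_rate(hotter, 4)    # doubles T: rate up 2^4 = 16x
cno_boost = relative_rate(hotter, 17)  # doubles T: rate up 2^17 = 131072x

print(f"p-p boost:  {pp_boost:.0f}x")
print(f"CNO boost:  {cno_boost:.0f}x")
```

Doubling the core temperature speeds the p-p chain up 16-fold but the CNO cycle by a factor of over 100,000, which is why a metal-rich core's energy output runs away in a very massive protostar.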
But not so with the first stars. These things were massive. Using the Spitzer Space Telescope, scientists believe they may have caught a small glimpse of the collective glow of these stars. Without heavy elements in their cores, the first stars fused hydrogen through the slow proton-proton (p-p) chain. Still, their extreme mass made their centers very dense and hot, accelerating the reactions, so the first stars probably existed for no more than a million years or so. Due to their extreme distance, we probably won't be able to observe them individually until telescope technology improves significantly.
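The million-year figure can be sanity-checked with a back-of-envelope estimate. A very massive star shines near the Eddington luminosity, which grows linearly with mass, so fuel and luminosity scale together and the burning lifetime becomes roughly mass-independent. The sketch below assumes an illustrative 10% of the star's hydrogen is burned at 0.7% mass-to-energy efficiency (both assumed round numbers):

```python
import math

# Physical constants (SI units)
G       = 6.674e-11    # gravitational constant
c       = 2.998e8      # speed of light, m/s
m_p     = 1.6726e-27   # proton mass, kg
sigma_T = 6.652e-29    # Thomson scattering cross-section, m^2
M_sun   = 1.989e30     # solar mass, kg
YEAR    = 3.156e7      # seconds per year

def eddington_luminosity(M):
    """Eddington luminosity (W) for mass M (kg): the point where
    radiation pressure on electrons balances gravity."""
    return 4 * math.pi * G * M * m_p * c / sigma_T

def fuel_lifetime(M, burn_fraction=0.1, efficiency=0.007):
    """Rough hydrogen-burning lifetime (years) at the Eddington
    limit; burn_fraction and efficiency are assumed values."""
    energy = burn_fraction * efficiency * M * c ** 2
    return energy / eddington_luminosity(M) / YEAR

# Because L_Edd scales linearly with M, mass cancels and any very
# massive star gets roughly the same lifetime:
t_200 = fuel_lifetime(200 * M_sun)
print(f"Eddington-limited lifetime: ~{t_200:.0e} years")
```

Under these assumptions the answer lands in the few-hundred-thousand-year range whatever the mass, consistent with lifetimes well under a million years for the most massive Population III stars.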