The emergence of the first stars in the universe must have been quite a spectacular sight. The standard lore is that the first stars were probably massive, bright, and short-lived.
As remarkable as these stars, and the second generation of stars that followed them, must have been, they would all have died very young, exploding a scant million years after they formed. Fortunately for us, they left behind traceable signatures of their existence.
You see, the bright light from these stars shone out to vast distances in all directions. Wherever the light encountered a hydrogen atom, which was very nearly everywhere, it would strip the electron from that atom, ionizing it. A bubble of ionized hydrogen would thus surround each star out to great distances.
As an analogy, imagine unknowingly walking across a sidewalk of freshly poured wet cement. Your footprints would be embedded in the drying concrete, and long after you left the scene to buy new shoes, the hardened sidewalk would still bear them, evidence that you had once been there.
Likewise, the first generations of stars ionized the hydrogen gas in their surroundings. Because that gas is so diffuse, the time it takes for an electron to be reunited with its proton is very long, and the hydrogen persists in its ionized state.
Thus, even after a star had died and disappeared from view, that tell-tale “footprint” of ionized hydrogen would remain as evidence that a hot star was once there. Eventually such stars would come to ionize _all_ the spare hydrogen between the stars throughout the universe, a process that we call “reionization.”
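To get a rough feel for just how long that recombination takes, here is a back-of-the-envelope estimate in Python. The numbers used (a mean hydrogen density of roughly 2×10⁻⁷ atoms per cubic centimeter today, and a recombination coefficient of about 2.6×10⁻¹³ cm³/s for gas near 10,000 K) are approximate textbook values, not figures from this article, so treat the result as an order-of-magnitude sketch rather than a precise calculation.

```python
# Rough order-of-magnitude sketch: how long does an ionized hydrogen atom in
# the diffuse gas between galaxies take to find an electron again?
# Timescale: t_rec ~ 1 / (n_e * alpha_B), with illustrative (assumed) values.

ALPHA_B = 2.6e-13          # recombination coefficient at ~10^4 K [cm^3 / s] (approx.)
N_H_TODAY = 2e-7           # mean hydrogen number density today [cm^-3] (approx.)
SECONDS_PER_GYR = 3.15e16  # seconds in a billion years

def recombination_time_gyr(z: float) -> float:
    """Recombination timescale (in Gyr) for mean-density gas at redshift z."""
    n_e = N_H_TODAY * (1.0 + z) ** 3   # density was higher in the past, scaling as (1+z)^3
    t_rec_seconds = 1.0 / (n_e * ALPHA_B)
    return t_rec_seconds / SECONDS_PER_GYR

if __name__ == "__main__":
    for z in (10, 7, 3):
        print(f"redshift z = {z:2d}: t_rec ~ {recombination_time_gyr(z):.1f} billion years")
```

With these assumed values the answer comes out to hundreds of millions to billions of years, comparable to the age of the universe at those early epochs, which is why the ionized “footprint” lingers long after the star itself is gone.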
Astronomers can work out the epoch during which most of this hydrogen was ionized, which, in turn, tells them when those tell-tale first stars walked their path across the sky.