Ok here goes:
The dewpoint temperature describes the moisture content of the air better than relative humidity does. Relative humidity depends on the air temperature too, so it doesn't tell the same story.
Since your windows were open, let's assume the moisture content of the air, that is, the dewpoint, is the same inside and outside.
The dewpoint will always be less than (or equal to) the air temperature. The closer the air temperature is to the dewpoint temperature, the higher the relative humidity. So it stands to reason that your cooler house will have a higher RH than the outside air, since the dewpoint depression (air temperature minus dewpoint) is greater in the outside air than in the inside air.
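To make that concrete, here's a quick sketch using the Magnus approximation for saturation vapor pressure (a standard textbook formula; the specific temperatures below, 68F inside, 86F outside, shared 59F dewpoint, are made-up numbers just for illustration):

```python
import math

def saturation_vapor_pressure(temp_c):
    """Magnus approximation (hPa), accurate to well under 1% at everyday temperatures."""
    return 6.112 * math.exp(17.67 * temp_c / (temp_c + 243.5))

def relative_humidity(temp_f, dewpoint_f):
    """RH (%) from air temperature and dewpoint, both in Fahrenheit."""
    t_c = (temp_f - 32) * 5 / 9
    td_c = (dewpoint_f - 32) * 5 / 9
    return 100 * saturation_vapor_pressure(td_c) / saturation_vapor_pressure(t_c)

# Same moisture content (59F dewpoint) inside and out, different temperatures:
inside = relative_humidity(68, 59)   # cooler house -> smaller dewpoint depression
outside = relative_humidity(86, 59)  # warmer outdoors -> larger depression
print(f"inside: {inside:.0f}% RH, outside: {outside:.0f}% RH")  # about 73% vs 40%
```

Same dewpoint, wildly different RH numbers, which is exactly why the dewpoint is the better moisture yardstick.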
The relationship between air temperature, dewpoint temperature, and RH is not linear. When the air temperature equals the dewpoint temperature, RH = 100%. Colder air reaches 100% RH more easily than warmer air, like in winter (but notice it doesn't necessarily feel humid when it's 30F out). Dewpoints have trouble getting above 80F (not impossible by any means), so if the temperature is in the 90s, you're not going to hit 100% RH, and it will rarely happen in the 80s either.
Pet peeve -- people who say it was 90 degrees (or hotter) with 100% RH. That's pretty much a physical impossibility. A humid summer day in the Deep South with T = 95F and a dewpoint of 75F, which is a very sticky and uncomfortable day, will have an RH under 60%.
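For what it's worth, the same Magnus-approximation arithmetic (a sketch, not a precision calculation) backs that up, 95F air with a 75F dewpoint works out to roughly half saturation:

```python
import math

def rh_from_f(temp_f, dewpoint_f):
    """RH (%) via the Magnus approximation, inputs in Fahrenheit."""
    es = lambda t_c: 6.112 * math.exp(17.67 * t_c / (t_c + 243.5))  # hPa
    to_c = lambda f: (f - 32) * 5 / 9
    return 100 * es(to_c(dewpoint_f)) / es(to_c(temp_f))

print(f"{rh_from_f(95, 75):.0f}% RH")  # about 53%, well under 60%
```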
You can't really draw any conclusions about your insulation from this data.