Here's a little something newer than Apollo that reflects poorly upon something quite old:
"Mars is continuously bombarded by radiation from the galaxy at large, as well as by periodic bursts from the sun. The radiation would expose astronauts in orbit to an effective dose 2.5 times greater than that received by humans in low Earth orbit aboard the international space station, Zeitlin said."
Suppose Skylab were referenced for the low Earth orbit radiation exposure:
As such, there was hardly any significant solar radiation to begin with, especially while working those limited EVAs in Earth's shadow. Skylab started off as of 14 May 1973 at an altitude of 435 km, down to 305 km by December 1982. Skylab 4 (84 days' worth, the highest skin dose) accumulated 17,800 mrem for the mission, or 212 mrem/day, including those four short (I believe partially Earth-shaded) EVAs while situated near 350 km.
According to that more recent space radiation report pertaining to the Mars environment: 2.5 × Skylab's 212 mrem/day = 530 mrem/day while in Mars orbit. And that figure already benefits from a mere 50% solar exposure, from a spacecraft that's far better shielded than Apollo ever was, and from being much further from the sun than our moon (how about at least 50+ million miles further), unlike those fully solar-exposed Apollo missions. I'm not your space radiation expert like wizard Jay of http://clavius.org, but I'd have to believe a fixed, 100% solar-exposed Mars orbit, such as MS-L1, would run perhaps at least 1000 mrem/day (ouch!).
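For anyone wanting to check that arithmetic rather than take my word for it, here's a quick Python sketch; the 2.5× factor and the Skylab 4 figures are as quoted above, while the final doubling for a fully solar-exposed orbit is purely my own guesswork:

```python
# Skylab 4: 17,800 mrem accumulated over an 84-day mission (highest skin dose)
skylab_daily_mrem = 17800 / 84               # ~212 mrem/day

# Zeitlin's reported 2.5x factor for Mars orbit vs low Earth orbit
mars_orbit_mrem = 2.5 * skylab_daily_mrem    # ~530 mrem/day

# My own rough guess (hypothetical): double it for a fixed 100% solar-exposed orbit
full_sun_guess_mrem = 2 * mars_orbit_mrem    # ~1060 mrem/day, the "at least 1000" above
```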
Another sore point of view, or perhaps another misunderstanding, comes of comparing against Earth L2, whereas Earth L4/L5 are certainly worthy of being classified as radiation hot zones. The lunar surface is worse yet, in that it surrounds you with terrain sufficiently dense to create a good deal of secondary radiation, as in X-rays. Wizard Jay, on the other hand, seems to know otherwise; he'll inform you that the moon, as well as other places like EL4/EL5, are just the opposite, and that the surface of the moon in particular offers nearly zilch worth of solar/cosmic radiation, especially if you're questioning anything Apollo.
Fortunately that's both good and bad news. Good, in that it's certainly great news for supporting all those cold-war Apollo ruse/sting missions that managed to endure as little as 20 mrem/day (actually 10 mrem/day in free space travel, if you exclude the Van Allen zone of death and any sort of lunar EVAs). But if it's your butt that's on the line, as in traveling to/from the moon or to such other terrific/horrific destinations as frozen and radiated-to-death Mars, you could be seriously irradiated into being DOA, or soon thereafter dying miserably, as a result of what the URL clavius.org has to propose. Whereas my village idiot advice is this: do not go anywhere based upon the skewed science and skewed physics involved with anything Apollo.
I'll concur that the lunar surface, comprised of 3.41 g/cc material, is hardly as dense as aluminum. OOPS! At 2.70 g/cc, aluminum is apparently the considerably less dense of the two, whereby the secondary radiation affecting EVA-exposed astronauts should have been greater, not lesser, or at the very least similar to whatever they'd have received inside the lander, and perhaps worse off behind the thicker command module aluminum shield, since mass of greater density is exactly the primary cause of secondary radiation. That's why hydrogen offers the more ideal substance for a shield, creating the least of those nasty X-rays (volume wise you'll certainly need lots of H).
Lunar space elevator, and the CM/ISS bites into the ESE fiasco:
As a possible solution, or at least a darn good means to an end for obtaining sufficient shielding, preferably of that which becomes sufficiently dirt cheap and made available in orbit, I've undertaken the little task of studying the possibilities of a lunar space elevator. These two following pages might just do the trick (the 2nd page offering fewer words):
Unlike all the nice (pro-NASA) folks at clavius.org, acting as bonafide radiation engineers/experts who spend their every waking hour defending a certain pagan god, I've taken on quite a bit of a learning curve, as well as a great deal of flak (including theirs), simply because I wanted to learn something of what's possible about Venus L2, as opposed to Earth L2, which means having to learn about and/or discover what our L4/L5 environment has to offer. Who would have thought that such a simple quest for radiation data would have been met with such big-gun obstruction and/or disinformation damage control? And where exactly is all this ulterior motivation coming from, and why?
I've learned from others that the bulk density of the Moon is reported at more or less roughly 3.41 g/cc, whereby 2 meters' worth of that mysterious lunar clumping soil and mixed lunar rock are supposedly worth 682 g/cm2. BTW: those Apollo supporters have never once utilized a hand-operated post hole digger in the desert, where there's a thousand times more moisture than on the moon, which is where all that highly reflective, Portland-cement-like clumping lunar soil exists.
According to Henry Spencer (another space techno god), locations like L4/L5 will take at least 2 meters' worth of said density (682 g/cm2) in order to block the bulk of cosmic radiation as well as fend off whatever secondary radiation, while his statement of requiring 3 meters' worth suggests roughly 1024 g/cm2. What I've otherwise learned, and have come to believe is a reasonably fair assumption for long-term survival on the lunar surface, or equally for open but sufficiently shielded space travel, or for whatever's similar to the environment existing at Earth L4/L5, may quite well necessitate using all of that 1024 g/cm2 and then some; at least I'd previously come up with that figure without any input from Henry Spencer.
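Those areal (column) density figures are just depth times bulk density; a minimal sketch, assuming the 3.41 g/cc lunar bulk density quoted above:

```python
def areal_density_g_cm2(depth_m: float, bulk_density_g_cc: float) -> float:
    """Column density in g/cm^2: depth converted to cm, times density in g/cc."""
    return depth_m * 100.0 * bulk_density_g_cc

two_meters = areal_density_g_cm2(2, 3.41)    # 682 g/cm^2, matching the figure above
three_meters = areal_density_g_cm2(3, 3.41)  # 1023 g/cm^2 (the text rounds to 1024)
```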
Though for some reason the likes of those folks at clavius.org would have you and myself believing that the 100% solar exposure (24/7) environment at Earth L4/L5, and/or on the solar-illuminated moon, is simply nothing so horrific, not even during a solar maximum phase nor a solar minimum; at least not such that 10 g/cm2 wouldn't more than suffice, especially as "proven" when those Apollo astronauts acquired as little as 20 mrem/day with an overall shield average of something below 5 g/cm2. If you include the aluminum foil lander plus those 36 hours' worth of EVAs, I believe their overall mission shield average could drop to nearly 2 g/cm2. At 20 mrem/day, that's only 7.3 rem per year, a bloody walk in the park considering the mere 5 g/cm2 or less of average shielding involved. Good grief folks, that's not even 1/10th of what's to be encountered at L2 while situated behind 10 g/cm2, and L2 is 85% blocked by Earth plus some X% amount blocked and/or deflected by our surrounding magnetosphere.
It's apparent to me that one must apply a reasonable safety factor of up to 2.5 times greater shield density in order to elude, or at least significantly diminish, the secondary radiation that comes of having to deal with whatever solar minimum has to offer, as opposed to what's created by anything solar maximum. Ideally there'd be the benefit of some distance between a primary (outer) shield and an inner shield, with the void in between filled with perhaps light water or better yet hydrogen, though that's likely too much to ask for. Because of the sheer bulk and/or volume involved, rather than the compact mass/density of solids like aluminum, volume soon becomes the greater difficulty in getting such substances into orbit, let alone headed off to visit another moon or planet.
What this 1024 g/cm2 requirement for annually shielding oneself from the lunar as well as any L4/L5 environment tells me, is that the 24/7 solar influx upon the L4/L5 environment is indeed something much worse off than Earth L2; likewise the solar-illuminated moon offers a sufficiently radiated hot spot as compared to Earth L2, especially if you add in those contributing secondary radiation factors.
Say an annual dosage worth of exposure at L2 = 92 rem/year while you're situated behind 10 g/cm2 (not to mention while being 85% behind Earth plus some X% behind our magnetosphere); obviously that's not going to be survivable by itself unless you've got banked bone marrow standing by. Yet on the other hand it's oddly being specified, by the likes of Henry Spencer, that L4/L5 would be in need of as much as 1024 g/cm2. Without my even getting too complicated, that's more than 100-fold greater shielding for presumably accommodating something that's supposedly, according to those nice folks at clavius.org, a whole lot less than the 92 rem/year. If we assume that 1024 g/cm2 might accommodate you with 9.2 rem/year, which most of us could certainly learn to live with (even though we wouldn't live quite as long), I believe that still represents at minimum a 10-fold, if not 100-fold, greater radiation influx over what's at L2, due in part to there being no damn partial planetary shade as situated at L2. The same goes for those Apollo missions: no stinking shade whatsoever, plus many hours' worth of the Van Allen zone of death, plus all the secondary TBI induced by solar minimum as well as a few modest flares' worth of solar maximum radiation, combined with those naked 36 hours' worth of EVAs while accommodated in little more than an air-tight jump suit.
Now then, as for speaking of dosage reduction as a ratio of density: L2 at 1 g/cm2 = 2.46×10^3 rem/year, whereas all things being equal, at 10 g/cm2 = 91.8 rem/year. Unless my math is incorrect (always a possibility), that's roughly a 27:1 improvement created by a 10-fold increase in mass. It's not linear either, but grows as you further increase upon density: the next 10-fold seems to represent a 50+ fold improvement, and the subsequent 10-fold increase in mass could very well represent a 100-fold reduction (at this point we're only dealing with secondary radiation issues). So it's certainly not 27 × 27, but more likely a factor of 50 × 100, or a 5000:1 reduction in those nasty secondary X-rays, in going from 10 g/cm2 to 1024 g/cm2.
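Checking that 27:1 claim against the two chart readings (treating both dose figures as given above):

```python
dose_1g_rem_yr = 2.46e3    # rem/year behind 1 g/cm^2, as read from con_x_dose1.pdf
dose_10g_rem_yr = 91.8     # rem/year behind 10 g/cm^2
improvement = dose_1g_rem_yr / dose_10g_rem_yr   # ~26.8, i.e. roughly 27:1
```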
Reductions of mostly secondary radiation (according to con_x_dose1.pdf):
0.1 to 1 g/cm2 = 13:1
1.0 to 10 g/cm2 = 27:1
10 to 100 g/cm2 = 50:1
100 to 1024 g/cm2 = 100:1
Obviously the last two categories represent my own village idiot estimates, though somewhat based upon the previous two density shifts.
If a 10-fold increase in mass, from 1 g/cm2 to 10 g/cm2, effects a 27-fold radiation reduction, then I was at first thinking that perhaps another 10-fold in mass might represent at the very least a 729-fold reduction. More than likely, though, since this chart is clearly going nonlinear (as con_x_dose1.pdf clearly indicates), another 10-fold increase could just as easily exceed 27 × 50, or how about at the very least 1024:1, while a 100-fold might thereby represent 50 × 100, or how about attributing at least 4096:1 worth of reduction in those secondary radiations noted at 10 g/cm2.
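Compounding those per-decade factors (the last two being my own village idiot estimates, remember) gives the overall reductions argued above:

```python
# Reduction factor per (roughly) ten-fold step in shield mass;
# the first two come from con_x_dose1.pdf as I read it, the last two are my guesses
factors = [13, 27, 50, 100]   # 0.1->1, 1->10, 10->100, 100->1024 g/cm^2

from_10_to_1024 = factors[2] * factors[3]   # 50 x 100 = 5000:1, as stated above

across_the_board = 1
for f in factors:
    across_the_board *= f     # 13 x 27 x 50 x 100 = 1,755,000:1 end to end
```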
In other words, again according to Henry Spencer and any number of souls like Henry: if long-term surviving at L4/L5 requires up to 3 meters' worth of density similar to lunar stuff (3.41 g/cc), or perhaps 2 meters of Earth-like substance, that's obviously creating a shield density of 1024 g/cm2, and as such clearly indicates that the radiation (including the subduing of those secondary X-rays) will have to be cut to a survivable rate of hopefully well below 9.2 rem/year, as opposed to L2 at 10 g/cm2 being a rather testy 92 rem/year. I'm not sure what their annual dosage target was, but I'd be inclined to vote for not more than 1 rem, which is roughly 1% of what being shielded with 10 g/cm2 at Earth L2 delivers, especially if we're speaking of my lifetime.
That target of achieving 1% of the dosage of the Earth L2 environment behind a mere 10 g/cm2 (92 rem/year × 1% = 920 mrem/year), as opposed to obtaining 1024 g/cm2 worth of moon dirt (3 meters' worth of 3.41 g/cc) for surviving L4/L5, seems to be indicating something on which I simply can't otherwise get a straight answer. Though my reverse engineering should work towards discovering why those intending upon residing at Earth L4/L5 were so requiring of so much mass, of even greater density than what mostly-basalt could accommodate.
The option of accepting a greater annual dosage obviously reduces the L4/L5 need for shielding, though how much dosage is acceptable varies with just about anyone you care to discuss these issues with, especially if it's for a career spanning the 20+ employment years of a given individual. I'll suppose a dosage of 10 rem/year may even become an acceptable risk, whereas a short-term military risk level assessment could certainly run as great as 100 rem/year, that is if they can manage to get their worth out of the individual before his or her organs start dropping off line.
For some thoroughly odd reason, I was previously under the impression or illusion, as kindly provided by all the pro-NASA as well as pro-Apollo camps, that our Van Allen belts or zones were of major (not minor) benefit to our survival, responsible for creating the vast bulk of Earth's shield and thereby achieving our current level of 1 mrem/day exposure. If in fact the Van Allen zone imposes a mere 200:1 benefit, that's certainly worth the effort, as I'll take 1 mrem/day as opposed to 200 mrem any day of the week, month or year, not to mention a lifetime's worth, which wouldn't be all that long if we couldn't manage to adapt/evolve into tolerating such higher dosage. Although, that also says that what's existing beyond the Van Allen zone of death is in fact considerably more irradiated, as in mostly solar radiation hot and nasty, than we've been told, especially for the likes of L4/L5, and the moon itself must represent quite a great deal of Sv/y, like perhaps 10e3 Sv/y.
Who is kidding whom? Any way you care to cut it, there's a good deal of solar/cosmic radiation at L4/L5, and equally more so while situated on the moon, specifically because of those secondary X-rays created from all that clumping soil and rock which do not exist at L4/L5. This one has become another Duh!, as another significant correction in at least my misunderstanding, one that's become a whole lot more clear and believable than what's been published in NASA's Apollo bible, under their chapter of "Space Radiation Is Just Another Walk In The Park".
Once again, I must remind some of you of what I've learned: that solar minimum is worse off than solar maximum, as a solar maximum event actually reduces the secondary radiation intrusion to a great extent. Thus I've learned that solar storms can easily be survived with relatively minimal shielding, though the remainder of any day, week, month or year in open space, or worse off while situated on the moon or even Mars (due in part to all the secondary radiation created by the surface itself), if that environment is at solar minimum, is simply not another walk in the park, not by any long shot. For dealing with solar minimums on any long haul you'll need an overkill of shielding, on the order of 1024 g/cm2, which sounds and computes just about right, unless you're already past due for your TBI prior to receiving that bone marrow transplant, or have nothing to lose because you're either too damn old or you've got something a whole lot worse than radiation that's killing you. For the short haul you might get yourself by on 100 g/cm2, that is if you're striving for something like 35+ mrem/day, which just about any of us can certainly live with for a few weeks, even years if in fact that's all there is. Only with DNA-building steroids and/or other supplements can one survive on 274 mrem/day, and even then you may need to utilize your banked bone marrow come the end of a two-year mission, as not everyone is going to respond favorably to medications and food supplements.
Thus, how in hell's irradiated kitchen those Apollo missions, with their 36 hours' worth of near-naked EVA exposure, managed against solar flare activity, as well as against all that solar-minimum-created secondary X-ray radiation, with an average of as little as 20 mrem/day is simply not in the cards (not even a stacked deck is going to cut it), whereas climbing Mount Everest is more likely to exceed 35 mrem. Not surprisingly, other government agencies, and certainly foreign space agencies, do not base their exposure risk assessments upon anything Apollo, as why should they?
Even solar maximum creates secondary radiation, just not quite as much as solar minimum. However, solar maximum is otherwise horrifically bad news on those EVAs. So, either way, you simply can't win this argument if you're suggesting those Apollo missions benefited from the ongoing solar maximum phase, at least not without otherwise frying those 36 hours' worth of EVAs, and that goes for the film in those essentially unprotected cameras.
Quite obviously my opponents do not compute, they just "spin" and "damage control" as instructed until them cows come home, though as I've previously stated, someone in NASA/NSA/DoD has already eaten them there cows at one of those incriminating document roasting barbecues, so guess what folks, they're never coming home. So obviously no one can budge an inch, as one false move and their entire house of cards is coming down. No wonder those supporting the NASA/NSA/DoD cold-war sting/ruse are not about to accept anything about Venus (good or bad), I wouldn't either, especially after I reread the nondisclosure policy paragraph pertaining to enforcement and lack of any external help should I slip, as in my soul not to mention my ass is theirs to do whatever they see fit.
Here's some of our very own NASA scientific evidence on the Van Allen zone and more.
I've noticed how we never seem to get a straight answer, nor any specific numbers. That'll be especially true of Earth L4 and L5, because whatever EL4/EL5 have to offer is nearly exactly what the moon receives, though the substance of the moon itself is what subsequently creates secondaries, which are mostly of the X-ray class.
Elsewhere on this Earth, there's any number of ongoing tit-for-tats under various topics/subjects, besides this one. GOOGLE: "Moon hoax as American as apple pie"
Otherwise checkout my: https://guthvenus.tripod.com/space-radiation.htm
There's also that good reference report, officially NASA-moderated no less, which tells a great deal more than the pro-Apollo cult wants you to know. Then there's the TRW Space Data report that published research indicating 2×10^5 rem/year, as opposed to the con_x_dose1.pdf report that stipulates 7.42×10^3 rem/year based upon the same shield density of 2+ g/cm2; that's roughly 27 times greater dosage for the mid Van Allen zone than even the NASA new guard report was willing to convey, or perhaps more likely they were providing a Van Allen zone average and not a figure for any specific GEO zone.
Keeping in mind: it's not what's outside the spacecraft that actually matters, it's what's being TBI'd to death (that's you) inside by secondary radiation. That's what the NASA report (con_x_dose1.pdf) offers insight upon, and even more so to the point of what TRW stipulates: within the Van Allen GEO zone, 2×10^3 Sv/year situated behind 2+ g/cm2 (300 mils) of solid aluminum (that's 2×10^5 rem or rads/year, and that's also in an orbit which is only 50% exposed to whatever direct solar flux, the rest being shaded by mother Earth).
The amount of Apollo to/from Van Allen travel-through time was at the very least 4+ hours' worth, but others and myself calculated as much as 7+ hours because, by one conservative estimate and even more so by another, the spacecraft's average travel-through speed was not the 11+ km/sec which many of the pro-Apollo cult try to stipulate, nor was it any straight and/or tangent to/from shot, and it certainly wasn't any polar escape route.
The amount of shield necessary to have pulled their average interior dosage well below 20 mrem/day would have needed to be something like 70 to 100 g/cm2. Keep in mind that those raw EVAs were a bit testy, and the lander itself was little more than aluminum foil, so their hourly accumulations would have been much greater than within their command module.
Atmosphere + space filled with H2 as a shield:
As reference: in somewhat elevated and/or N/S locations on Earth, you and I receive roughly 365 mrem/year, or 1 mrem/day; climb Mount Everest and you'll get lots more.
Obviously 590 km resides sufficiently below the vast majority of the Van Allen soup base, subsequently giving us a starting point that indicates how much further reduction our atmosphere, plus the void/distance in between, accomplishes (like 274,000:1).
L2 is relatively radiation hot, but nowhere as hot as L4/L5:
At Earth L2 (0.0 g/cm2) there's roughly 6×10^5 rem/year, or 1.64×10^3 rem/day; however, L2 is 85% shielded or blocked by mother Earth itself, as well as receiving benefit from Earth's magnetosphere deflecting and/or altering some of the worst that our sun has to offer.
Earth L4 or L5 are indeed relatively hot zones as compared to L2, at least 10-fold and perhaps even as much as 100-fold hotter, because of the full 24/7 solar/cosmic exposure as well as there being absolutely no magnetosphere deflection benefits whatsoever. That makes EL4/EL5 raw exposure worth as much as 6×10^7 rem/year (1.64×10^5 rem/day), that is if you include all of the solar flare attributes for a given solar-active year (why wouldn't you?).
Of course, in a solar minimum year you could cut 75% right off the top, perhaps even 90% if we're excluding some of the worst our sun has to offer; that still leaves 1.5×10^7 to as little as 6×10^6 rem/year (16.4×10^3 rem/day). And wouldn't you just know it, solar minimum is actually what introduces by far the most lethal secondary (X-ray) radiation component inside a shielded craft, or when situated anywhere near almost any such density of material. So you're definitely in a lose-lose proposition unless you've got some fairly terrific speed in order to make your mission as short as possible, and/or you've banked some of your own bone marrow for injection upon your return, as the option of packing along 100 g/cm2 is not very realistic unless it's in the format of a relatively small personal travel pod/coffin (just in case, multi-use as a coffin). An external environment of 6.8×10^2 rem/hr is certainly not worth the EVA risk, unless we're talking about 10 minutes' worth.
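Running those solar-minimum-year numbers for EL4/EL5 (the 6×10^7 rem/year raw figure being my own extrapolation, as admitted elsewhere on this page):

```python
el45_max_year = 6e7                      # rem/year, solar-maximum year (my extrapolation)
min_year_75cut = 0.25 * el45_max_year    # 1.5e7 rem/year with 75% cut off the top
min_year_90cut = 0.10 * el45_max_year    # 6e6 rem/year with 90% cut
per_day = min_year_90cut / 365           # ~1.64e4 rem/day, the 16.4e3 figure
per_hour = per_day / 24                  # ~685 rem/hr, the ~6.8e2 rem/hr EVA figure
```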
Van Allen zone performance as a shield:
If you apply some of your own math you can easily get yourself confused; at least I do. However, a rough idea or comparison can be extrapolated as to the amount of radiation subdued by our atmosphere, as well as that which is subdued by the Van Allen zone (typically 1,000 km out to 70,000 km). Reading from those charts again, it clearly looks as though the atmosphere, plus the space in between Earth's atmosphere and 590 km, does a whole lot more good for us (274,000:1) than any Van Allen zone of death, though the magnetosphere is perhaps best at defending Earth from those solar winds loaded with iron particles and a bloody host of other nasties that interact with one another. That makes the Van Allen zone a relatively poor location to spend any amount of time, because the radiation is coming at you from all directions, plus there's always whatever direct solar flux has in store for you.
Even if we utilized the solar maximum EL4/EL5 figure of 6×10^7 rem/year, against what's existing at 590 km (also at solar maximum) of 3×10^5 rem/year, that's only a ratio of reducing the radiation influx by 200:1. Although, if you consider that what the Van Allen zone is otherwise stopping or modifying is perhaps somewhat more lethal than what our atmosphere stops, that's another issue I'm not prepared to share any data on until I learn more about trapped particle radiation, and about what a good magnetosphere can otherwise absorb and/or deflect in the other categories of solar flux/weather.
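That 200:1 ratio is straightforward to verify (both inputs being the figures quoted above):

```python
el45_raw = 6e7      # rem/year at EL4/EL5, solar max, no Van Allen or atmosphere help
at_590_km = 3e5     # rem/year at 590 km altitude, same solar max
belt_benefit = el45_raw / at_590_km   # 200:1 attributed to the Van Allen zone/magnetosphere
```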
BTW: these are essentially not my numbers, they are mostly NASA numbers, with the exception that since I can't seem to locate anything specific on EL4 or EL5, I've extrapolated the best that I can. If I'm wrong, either tell me what's what or else sue me.
Since such a large number of pro-everything-NASA souls have calculated their odds of surviving space travel radiation exposures based heavily upon those infamous Apollo missions (achieving 35 mrem or less per day), I can't help but wonder upon the moral implications, and perhaps carnage of consequences, if such factors were somehow skewed, as in misleading or intentionally orchestrated in order to justify the outcome of mostly our cold-war "moon race". The same goes for all those lunar images, which not only failed to indicate radiation fogging but oddly offered no indications of thermal stress whatsoever (that's a fairly neat trick for plastic film, unless it was custom spy-grade emulsion applied to even more specialized mylar); even Fuji film is substantially less radiation-affected than Kodak, though just as prone to thermal limits as Kodak.
The rest of this page is offered as intended, for some much needed and ongoing clarifications, of my sharing what I've recently learned about the sorts of reverse logic needed in order to extract and then understand basic space radiation, and subsequently shielded space travel that's offering loads of Total Body Irradiation (TBI) by way of secondary (Bremsstrahlung) X-ray radiation. Namely: more of something solar-flare worthy actually creates less of a shielded dosage issue, whereas less solar activity had previously been thought of as a good thing (not so anymore, as solar minimum and/or the lull periods in between solar flares are by far the most occupant-intrusive).
Well now, let me tell you about this latest round of orchestrated tidbits of entirely unnecessary confusion (unnecessary, that is, unless you're trying to hide something).
Indeed a solar maximum event is nearly always a bad thing, especially for being unshielded; being fully EVA-exposed (near naked) to whatever comes along is truly not a favorable situation, especially during any of those greater solar storm outputs that are preceded by the observation of various sunspots, and/or by sort of giant magnetic sink holes that have been known to emit considerable megatonnes' worth of super-charged iron particles, along with an assortment of other nasty radiation offering a base average of solar flux that'll easily exceed 45 rem per hour and last more than all day long. (I believe I've correctly understood that the 45 rem/hr was based upon the L2+MAP Phasing Loops, which offer an average yearly term with but one solar maximum event taken into account, and that's not even the biggest of events; thus the solar event itself, typically lasting less than a day but potentially representing 75% to as much as 90% of the annual solar output, could easily exceed several thousand rem/hr, especially compounded within certain mid to outer Van Allen zones.) The largest of recorded solar outputs has supposedly exceeded 2600 rem; if that amount were delivered in a 12-hour period, that's by itself an additional 217 rem/hr, plus the usual background of perhaps 18.5 rem/hr, and lo and behold you've got 235.5 rem/hr coming at you. Though even that figure may be tempered by the fact that most satellite and ISS missions involve 12 or fewer hours of daily exposure (not an option at L4/L5), as well as being significantly defended by the Van Allen belts and magnetosphere (also not an option at L4/L5).
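The flare arithmetic spelled out (the 12-hour delivery window being an assumption, as is the 18.5 rem/hr background derived further below):

```python
flare_total_rem = 2600                        # largest recorded solar output, as quoted
flare_rate = round(flare_total_rem / 12)      # 217 rem/hr if delivered over 12 hours
background_rate = 18.5                        # rem/hr usual background (solar minimum figure)
combined_rate = flare_rate + background_rate  # 235.5 rem/hr coming at you
```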
At zero shield, residing at the somewhat cool Earth L2 (85% Earthly shaded, plus still somewhat benefiting from whatever the Van Allen deflected-radiation safety zone has to offer), the raw data clearly indicates a likelihood of as much as 6.5×10^5 rem per year (74.2 rem/hr). That's certainly delivering a whole lot more wallop than previously reported by those Apollo missions, which were nowhere near sufficiently sheltered (possibly 7 g/cm2 for a portion of their mission), nor as offered by the later (1994) SIMM MARS Version 1.00 report, which had stipulated a fully exposed dosage of a rather optimistic or pathetic 400 rem per year, a figure that has also recently been blown way out of the water as another understatement of relatively grand proportions (TRW Space Data = 2×10^3 Sv/year at GEO).
In another of my efforts to establish the least possible external impact (solar minimum): since we're talking about a solitary (one time only) annual solar maximum event, most likely representing a duration of less than 24 hours, and if that event represents as much as 75% of the annual solar flux, we may have the following conclusions to offer:
SOLAR MINIMUM (much worse off for those shielded)
6.5×10^5 × 0.25 = 1.625×10^5 rem/year
1.625×10^5 / 365 = 4.45×10^2 rem/day; / 24 = 18.5 rem/hr (0.185 Sv/hr)
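The SOLAR MINIMUM arithmetic above as a sketch:

```python
l2_raw_year = 6.5e5              # rem/year, unshielded at Earth L2 (solar-max year)
min_year = 0.25 * l2_raw_year    # 1.625e5 rem/year once the one big event is removed
per_day = min_year / 365         # ~445 rem/day
per_hour = per_day / 24          # ~18.5 rem/hr (0.185 Sv/hr)
```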
Confusion reigns; the notable difference between solar maximum and solar minimum, in terms of what gets through 10 g/cm2, is what really matters. According to the file con_x_dose1.pdf, the dosage created by a solar minimum for those shielded by 10 g/cm2 is at least 1.25 times greater than at solar maximum (1.64 times greater as indicated at 590 km, and 1.575 based upon the SIMM MARS report). Thus the hourly dosage based upon the previous SOLAR MINIMUM figure is somewhat misleading, or skewed by a rather large factor, as the unshielded impact of 0.185 Sv/hr is NOT what's important; the shielded, and thereby secondary-radiated, component is the one and only factor, unless you're planning upon doing a few hours' worth of solar TBI therapy via exposed EVAs.
Do remember: this next estimate upon solar minimum is based upon the worst case of what's to be expected if fully exposed at EL2 (excluding any solar flare contributions), NOT that of EL4 or EL5, as either of those L4/L5 locations is situated in a radiation hot zone that ought to be represented by at least another 10-fold increase (similar to the exposure levels that should have been encountered by our Apollo missions). Shielding of 100 g/cm2 will likely be of great help towards fending off this sort of solar minimum, as well as any medium/maximum solar event(s), although it's the remaining bulk solar minimum exposure that'll still be toasting your butt by way of creating secondary radiation.
The L2+MAP chart indicates that 0.0 g/cm2 radiation of 6.5×10^5 rem/year could exist, with a reduction effected by 10 g/cm2 of shielding; from the raw figure the 74.2 rem/hr (0.742 Sv/hr) is derived, of which 10.5 mrem per hour is all that's suggested as getting through, and/or being created as Bremsstrahlung radiation, by the 10 g/cm2 shield itself.
If L2 solar minimum creates 0.1854 Sv/hr external, and that's contributing 10.5 × 1.25 = 13 mrem/hr as shielded by 10 g/cm2, now we're obviously far less radiated than for being naked, even though we've acquired a bit of an accumulation issue from those secondaries.
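Pulling the L2 figures together (the chart readings are as quoted above, so treat them as assumptions):

```python
l2_raw_per_hr = 6.5e5 / (365 * 24)   # ~74.2 rem/hr unshielded (0.742 Sv/hr)
behind_10g_mrem_hr = 10.5            # mrem/hr behind 10 g/cm^2, per the L2+MAP chart
solar_min_mrem_hr = behind_10g_mrem_hr * 1.25   # ~13 mrem/hr with the 1.25 minimum factor
```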
A roughly conservative L5 estimate, extrapolated from the same (L2+MAP) annual solar activity including the one maximum event: 6.5×10^6 / 365 / 24 = 742 rem per hour (7.42 Sv/hr).
If that were shielded by the same 10 g/cm2: 105 mrem/hr
Applying the 1.25 factor as the conservative multiplier for estimating the solar minimum secondary radiation, we're now at 131 mrem/hr.
In order to further establish the raw or base solar-minimum impact for L5, if we took away the greater 75% contribution representing the solar maximum event:
6.5e6 rem/year × 0.25 = 1.625e6 rem/year / 365 = 4.45e3 rem/day / 24 = 185.4 rem/hr
An L5 level of 185 rem/hr of continual solar-generated background radiation, of which roughly 0.07% is getting past and/or creating secondary radiation behind 10 g/cm2, is still simply not a good sign, as that's 129.5 mrem/hr = 3.1 rem per day, and that's representing the very least, where at L5 or L4 there isn't decent shade to be had (much like being on the moon, unless blocked by Earth).
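The chain of figures above can be re-run in a few lines. To be clear, every input here is the author's own assumption (the 6.5e5 rem/year annual total read off the L2+MAP chart, the hypothetical tenfold-hotter L4/L5 total, and the 1.25 solar-minimum multiplier), not an established value; the sketch only verifies the arithmetic is self-consistent.

```python
# Sketch of the L2/L5 rate arithmetic above; all inputs are the author's
# assumptions, not established radiation-environment data.
HOURS_PER_YEAR = 365 * 24  # 8760

l2_annual_rem = 6.5e5                                   # assumed unshielded annual total at L2
l2_hourly_rem = l2_annual_rem / HOURS_PER_YEAR          # ~74.2 rem/hr
l5_annual_rem = 6.5e6                                   # assumed tenfold-hotter L4/L5 total
l5_hourly_rem = l5_annual_rem / HOURS_PER_YEAR          # ~742 rem/hr

shielded_l2_mrem = 10.5                                 # claimed dose behind 10 g/cm2 at L2
through_fraction = shielded_l2_mrem / (l2_hourly_rem * 1000)  # ~1.4e-4 of external

shielded_l5_mrem = shielded_l2_mrem * 10                # scaled with the external rate
with_min_factor = shielded_l5_mrem * 1.25               # ~131 mrem/hr

# Minimum-only external rate: strip the assumed 75% solar-maximum share.
l5_min_hourly_rem = l5_annual_rem * 0.25 / HOURS_PER_YEAR     # ~185.5 rem/hr
fraction_vs_min = with_min_factor / (l5_min_hourly_rem * 1000)  # ~0.0007, i.e. the 0.07%
```

Note that the 0.07% figure only emerges when the shielded rate (derived from the full annual total) is compared against the minimum-only external rate, which is how the text's numbers reconcile.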
In case you're wondering: one hour's worth of such an allotment (7.42 Gray or Sv) is almost instant death on the spot, or at the very least within a few miserable days afterward (thus EVAs are nearly out of the question), and even if, by having good shielding, you managed to survive 100+ hours' worth of that onslaught (not to mention months or years worth), there are always any number of future complications that'll keep coming back to remind you of what a thoroughly stupid idea it was to travel in space, especially anywhere outside those Van Allen belts. Thus, clearly there's almost no limit as to how much shield depth you'll be needing, even though at the same time the additional shielding (unless it's hydrogen) will have a reasonably negative impact by creating secondary radiation; it's essentially another lose-lose situation unless that shielding is sufficiently composed of multiple alloys and/or of lighter densities that suppress secondary radiation (heavy water is one of those options, though H2 is certainly best).
In other words, as for being just as exposed as EL4 or EL5, it simply wouldn't have been the external radiation that should have nuked those astronauts, although the nearly fully exposed EVAs certainly should have placed a great deal of icing on their already (131 mrem/hr) radiated cake. Fortunately, there were only moderate solar flares during the majority of Apollo missions, imposing less concern for those within the command module (more flares means less secondary radiation), though still rather nasty towards anyone doing those EVAs. So, exactly how they ever managed to receive only an average of 35 mrem/day (Apollo 15 reporting in at a mere 20 mrem/day) is still not computing, as the bulk of their shielding (in all but less than one quadrant direction) was not worth 0.875" of solid aluminum (they simply couldn't have gotten that much weight up there), and those missions flew during moderate solar activity or, more so, solar minimums, which are worse for those being shielded.
Yet another conservative way of looking at what's out there, based again upon the official NASA con_x_dose1.pdf file: use the minimum solar potentials from the 590 km and 705 km orbits for a shield density of 6 g/cm2 (×2 for remaining fully solar exposed, because those altitude orbits are shaded/blocked by Earth half the time), then toss in what's supposedly at L2 for the same 6 g/cm2, plus ×2 on that as well because open space travel simply is not anywhere nearly as sheltered as L2, then top everything off by calculating something in for at least 6 hours' worth of the Van Allen death zone:
590 km = 3.15e2 rem/yr × 2 = 6.30e2 rem/yr / 365 = 1.72 rem/day
705 km = 3.27e2 rem/yr × 2 = 6.54e2 rem/yr / 365 = 1.79 rem/day
E/S L2 = 2.85e2 rem/yr × 2 = 5.70e2 rem/yr / 365 = 1.56 rem/day
As you can see, just going by the above three items clearly indicates an average dosage rate of roughly 1.65 rem/day when situated behind an average of 6 g/cm2 (.875" of solid aluminum).
Van Allen = 2.61e3 rem/yr × 2 = 5.22e3 rem/yr / 365 = 14.3 rem/day / 24 = 0.596 rem/hr × 6 = 3.576 rem that needs to be added into the mission (whereas 10 hours could be more likely, as they certainly weren't making any tangent beeline straight shot [to/from] through all of that radioactive soup, nor were they returning all that fast, because they would have shot way past Earth if anywhere near their exit velocity).
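The three daily-dose figures and the Van Allen transit add-on reduce to one small helper. Again, the annual inputs (attributed by the author to con_x_dose1.pdf) and the ×2 "fully exposed" multiplier are taken from the text as-is, as assumptions rather than vetted data:

```python
# Re-running the daily-dose arithmetic above; inputs are the text's own figures.
DAYS_PER_YEAR = 365

def daily_rem(annual_rem, exposure_factor=2):
    """Annual shielded dose (rem/yr) -> rem/day, doubled for full solar exposure."""
    return annual_rem * exposure_factor / DAYS_PER_YEAR

dose_590 = daily_rem(3.15e2)   # ~1.73 rem/day
dose_705 = daily_rem(3.27e2)   # ~1.79 rem/day
dose_l2  = daily_rem(2.85e2)   # ~1.56 rem/day
avg = (dose_590 + dose_705 + dose_l2) / 3   # ~1.69 rem/day (the text rounds to ~1.65)

# Van Allen belt: annual 2.61e3 rem, doubled, per day, per hour, times 6 hours.
va_daily = daily_rem(2.61e3)        # ~14.3 rem/day
va_transit = va_daily / 24 * 6      # ~3.58 rem per assumed 6-hour pass
```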
Thereby, most conservatively speaking, a 15-day lunar mission ought to have accumulated nearly 24.75 rem, and that's not including any stinking EVAs; if you'd care to toss in another 42 rem per 36 hours' worth of 0.5 g/cm2 shielded EVA allotment, then either amount of 24.75 rem or 42 rem is considerable but survivable, and most certainly the combined 66.75 rem of total mission exposure is still survivable, though it computes at something just a wee bit more intense than their reported 35 mrem/day (0.525 rem total), or 20 mrem/day as in one mission. True, I'm still doubling upon what's supposedly at Earth L2 when situated behind 6 g/cm2, although Earth itself and its magnetosphere are greatly sheltering L2 as compared to what's fully exposed at L4 or L5, either of which locations should be more representative of what exists on the moon and/or in between Earth and the moon; in reality the L4/L5 environment may be closer to offering a tenfold boost, as there's no planet nor nifty Van Allen zone in between the sun and yourself, so my implicating 2 × L2 is hardly pushing things.
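The mission-total tally above, and the discrepancy the author is claiming against the reported Apollo figures, is straightforward arithmetic on the text's own (assumed) numbers:

```python
# Mission-total tally as laid out above; every figure is the author's claim.
daily_avg_rem = 1.65                 # text's rounded average behind 6 g/cm2
mission_days = 15
cruise_total = daily_avg_rem * mission_days       # 24.75 rem
eva_total = 42.0                     # 36 hr behind 0.5 g/cm2, per the text
combined = cruise_total + eva_total               # 66.75 rem
reported = 0.035 * mission_days                   # 0.525 rem at the reported 35 mrem/day
ratio = combined / reported                       # ~127x, the claimed discrepancy
```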
However folks and all other fellow snookered fools;
When all the smoke settles and those mirrors have been removed, what this all boils down to is an adverse effect upon whatever is shielded, especially anything shielded by the category of 10+ g/cm2 worth. Even if you can't quite understand this sort of smoke-and-mirrors reverse logic, perhaps you'll soon come around to realizing that a solar minimum (not maximum) actually becomes worse as you increase upon any amount of shielding. In other truthful, and if need be village idiot, words: you can in fact effectively shield yourself from those relatively short-term solar maximum events, as nothing beats having lots of mass, and for long-term space travel a good 100 g/cm2 will in fact do quite nicely. However, since it's not your solar maximums that are the bulk of the solar/cosmic realm, nor of the many other galactic activities to fend off, and since it's also NOT what's external to whatever is shielding your sorry butt that actually matters, but what your entire body has to deal with as secondary (TBI) radiation coming at you from all directions, guess what folks: it clearly looks as though your solar minimums are actually the most responsible for inflicting far more secondary radiation exposure (at least 1.25 up to 2.5 times greater) than anything solar maximum related.
Thus the intended confusion is revealed;
Even with regard to those Apollo missions, their remarkably low accumulation of a mere 35 mrem/day or less (that's including their extended EVAs) was simply not as it seems, since the actual radiation of concern would have been what was happening within their command module and the lander itself, where each was consistently being loaded up with great amounts of secondary radiation. The CSM was even more so affected, since the majority (but not all) of the Apollo missions were supposedly eluding those larger solar storm conditions, thus primarily being exposed to the more or less moderate solar minimums, and by accomplishing such a mission they were taking in far more secondary radiation than any solar flare issues. As far as I can determine, remaining below the Van Allen zone of death (at least half the time shaded by Earth), and otherwise with the bulk of their CSM pointed at the sun, is about the only viable method of those Apollo astronauts accumulating a mere 35 mrem/day.
First of all, we do need to understand that there's no such thing as an absolute solar minimum, other than for a few minutes to potentially a few hours' worth of null time; otherwise there's nearly always something nasty going on, preceded by visible sunspots or otherwise indicated by instruments aboard SOHO, TRACE or any number of Earth-based solar monitors, whose detections clearly indicate areas of interest such as holes, or perhaps gravity wells, from which all sorts of nasty things happen. Thus the phrase "solar minimum" needs to be taken in the proper context of a duration of time, as in per day, per month or better yet yearly output, as in that way the typical space environment can best be calculated for potential impact/risk upon the expected space travel environment, without having to deal with every individual solar spike that's certain to impact a fully exposed spacecraft but not otherwise affect Earthlings residing below the Van Allen zone of death.
Secondly, the idea of banking bone marrow, and perhaps devising a small safety-shielded zone (achieving 2.75 mrem/day, unless frozen bone marrow is less affected by radiation) for storing the bone marrow onboard a spacecraft, is only the VL2 mission backup plan, such as in case there are EVAs involved and/or an unusually great solar storm(s) impacting that obviously can't be avoided. Otherwise, 100 g/cm2 worth of multi-alloy/composite shield ought to more than pull the interior dosage to something well below the proposed average of 275 mrem/day; if not, then even more shielding may become necessary, as opposed to taking hits of 550+ mrem/day.
The prospect of residing behind a sufficiently large space object (a large rock, per se), such as Earth (and especially Venus, which ought to suffice for VL2), is another option that offers at least one improvement over being situated at any L4 or L5, or somewhere else that's fully solar exposed (24/7), where there's no end in sight to accumulating a TBI dosage that'll only build over time, introducing loads of secondary (X-ray) irradiation. The L2 position of Venus is at least 90% shielded by the planet itself, plus there's a little something of worth in its relatively thick and dense atmosphere of mostly CO2, and there's no apparent magnetic field (Van Allen belt) influencing the shadow pattern or L2 pocket safety zone, as there is with regard to Earth L2. In other words, even though VL2 is roughly 25% closer to the sun, the VL2 location ought to be receiving nearly the same if not less cosmic/solar radiation than EL2 (certainly not much more than EL2).
I've been informed that there are methods of cutting the secondary radiation factors, not to mention what a fairly substantial magnetic shield could manage, as long as the energy for sustaining such a field were within budget. I had only mentioned the magnetic shield as an option for the storage of banked bone marrow because the physical amount of shielded material would be relatively small, thus consuming far less energy than having to protect the entire spacecraft, especially if we were talking about something the size of or larger than ISS. Perhaps there's even a permanent-magnet surround that could sustain a suitable magnetosphere containment zone (0.1 m3), sufficient for protecting the banked bone marrow from whatever secondary radiation exposure (this might become another prudent experiment to pull off; perhaps it has already been accomplished, although so far I do not know of such).
I've also learned that strong cosmic and/or galactic rays (neutrons, for example) are simply too energetic to easily shield against, and that typical structural shielding will only cause cascades of secondaries, leading to greater exposure to the more dangerous sorts of radiation (solar minimum is what actually imposes those greater secondary radiations, by a factor of roughly 1.25:1 if not as great as 2.5:1, as clearly indicated by the most recent official NASA publications). Though I believe it's been proven possible by others to reduce secondary radiation discharges by incorporating multiple layers of rather substantial alloy densities (such as titanium/aluminum/leaded-UHMW, ditto, ditto), achieving an overall multi-layer composite of 100 mm representing at least 35 g/cm2, if not accumulating that to 1000 mm worth that ought to represent 350+ g/cm2, which should greatly improve the odds of long-term survival of worst-case solar/cosmic and galactic events (charged iron particles included), as well as subduing the more damaging secondary radiation inflicted by solar minimum, which affects the bulk of any long-term space travel exposure, including those relatively short-duration Apollo missions, since they were fully exposed outside the Van Allen belts with little or no Earth shade. Although, getting that much mass into space is going to be a bit testy, not to mention expensive, as well as creating 100 times as much Earthly trapped CO2 tonnage in the process.
Even my personal pod/coffin idea is not exactly a lightweight solution, whereas the idea of locating a sizable asteroid (preferably already traveling in the general direction of desire) and parking your space transport behind it, so that the rock or perhaps ice ball (an ice ball being better for obtaining fuel as well as for creating less secondary radiation) is situated between you and the sun, is about as good as it's going to get, unless you can somehow manage to excavate yourself into the interior/core of that asteroid.
My thought on establishing a reasonable limit of 2 Sv per 2-year mission (most of it situated at VL2) allows for an average daily TBI dosage of 274 mrem. I believe I've come to understand that, as long as certain DNA steroids and nutrition supplements are taken, most will not be in need of receiving their own bone marrow, as their physiology/DNA will have somewhat adapted to the gradual buildup of radiation by taking advantage of whatever supplements are being consumed, so as to enhance one's chances of surviving in spite of the radiation. Perhaps the limit might even safely exceed the proposed 2 Sv red line before bone marrow injection becomes essentially necessary, as affording 4 or 5 Sv would certainly reduce the burden of shielding requirements and thereby greatly simplify matters.
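The 274 mrem/day figure follows directly from the proposed 2 Sv / 2-year budget (the budget itself being the author's proposal, not any agency limit):

```python
# The proposed 2 Sv over a 2-year mission, spread as a daily allowance.
REM_PER_SV = 100
budget_rem = 2 * REM_PER_SV          # 200 rem over the mission
mission_days = 2 * 365               # 730 days
daily_mrem = budget_rem / mission_days * 1000   # ~274 mrem/day, matching the text
```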
Going places extremely fast is obviously another option, as cutting the duration of exposure is just as capable of saving the day as packing along enough shielding. Although, making 0.1% lightspeed (300 km/s, roughly tenfold better than anything we've currently got underway) is still a ways off, and even that's leaving the barn door nearly wide open should a solar maximum event take place, or should you perhaps be nailed by some galactic supernova event; either way your life expectancy is going to be negatively affected.
Otherwise folks, for the ongoing effort and subsequent objective of obtaining a better understanding of what we're up against, especially of what L4 and/or L5 have to offer, I recently tried out the following search topics and basically got squat worth of specific numbers, as opposed to some specifics from the FAA and even some from our USAF. Apparently, as opposed to air travel, space travel radiation is supposed to be extremely complicated if not entirely mysterious. Perhaps it's what you and I don't know that will not hurt us, though somehow I don't think so.
Lagrange (L1) (L2) (L4) (L5)
solar space radiation index
Located within an above FAA reference: "Galactic Radiation Dose Rates at Several Locations in the Atmosphere: Effective Dose Rate Compared With Equivalent Dose Rate at Various Tissue Depths." Clearly this statement is eliding a certain amount of truth by excluding solar/cosmic contributions.
I've also learned what shield density is necessary to reduce the radiation dosage from an individual large solar proton event: in order to stay sufficiently below receiving 100 rad (1 Gray) or 100 rem (1 Sv), you'll need at least 10 g/cm2 worth of shielding (again, duration is another factor going against your survival, such that a 12-hour event is not going to be a good sign). Oddly, I'm thinking that since 100 rem (1 Sv) is hardly a good thing unless you're planning upon obtaining a bone marrow transplant, perhaps we ought to be thinking that 100 g/cm2 might not be such a bad idea after all, where such shielding ought to be of multi-alloy layers in order to subdue those secondary radiation issues.
"The international unit of equivalent dose is the sievert. The sievert replaces the rem:" (the Gray being essentially the same dosage as the Sievert)
1 sievert (Sv) = 100 rem
1 sievert = 1000 millisieverts (mSv)
1 millisievert = 1000 microsieverts (µSv)
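The unit relations quoted above reduce to a handful of trivial conversion helpers, which the later mrem/µSv figures in this piece all rely on:

```python
# Conversion helpers for the dose units quoted above.
def sv_to_rem(sv):
    return sv * 100.0          # 1 Sv = 100 rem

def rem_to_sv(rem):
    return rem / 100.0

def sv_to_msv(sv):
    return sv * 1000.0         # 1 Sv = 1000 mSv

def msv_to_usv(msv):
    return msv * 1000.0        # 1 mSv = 1000 µSv

def mrem_to_usv(mrem):
    # 1 mrem = 0.001 rem = 1e-5 Sv = 10 µSv
    return rem_to_sv(mrem / 1000.0) * 1e6
```

For example, the table below uses the fact that 1 µSv/hr is 0.1 mrem/hr, i.e. `mrem_to_usv(0.1)` returns 1.0.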
I've also located this information, re-interpreted from one of the above reference pages, referring to galactic radiation dosage: * Calculated dosage for February 1998, 80ºN, 20ºE (excluding any cosmic/solar flare input, and as for residing within an aircraft).
Sea level = 0.04 µSv/hr = 0.004 mrem/hr
20,000' = 1 µSv/hr = 0.1 mrem/hr
30,000' = 4 µSv/hr = 0.4 mrem/hr
40,000' = 9 µSv/hr = 0.9 mrem/hr
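The table above is just a µSv/hr column divided by ten to get mrem/hr; restated as data (the µSv/hr values being the FAA figures as quoted, not independently verified here):

```python
# The FAA-derived altitude table above; altitude in feet -> µSv/hr.
galactic_dose_usv_hr = {0: 0.04, 20000: 1.0, 30000: 4.0, 40000: 9.0}

def usv_to_mrem(usv):
    return usv / 10.0          # 1 µSv = 0.1 mrem

table_mrem = {alt: usv_to_mrem(d) for alt, d in galactic_dose_usv_hr.items()}
```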
And here's another interesting tidbit of worthy context regarding air travel radiation safety, as intended for informing their industry workers:
"Consider a crewmember who declares pregnancy after 1 month and continues working 80 airborne hours per month on the long, high-altitude flight from Athens to New York City. The monthly dose to the conceptus would be about 0.57 millisievert, which would exceed the recommended monthly limit of 0.5 millisievert." (0.5 millisievert = 50 mrem)
Some of my loose-cannon extrapolation: based upon seeing nearly a tenfold increase upon doubling the altitude from 20,000' to 40,000' at the same 80ºN, I'd estimate the maximum galactic (non solar flare) exposure potential at 80,000' = 100 µSv/hr = 10 mrem/hr.
Being at 80ºN (nearly crossing over the North Pole), and subsequently having somewhat less of the Van Allen belt's magnetic field shielding you from galactic as well as solar/cosmic (non-flare solar minimum) radiation, is interesting because, if you should care to introduce what a few solar flares can contribute to the continual cocktail of background galactic and cosmic radiation, the dosage increases substantially; another factor of 10 might be a conservative estimate, thus making the 40,000' altitude capable of delivering 10 mrem/hr into its passengers and crew, and the purely hypothetical 80,000' travel exposure becoming the somewhat dangerous rate of 100 mrem/hr.
As you can discover for yourself (time permitting), if we were to apply the same tenfold factor for doubling that altitude once again, at 160,000' we could obviously be taking on a potential of 1 rem/hr, although this altitude shift should have been less dramatic in density loss, so perhaps a doubling of altitude might only have squared (4×) the outcome, which would then become 400 mrem/hr at 160,000'. Either way, it's more than obvious that as altitude increases, radiation dosage always increases (confirmed also by previously recorded shuttle flights reporting 65 mrem/day while receiving 12 hours' worth of direct solar exposure per day, and shuttles incorporate considerably more shielding than any passenger airliner). The only known exception to this rule of ever-increasing radiation is with regard to exceeding the Van Allen zone of death, which occupies a territory surrounding Earth from 1000 km out to 70,000 km; from that 70,000 km point on out into free space, the surrounding radiation is less (500 to a thousand times less potent if unshielded, and at least 15 times less if shielded at 10 g/cm2). Although, being outside the Van Allen zone also means you're un-magnetically shielded from all other galactic and nasty dosages of mostly solar-generated flak, which increases roughly as the square as one travels towards the sun; Venus, being roughly a fourth closer, receives twice the bombardment as Earth.
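The two extrapolation rules being weighed here (tenfold per doubling of altitude versus only squaring) are easy to lay side by side; both scaling factors and the 100 mrem/hr starting point at 80,000' are the author's loose-cannon assumptions:

```python
# The two candidate scaling rules for another doubling of altitude,
# starting from the author's assumed 100 mrem/hr (solar included) at 80,000'.
dose_80k_mrem_hr = 100

per_doubling_tenfold = 10
per_doubling_square = 4

est_160k_tenfold = dose_80k_mrem_hr * per_doubling_tenfold  # 1000 mrem/hr = 1 rem/hr
est_160k_square = dose_80k_mrem_hr * per_doubling_square    # 400 mrem/hr
```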
Noting back upon the FAA reference: Earth surface (80ºN) exposure to background galactic (essentially non-solar) radiation is roughly 0.1 mrem/day, whereas it's clearly noted that as we lose our atmospheric shielding the radiation dosage expectedly increases exponentially as altitude is gained (for example, 25-fold within the first 20,000'). Other reference sites have clearly specified that a surface exposure which includes a typically average solar impact will bring the 80ºN dosage of cosmic + solar radiation up to 1 mrem/day, thus this alone is another sufficiently clear indication of the tenfold increase in background radiation when the solar and solar flare impact is included. Obviously during solar minimums the combined galactic background radiation is diminished, though at any time there could be an event worthy of concern if you're situated in a sufficiently high-risk location, especially if that's in a space travel mode with few if any options for avoiding TBI exposure.
For the specific reference to flight safety, as per cosmic background radiation (excluding sunspot and subsequent solar flare input), the following link offers an alternative calculator that'll provide some basic knowledge. Obviously it's in the interest of their insurance companies, as well as for protecting not so much the employees but further limiting the liability, and thereby the payoff, of the federal employee medical and disability benefits program, to not overstate the gravity of radiation exposure; quite similar to the way our government consistently diminishes the relevant dangers of working with federally prescribed and/or applied toxins (Agent Orange and so forth), of certain medications, and even of worker safety levels of nuclear radiation hazards, which our government qualifies at three times greater than other nations (15 mrem/day as opposed to 5 mrem/day, and perhaps that's because Americans [especially white, astronaut-qualified folk] are more radiation tolerant; I don't think so).
Oddly, all of the user (village idiot) friendly topics pertaining to space-related radiation dosage data have been sort of Arthur Andersen shredded into complex wording and/or kindly stripped from the publicly available pages; if you were to search for any of the following topics you'll certainly get an overload of information, plus truly great graphics at mostly taxpayer expense, but only vague gibberish related to what's out there in the way of cosmic/solar generated radiation existing under, within, and mostly nothing of what's beyond the Van Allen zone of death. I believe the most likely reason why you and I can't retrieve such basic knowledge is that it has become quite embarrassing, especially if you consider that the typical Apollo mission (EVAs and all included) reported receiving a mere 35 mrem per day, and Apollo-11 (during a moderately high solar activity period) was definitely reported as receiving all of 12 mrem/day (I do believe you'd have gotten more from just climbing Mount Everest), which simply can't be so, especially if in fact that minimally shielded craft actually passed through our Van Allen zone of death (1000 km to 70,000 km), having to do so not once but twice, and not by any shortest possible path but by a trajectory creating roughly twice the combined 139,000 km trek; in other words, an overall path more like 276,000 km worth of traveling to/from the moon while within the Van Allen zone.
For example: even if we were to calculate a lesser Van Allen travel-through distance, we're also talking of those Apollo missions making an average travel-through speed a good deal below 10 km/s. So, tit for tat, we might still be looking at roughly 7 hours' worth of TBI exposure, while being situated in a not so terrifically shielded craft.
I'll again suggest that you try for yourself these following search topics, or any variation thereof, such as trying your hand at including the terms "space TBI" and/or "exposure dosage" or simply "radiation dose"; I'd be most interested in whatever you can turn up on Earth L4 or L5. It's been somewhat perplexing that of all things of space exploration that could have and should have been clearly recorded (in plain English, or even metric SI), it's radiation dosage: the hourly or daily level at various altitudes as well as in orbit (typically 12 hours solar exposed, unless you're referencing a polar/GSO orbit, or such as SOHO/L1). Instead of a clear and concise data report that matches other gathered data (+/- 10%), we seem to get dozens of entirely obscure if not vague variations, using a slew of foreign terminology and/or standards, skewed along the way by all sorts of conditions and/or circumstances; whereas on the surface of Earth we can generally walk outside on a typical day and measure or accumulate dosage at a rate of typically 0.5 mrem/day, or 21 µrem/hr, of cosmic and solar background, higher yet as one travels north or south until that background sea-level reading is 1+ mrem/day (that's excluding any solar flares). At least I believe those FAA calculations are real and believable, as they can be confirmed by actual measurements taken onboard an aircraft (too bad we still can't seem to say the same about those Apollo missions).
Earth L1 radiation
Earth L2 radiation
Earth L4/L5 radiation
space travel radiation
space exploration radiation
I realise the knowledge and subsequent truth is somewhere out there, even though my three remaining brain cells have simply come up empty, or confused at best; your unscrambled brain cells might actually be sufficiently outside the Borg collective box, and thereby able to figure out this challenging phase, as I still believe space radiation dosage is somewhat of an important risk factor, at least towards obtaining a fair understanding of what's possible to survive long-term, especially should we actually manage to place ISS or something similar on station at Venus L2.
As listed above, I did locate upon my own efforts (oddly, never once was this NASA URL page suggested nor pointed out by any of my esteemed opponents) the following report, which along with some effort can be interpreted, though concerning the Earth L2 location it still represents a lesser dosage, being situated within an extensively Earth-sheltered, or what's classified as a somewhat cool, radiation environment zone. That chart is also associated with a solar maximum computation accommodating one annual solar maximum event, and again, L2 is less solar exposed (extensively sheltered by Earth and Earth's magnetic fields) as honestly compared to what exists at Earth L4 or L5, which represent relatively hot solar radiation zones.
This very official NASA document is otherwise referred to in my vl2-radiation.htm and vl2-iss-03.htm and vl2-iss-joke.htm documents.
Of great perplexing interest or concern: even if the average Apollo combined to/from pass-through speed were 10 km/s, that's worth 27,600 seconds, or roughly 7.6 hours' worth of TBI, which is off most of the survivable scales unless you're sufficiently shielded by 100 g/cm2 of pure aluminum. Even at 1000 mils (1" thickness, or 7 g/cm2), the dosage would still have been biologically survivable in the short term (based upon the solar maximum TBI dosage of 240 mrem/hr × 7.6 hr = 1.824 rem, whereas actually those hours of solar minimum should have been delivering the same protons but inducing secondaries at the very least 1.25 times greater, thus 300 mrem/hr × 7.6 = 2.28 rem). Besides all of the confusion over solar maximum/minimum, which takes a little getting used to, especially when the exact opposite is taking place if you're situated behind ordinary shielding, either exposure should have involved taking medications, perhaps transfusions and/or a bone marrow supplement injection upon their return, in order to have been on the preventative safe side of long-term things to come (hair nearly always grows back). And do remember, we're still just dealing with the Van Allen zone travel excursion portion, and not the remaining lunar mission exposure, which involved some rather extensive (near zero shielded) EVAs.
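The transit-dose arithmetic in that paragraph can be restated in a few lines; the path length, speed and mrem/hr rates are all the author's assumptions carried over from above:

```python
# Van Allen transit-dose arithmetic from the paragraph above; all inputs
# are the author's assumed figures, not measured Apollo data.
path_km = 276_000                  # claimed round-trip distance inside the belts
speed_km_s = 10.0
transit_hr = path_km / speed_km_s / 3600       # ~7.67 hr (the text rounds to 7.6)

dose_max_rem = 0.240 * transit_hr              # ~1.84 rem at 240 mrem/hr (solar max)
dose_min_rem = 0.240 * 1.25 * transit_hr       # ~2.30 rem with the 1.25 minimum factor
```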
BTW, I believe we can understand that those Apollo command modules were not providing any uniform 7 g/cm2; more likely a 75% coverage at 5 g/cm2, if you honestly included absolutely everything (except for what's in the direction of the SM, which may have pushed that one directional shield density to something over 9 g/cm2), and otherwise the majority being more like 5 g/cm2 clearly delivers a greater dosage than the calculations I based upon 7 g/cm2. The lander portion itself was offering little more than aluminum foil (possibly 0.5 g/cm2, if again you included everything), and those EVAs (up to 72 hours' worth for certain astronauts) offered scant more than physical protection from all of that irradiated, bone-dry and otherwise mysteriously clumping lunar soil, which was supposed to reflect at roughly 10% (BTW, that's nearly new-asphalt performance as far as any reflection index goes, though so many of their photos, with those white moon suits as reference, were clearly showing the lunar surface reflecting at or above 50%, including the majority of them there hills and rocks; though I suppose within the Sea of Tranquility they could have landed in the one and only uncharted "lunar white zone").
As I've stipulated before: those Apollo missions simply do not compute, at least not by way of the more recent reports on space radiation, whereas climbing Mount Everest computes quite nicely as delivering more mrem/day than reported by most any of those Apollo missions.
For no apparent good reason, information pertaining to what's out there in the way of various radiation hazards (galactic, cosmic/solar gamma, protons, electrons, and don't forget those charged ions and even a few trillion of those testy iron particles, some traveling at near light speed) is intentionally confusing if not missing altogether; either that or I'm simply the village idiot that's not smart enough to figure it all out. This ruse of confusion has, I believe, been quite intentional, as orchestrated to keep anyone from uncovering truths associated with the Apollo ruse/sting, where the "sting" part is the snookering of taxpayers, who were essentially taken to the cleaners and damn near into WW-III. The "ruse" part has to do with pulling one over on the USSR, which obviously meant the absolute snookering of all of America and of the entire world, including the likes of Walter Cronkite and even myself.
It seems additionally perplexing that common satellite PV cells, and the electronics driven by such, must seemingly qualify relatively high on surviving a good deal of radiation, though for some odd reason those very same radiation hazard issues are being made nearly meaningless towards human biology, as though DNA/RNA were somehow sufficiently immune to what's otherwise measurably destructive to insufficiently shielded electronics. Even those robust satellite PV cells are typically degraded by an average of 5% per year, more so if there's a good deal of solar flare activity, by which fully exposed satellites, with all of their solar panels broadside to the sun, have died on the spot. You don't suppose that same level of dosage wouldn't have been detrimental to humans?
Acquiring a clear understanding of various space radiation considerations, especially the variable flare-induced sorts coming off our sun, is not an option but an imperative, as even configuring a robotic Venus L2 (TRACE-II VL2) communications relay platform (shielded 90% from the sun) to survive and perform its functions for many years (preferably decades) will require a great deal of solar/cosmic plus flare shielding for critical components, including PV cells and whatever CCD and/or single-channel photon detectors. One proven solution for those PV cells is simply to over-provide their capacity, as well as to have multiple arrays or banks that can be switched on or off, so that the anticipated degradation can be managed in order to support a 10+ year mission. Rotating PV panels so that they're on edge to the solar flux (with a sufficient mass of aluminum facing the sun) is another management tool that improves their survival (too bad our human biological cells can't do the same thing).
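The over-provisioning approach described above can be sketched as a simple sizing calculation: pick a beginning-of-life (BOL) array capacity large enough that, after the assumed 5%/year degradation, the end-of-life (EOL) demand is still met. The 500 W load figure below is made up purely for illustration and is not a number from any actual TRACE-II design.

```python
# Rough BOL array sizing under an assumed compounded degradation rate.

def bol_capacity_needed(eol_demand_w: float, years: float,
                        annual_loss: float = 0.05) -> float:
    """BOL watts required to still deliver eol_demand_w after degradation."""
    return eol_demand_w / (1 - annual_loss) ** years

demand = 500.0  # hypothetical relay-platform load, watts
bol = bol_capacity_needed(demand, years=10)
print(f"BOL capacity for {demand:.0f} W after 10 years: {bol:.0f} W")
```

Any spare banks provided beyond that margin can then be switched in as individual panels degrade or take flare damage, which is exactly the arrays-or-banks management scheme described above.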
There's no longer any doubt in this village idiot's mind about the benefits of bringing along your own private little Van Allen belt, as a magnetically formed shield is not the sort of shielding that's going to cause the same degree of secondary radiation issues as any artificial mass surrounding your butt, nor is it something that's going to choke off all of Earth's humanity, and/or greenhouse whatever's left of it, with all the subsequent CO2 that would have to be artificially generated in the process of getting all that conventional radiation-shield mass into orbit. So perhaps a little of both is what's needed; either that, or merely sending off those energy-efficient robotic missions to places like hell (VL2), accomplishing whatever scanning, probing, communicating and/or delivering our warlord intimidations from a very safe distance, at a mere fraction of a percent of the cost of accomplishing anything manned, as that's almost as insane as our going to frozen and irradiated-to-death Mars, or looking for invisible WMDs.
Good grief folks, this mad-scientist idea of sending ISS off to see the wizard of Oz at Venus L2 was just a thought, not exactly my idea of what we ought to be doing, in place of utilizing a relatively small and efficient robotic satellite; as I've mentioned, creating a hybrid TRACE-II would be more than sufficient, at not even 0.1% of the cost, if that.
Some other pages that'll soon need updating are "vl2-iss.htm" and "vl2-iss-01.htm" and "vl2-iss-02.htm" and "vl2-rocket.htm".