StellarXplorers

Scores 

A list of all team scores from the competition round is posted to this page 48 hours after the individual teams' score sheets are emailed to the team directors. Scores are typically posted the Thursday following the end of the competition window.
PRACTICE ROUND 1
QUALIFICATION ROUND 1
PRACTICE ROUND 2
QUALIFICATION ROUND 2
PRACTICE ROUND 3
QUALIFICATION ROUND 3
TEAM TOTAL ROUND SCORES (ADVANCEMENT TO SEMIFINAL ROUND)
SEMIFINAL ROUND SCORES (ADVANCEMENT TO NATIONAL FINALS)

Semifinal Round (SFR) Recap

SCENARIO OBSERVATIONS
SOLVING THE SCENARIO
We shook things up a bit for the Semifinal Round (SFR) again this year. In the past, the SFR typically consisted of elements from QR1 (Orbit Planning) and QR2 (Satellite Design), but this year we replaced the QR1 element with QR3 (Launch Operations). It appears that the teams handled the change well.
 
Most of the tasks in the SFR have been seen in previous rounds, including hydrazine usage, budgets, CubeSat usage, and launch weight restrictions. This round proved to be quite a challenge for a number of teams, but it was designed that way since this round decides who qualifies for the National Finals. The teams were given a lot of tasks to accomplish, and many teams used the entire six hours (and a bit more) to complete the round.
 
The teams had to evaluate subsystem performance versus the customer's (U.S. Forest Service) requirements while also considering launch vehicle mass constraints and budget limitations. On top of that significant task, they also had to select a launch vehicle to place one of the mission satellites into orbit. Teams should have treated these tasks separately, in accordance with the notes provided, and not necessarily carried data over between the two.

SATELLITE DESIGN
 
For this portion of the scenario, the most important requirements were to:
  • Continually monitor the Fire Area (California, Oregon, and Washington) with the low-resolution cameras
  • Capture the most high-resolution data from the Fire Area
  • Send the data to the Firefly Ground Station (FGS)
 
The Forest Service did want the best possible high-resolution images of the Fire Area, so there were bonuses for higher-resolution data and a stable satellite. There were also additional points to be earned for staying below the $59,250,000 budget for each satellite. However, if any part of the Fire Area was not monitored during the evaluation period, a penalty was incurred (1 point lost for every 65 minutes of missed coverage). Engineers must make these types of tradeoffs all the time.
 
The highest possible score for this portion of the scenario was 56.557.

This solution had the following equipment selections:
For all three satellites, use:
  • High-Resolution Camera – GC 130
  • Data Recorder – DDS 150
  • High-Resolution Data Transmitter – ST-1025
  • Attitude Control System (ACS) – LAC-401
  • Solar Panel Model – Horizon 250
  • Hydrazine Load – 282.3 kg
 
For Firefly1 and Firefly2, use Low-Resolution Camera – VOS-550 and Solar Panel Size 1 m × 0.867 m.
For Firefly3, use Low-Resolution Camera – VOS-525 and Solar Panel Size 1 m × 0.847 m.
 
Unfortunately, no team selected this solution.
 
The most prominent scenario errors we observed were the following:
  • Failure to continually cover the Fire Area
  • Failure to load enough hydrazine into the satellite
  • Failure to size the solar panel large enough to power the satellite
 
We also saw many solutions that exceeded the launch vehicle lift limit (a maximum of 1,065 kg) and the program budget. It’s likely that these errors were a result of failure in one of the three areas mentioned earlier.

Fire Area Monitoring
The requirement was “to provide continuous monitoring of the entire Fire Area by the three low-resolution cameras.” There are two methods of determining which set of three cameras to select: the “Eyeball” method and the “Scientific” method.
 
In the “Eyeball” method, the team would need to literally “watch” the low-resolution coverage lines in STK’s 2D window to “see” when any point of the Fire Area was not covered by a low-res camera. Since the satellite was approaching the Fire Area from the south and departing the area to the south, the areas along the northern border of Washington are the most likely to NOT be covered during a transition from one satellite to another. While this method will work, it is not the most precise way to determine area coverage.
 
In the “Scientific” method, shown in the figure below, the team would use STK to insert points (WA1 and WA2) along the northern border of Washington that would be “not covered” during a transition, as teams were allowed to insert items into the VDF during this scenario. The team would then use STK's “Access” function to determine when the points are not accessed by any of the three low-resolution cameras. In this case, there were 64 different combinations of camera options that could have been used, but some of the camera options with narrow fields of view (FOVs) should have been quickly eliminated.
[Figure: STK 2D coverage view – Firefly1 VOS-525 camera and Firefly2 VOS-525 camera]
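For teams curious about the arithmetic behind the “Scientific” method, here is a minimal sketch of the gap calculation applied to exported Access intervals. The interval values below are illustrative only, not data from the actual scenario, and this is not how STK computes access internally.

```python
# Gap arithmetic behind the "Scientific" method: STK's Access reports list the
# intervals when each low-resolution camera can see a watch point (e.g., WA1 on
# Washington's northern border). Interval data here is illustrative only.

def uncovered_minutes(window: tuple[float, float],
                      access_intervals: list[tuple[float, float]]) -> float:
    """Total time in `window` not covered by ANY access interval (minutes)."""
    start, end = window
    merged: list[list[float]] = []
    for a, b in sorted(access_intervals):
        if merged and a <= merged[-1][1]:            # overlaps the previous interval
            merged[-1][1] = max(merged[-1][1], b)    # extend it
        else:
            merged.append([a, b])
    covered = sum(min(b, end) - max(a, start)
                  for a, b in merged if b > start and a < end)
    return (end - start) - covered

# Union of all three cameras' access to one watch point over a toy 60-minute window:
accesses = [(0.0, 18.0), (15.0, 34.0), (41.0, 60.0)]
print(uncovered_minutes((0.0, 60.0), accesses))      # -> 7.0 minutes missed (34 to 41)
```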

Here are the combinations that provided the best “Total” coverage during the 7-day period:
  • Firefly1 - VOS-525; Firefly2 - VOS-550; Firefly3 - VOS-550: 0 Minutes Missed
  • Firefly1 - VOS-550; Firefly2 - VOS-525; Firefly3 - VOS-550: 0 Minutes Missed
  • Firefly1 - VOS-550; Firefly2 - VOS-550; Firefly3 - VOS-525: 0 Minutes Missed
  • Firefly1 - VOS-550; Firefly2 - VOS-550; Firefly3 - VOS-550: 0 Minutes Missed

Hydrazine
Failure to load enough propellant for the mission was a common error. Two factors determined the N2H4 load for the mission: the selection of the Attitude Control System (ACS) and the requirement to use 50 kg of N2H4 to raise the satellite's circular orbit to 35,788.1 km.
 
Each ACS has its own daily N2H4 requirement, from 0.046 kg to 0.058 kg per day. To calculate the fuel needed by the ACS, multiply the daily requirement by the mission life requirement of 12 years. In 12 years, there were 4,383 days (12 × 365.25). Remember: a year in space is 365.25 days long (which is why we have a leap year every four years!). Several teams came up a few days short because they did not account for the extra 0.25 day each year. The ACS-only load for each option was: LAC-410 – 201.618 kg; LAC-405 – 214.767 kg; LAC-401 – 232.299 kg; LAC-400 – 254.214 kg.
 
A common error in the fuel load calculation was failing to add the 50 kg needed by the satellite's engine to raise the orbit after separation from the launch vehicle. This requirement was listed on Page 7 of the scenario booklet.
 
The final hydrazine load for each ACS was LAC-410 – 251.618 kg; LAC-405 – 264.767 kg; LAC-401 – 282.299 kg; LAC-400 – 304.214 kg.
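Here is a short worked check of the hydrazine math above. Only the 0.046–0.058 kg/day range is stated directly; the LAC-405 and LAC-401 daily rates are back-calculated from their listed loads.

```python
# Worked check of the hydrazine load calculation described above.
MISSION_YEARS = 12
DAYS_PER_YEAR = 365.25          # a year in space is 365.25 days
ORBIT_RAISE_KG = 50.0           # engine burn after launch vehicle separation (Page 7)

daily_usage_kg = {              # N2H4 per day; middle two rates inferred from the loads
    "LAC-410": 0.046,
    "LAC-405": 0.049,
    "LAC-401": 0.053,
    "LAC-400": 0.058,
}

for acs, rate in daily_usage_kg.items():
    acs_load = rate * MISSION_YEARS * DAYS_PER_YEAR      # 4,383 mission days
    print(f"{acs}: ACS-only {acs_load:.3f} kg, final load {acs_load + ORBIT_RAISE_KG:.3f} kg")

# LAC-401 gives the 282.299 kg (≈282.3 kg) load used in the winning solution.
```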

Solar Panels
The selection of the model and size of the solar panel was based on the number of Watts (W) of power needed to keep the satellite running and out of “Safe” mode. The first step was to calculate the total power required by the satellite: the power needed by the equipment already selected by New Venture plus the power needed by the equipment selected by the team. Depending on selections, this total could vary between 215.1 W and 334.6 W, and it is the amount of power the solar panel must deliver. To calculate the amount of power the panel needs to collect from the Sun, divide the power needed by the efficiency of the team-chosen solar panel model.
 
For example, if 300 W was needed for the satellite and the team selected the Horizon 250 Panel with 21% efficiency, the panel would need to collect 1,428.571 W (300 W ÷ 21%) from the Sun to generate 300 W for the satellite.
 
As stated in the scenario booklet, the Sun delivers 1,300 W per square meter. Using our example from above, divide the power needed from the Sun (1,428.571 W) by the power available from the Sun (1,300 W/m²) to determine the panel area needed: 1,428.571 W ÷ 1,300 W/m² = 1.099 m². Assuming the height of the panel was 1 meter, the width would need to be 1.099 m. The mass of the panel would be height × width × panel thickness × aluminum density (1 m × 1.099 m × 0.006 m × 2,700 kg/m³) = 17.804 kg. The cost of the panel would be height × width × cost per square meter (1 m × 1.099 m × $0.24 M/m²) = $0.264 M for our example.
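The full sizing chain from the example, as a small sketch. The constants are the ones quoted in this recap, and rounding the width to millimeters mirrors the worked example above.

```python
# Panel sizing chain: power needed -> collection area -> mass and cost.
SOLAR_FLUX_W_PER_M2 = 1_300.0        # per the scenario booklet
PANEL_THICKNESS_M = 0.006
ALUMINUM_DENSITY_KG_M3 = 2_700.0
COST_PER_M2_MILLIONS = 0.24          # $M per square meter

def size_panel(power_needed_w: float, efficiency: float, height_m: float = 1.0):
    collected_w = power_needed_w / efficiency        # power the panel must collect
    width_m = round(collected_w / SOLAR_FLUX_W_PER_M2 / height_m, 3)  # round to mm
    mass_kg = height_m * width_m * PANEL_THICKNESS_M * ALUMINUM_DENSITY_KG_M3
    cost_millions = height_m * width_m * COST_PER_M2_MILLIONS
    return width_m, mass_kg, cost_millions

# The worked example: 300 W needed, Horizon 250 panel at 21% efficiency.
width, mass, cost = size_panel(300.0, 0.21)
print(f"width {width:.3f} m, mass {mass:.3f} kg, cost ${cost:.3f}M")
# -> width 1.099 m, mass 17.804 kg, cost $0.264M
```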
 
As mentioned above, other errors seen in the scenario included going over the $59,250,000 budget (13 teams) and exceeding the 1,065 kg mass limit of the launch vehicle (23 teams). However, 96 teams were under the satellite budget. Excellent!
 
Finally, the collection of the high-resolution data could be calculated by simply running an access report against the state of California, since the state's access times began before and ended after the access times for Oregon and Washington. One last note: Firefly2's final high-resolution data collection was lost because the 7-day scenario period ended before the data could be downloaded to the ground station.

LAUNCH OPERATIONS
 
Teams once again had to use the launch vehicle performance tables to determine the maximum mass that each launch vehicle could place on orbit. The maximum payload mass for a satellite placed into a 435 km circular orbit with an inclination of 40° was, for each launch vehicle: Alpha 5 – 2,399.6 kg; UpGoer 1 – 1,305.0 kg; Himmelsflieger X1 – 2,215.0 kg; and Sutātoraberā H2 – 2,293.5 kg. Thirteen teams (only 11% of teams competing) exceeded the maximum weight limit of the rocket. Very good!
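Since the performance tables rarely list the mission's exact altitude, the usual approach is linear interpolation between the two nearest table rows. A minimal sketch; the table values below are made up for illustration, not the actual launch vehicle data.

```python
# Linear interpolation between two performance-table rows.
def lerp(x: float, x0: float, y0: float, x1: float, y1: float) -> float:
    """Interpolate y at x between the points (x0, y0) and (x1, y1)."""
    return y0 + (y1 - y0) * (x - x0) / (x1 - x0)

# Hypothetical rows: max payload (kg) to a 40-degree-inclination circular orbit
# at 400 km and 500 km. The mission orbit is 435 km.
payload_435 = lerp(435.0, 400.0, 2_450.0, 500.0, 2_300.0)
print(f"{payload_435:.1f} kg")   # -> 2397.5 kg with these made-up rows
```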
 
Here’s a quick summary of each launch vehicle’s operation:
Alpha 5: This was the highest-priced launch vehicle, but its low insurance rate, high-capacity CubeSat dispenser, and ability to make two orbit altitude changes made it an attractive option. However, the vehicle's acoustic load (130 dB) exceeded the satellite's max allowable load (125 dB), so acoustic blankets were needed to protect the satellite. A shock kit was not required.
 
UpGoer 1: This was the lowest-cost option of the four vehicles available for the mission. It did not need a shock kit or acoustic blankets. However, it had the lowest performance: it could not change its orbit altitude, and its CubeSat dispensers were limited to only 18U. It was also considered to be high risk and carried a high insurance rate.
 
Himmelsflieger X1: This was the second-lowest-priced launch vehicle. However, the vehicle's acoustic load (135 dB) exceeded the satellite's max allowable load (125 dB), so acoustic blankets were needed to protect the satellite. The vehicle's shock load (2,900 g's) also exceeded the satellite's max allowable load (2,500 g's), so a shock attenuation kit was required as well. It could make only one orbit altitude change. It provided two 36U dispensers for the CubeSats and had a medium insurance rate. There was a large cost to ship the satellite to French Guiana.
 
Sutātoraberā H2: Its low insurance rate made it a very attractive option. It did not need acoustic blankets, but the vehicle's shock load (2,700 g's) exceeded the satellite's max allowable load (2,500 g's), so a shock attenuation kit was required. Its four 24U CubeSat dispensers and its ability to make two orbit altitude changes added to its appeal. However, the cost to ship the satellite to Japan was very high.
 
Here’s a summary of the teams’ launch vehicle choices: Alpha 5 – 28; UpGoer 1 – 8; Himmelsflieger X1 – 8; Sutātoraberā H2 – 73. 

Here is the best launch vehicle option and the best possible score for this portion of the scenario:
 
Use the Sutātoraberā H2 launch vehicle. Install a shock kit (acoustic blankets were not required). Install four 24U CubeSat dispensers. At 435 km, deploy four 1U CubeSats, five 3U CubeSats, and two 6U CubeSats. Raise the orbit to 850 km and deploy five 1U CubeSats and four 6U CubeSats. Raise the orbit again, this time to 1,250 km, and deploy six 6U CubeSats. Final launch cost: $35,628,000. Score: 31.860.
There were a couple of additional CubeSat combinations that achieved this score as well.
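A quick volume check on this plan, assuming total U-capacity is the binding constraint (the scenario may also impose per-dispenser packing rules):

```python
# Volume check: four 24U dispensers give 96U, and the plan uses exactly 96U.
DISPENSER_CAPACITY_U = 4 * 24    # four 24U dispensers on the Sutātoraberā H2

deployments = {                  # altitude (km) -> list of (count, CubeSat size in U)
    435:  [(4, 1), (5, 3), (2, 6)],   # 31U
    850:  [(5, 1), (4, 6)],           # 29U
    1250: [(6, 6)],                   # 36U
}

total_u = sum(count * size for plan in deployments.values() for count, size in plan)
print(total_u, "of", DISPENSER_CAPACITY_U, "U used")   # -> 96 of 96 U used
```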
ACADEMIC QUIZ
As expected, the Semifinal Round had the best quiz scores of the season, since this round includes the top 117 teams. The quiz score average was 19.359 out of 20, while previous qualification rounds typically averaged ~16.8. Seventy-seven teams scored 100% (20 out of 20).

TOTAL SFR SCORES
The highest possible TOTAL Score for SFR (satellite design + launch operations + quiz): 108.417. No one reached this score.
However, one team was very, very, VERY close with a score of 108.416!
Average SFR Total Score: 70.753.

A FINAL THOUGHT
 
The Semifinal Round of StellarXplorers is an “all or nothing” event. All participating teams start with the same score – zero – but only the ten teams with the highest scores advance to the National Finals. If a team really wants to reach the Finals, it needs to “go for it” by pushing for a solution as hard as possible. We are not recommending that teams be foolish, but that they become a bit more assertive in developing a solution. We observed many instances where teams had hundreds of kilograms of unused lift capability remaining in their launch vehicle. Teams should push to use all that capability while retaining a safe margin. Playing it “safe” will probably result in a good score, but not a top-ten score.
 
Following this approach can result in an amazing finish to the season. For example, there are teams that entered the Semifinals ranked between 70th and 80th place that will be qualifying for the Finals. It can be done! So, master your interpolation skills, learn to calculate fuel usage, know satellite power requirements, develop CubeSat deployment options, and be a bit more aggressive, and we just might see you at next year's National Finals.

The top ten teams with the best SFR scores will advance to the National Finals Event in Denver. We plan to notify all teams of their status by Friday, February 27, and will post the combined scores on our website at that time as well.
 
For the teams that do not qualify for the National Finals, we hope you learned about space operations and, more importantly, had fun competing. We encourage you to continue learning about space and aerospace engineering; as we hope you can see, it's a challenging and exciting field. Have a great rest of the school year and a relaxing summer. We hope to see you all again next year for lucky StellarXplorers 13! Registration for the new competition season will open on May 1st, so mark your calendars!

IMPORTANT REMINDER – DESTRUCTION OF STLX COMPETITION MATERIALS
One last reminder: please destroy ALL StellarXplorers materials you received during the practice, qualification, and semifinal rounds this year. This includes all competition scenario booklets, team-created files, SFR STK VDF files, round score sheets, round emails, and team-produced round documents. All must be collected and destroyed or deleted prior to March 6, 2026.

Continuing to hold onto materials from previous rounds, after the start of a new round, is a violation of the StellarXplorers Rule Book. Help us to ensure the integrity of the Competition by complying with these directions.   
PROCEDURAL OBSERVATIONS
Team Numbers
Your team number (STLX12-XXXX) is very important during the StellarXplorers competition. It is how we track your results, store your files, and most importantly grade your solutions.

In SFR, six teams generated six instances of incorrect team numbers on either the start form or the scenario component selection form.

Our automated grading system looks for your team number to launch the scoring process. It looks for precisely 11 characters (STLX12-XXXX) to find and score your file. If there are more or fewer than these 11 characters, the grader will abort the scoring process for all teams, and we must manually repair the incorrect team number. For example, say your team number is STLX12-0987. Typing anything other than STLX12-0987 (even accidentally putting an extra space in front of the team number) will cause the grader to abort. The team number format is case sensitive, so be sure to use UPPERCASE letters.
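As a sanity check before submitting, a team could run its own format validation along these lines (a sketch of the documented behavior, not the grader's actual code):

```python
import re

# Exactly 11 characters: "STLX12-" plus four digits, uppercase, nothing extra.
TEAM_NUMBER = re.compile(r"STLX12-\d{4}")

def is_valid_team_number(value: str) -> bool:
    """True only for an exact STLX12-XXXX match with no stray characters."""
    return TEAM_NUMBER.fullmatch(value) is not None

assert is_valid_team_number("STLX12-0987")
assert not is_valid_team_number(" STLX12-0987")   # leading space -> grader aborts
assert not is_valid_team_number("stlx12-0987")    # lowercase fails: case sensitive
```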

With regard to VDF uploads, it's important that teams follow the directions provided for uploading and naming their files; otherwise, the grader will fail. The upload process generates the following unique file-naming convention for team STLX12-0987: FirstName_LastName_UploadedFileName (e.g., QR1_STLX12-0987_QR1_STLX12-0987.vdf). The characters before the first underscore are whatever was entered in the First Name field, the next string of characters before the second underscore comes from the Last Name field, and the remaining characters reflect the actual uploaded file's name.
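To make the convention concrete, here is a tiny sketch of the naming logic (our reading of the documented behavior, not the upload portal's actual code):

```python
def upload_filename(first_name: str, last_name: str, uploaded_file: str) -> str:
    """Build the stored name: FirstName_LastName_UploadedFileName."""
    return f"{first_name}_{last_name}_{uploaded_file}"

# A team that enters "QR1" and "STLX12-0987" in the name fields and uploads
# "QR1_STLX12-0987.vdf" produces the example name from the paragraph above:
print(upload_filename("QR1", "STLX12-0987", "QR1_STLX12-0987.vdf"))
# -> QR1_STLX12-0987_QR1_STLX12-0987.vdf
```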
 
To help you avoid this mistake in the future, view the “Penalty Paper” to see a list of all the wrong ways (and the only right way) to provide your team number. This document also outlines other common penalties and how to avoid them. If you need clarification, contact [email protected].
PENALTY PAPER PDF
In all future rounds, teams will be penalized 5 POINTS each time they submit something with an incorrect team number. This includes the start form, academic quiz, solution upload/submission form, and/or VDF filename. It is imperative that you provide your team number in the correct STLX12-XXXX format.
 
And as always, include your team number in the subject line of all messages to us. It makes it easier for us to find you in our database and to file the information that you send to us.
 
Running Systems Tool Kit (STK)
As emphasized in prior communications, teams should load STK 13.0 Premium (which includes the Planetary Data Supplement) onto their computers and test that the software is operating successfully BEFORE STARTING THE ROUND. By “before” we mean days before, and also immediately prior to starting. We cannot guarantee staff support for last-minute technical issues, and teams will not be given any time relief if they do not have STK running on their computers at the start of the competition period. If needed, sample VDFs are available with our Training Materials, and can be used to test the software.
 
For those still needing licenses, please reach out to the StellarXplorers Program Staff as soon as possible to request them.

Changing STK Fixed Variables
Teams should not change any of the Fixed Variables in STK. These variables include but are not limited to: Start Time, Stop Time, Propagator, Step Size, Coord Epoch Time, etc. Additionally, teams should not change any component on the satellite, a target, or a facility on the ground. We use a team’s VDF to determine that team’s score. Therefore, these variables must remain the same so that we can fairly evaluate every team (this is why we impose severe penalties if these items are changed). Teams should only change the orbital elements of the satellite.
 
Several teams changed the Orbit Coord System and the Propagator. These changes can happen when a team uses the STK Orbit Wizard. If you use Orbit Wizard, be sure to check that the Fixed Variables, as shown in Appendix I of the scenario booklet, have not been changed.

Incomplete Submissions
One team failed to submit their scenario component selections required for StellarXplorers staff to score their solution. Three teams failed to submit their quiz, leaving up to 20 points on the table as part of their overall score.
  
StellarXplorers Space Foundations Course (Nova Space)
Please make sure your team has access to the Nova Space learning platform BEFORE starting the round. Utilization of this course is instrumental in preparing for the academic quiz. If you need help accessing the course, please contact the StellarXplorers Program Office.
 
Confirmation Emails
Team directors will receive automated confirmation emails upon submission of a team’s start form, scenario component selections (in Round 2, Round 3, and the Semifinal Round), and academic quiz (when applicable). We are unable to provide automated confirmation emails for receipt of VDF submissions, but the team will receive a confirmation message on the screen once the file is uploaded successfully. 
 
Confirmation emails are sent to the email address provided in the “Team Director Email” field of the forms. To ensure there’s no delay in the delivery of these automated emails, competitors should pay close attention to the email address they are providing. There were several instances of incorrect email addresses this round.
 
SFR Prep and Start Emails
SFR’s Prep Email will be sent out at 1:00 PM ET on Friday, February 6 and SFR’s Start Email will be sent out at 12:00 PM ET on Wednesday, February 18 (the day prior to the start of the competition window). To prevent delays in preparing for, or starting, the round, please contact [email protected] ASAP if you haven’t received the email by 2:00 PM ET on either day.

Finalizing Team Rosters
Team rosters are now final. No changes can be made to the roster other than removal of a member for the remainder of the competition.
