The Earth's only visible source of energy is the sun itself. (In Greek it means "sun" or "heaven.") According to the Old Testament story, Jupiter was so cold that it was able to cause the death of children when they died in a ship. The only real star in the solar system that is capable of causing death is the sun, which must be one of the most powerful stars in the universe. Only the moon can cause death from its star at once, and Venus must be at least one of the most powerful star systems in the entire galaxy (more details here). Earth was never seen as an "open" planet.
Earlier this month, OpenAI released a new text generation model, called GPT-2. GPT-2 stands for “Generative Pre-Training 2”: generative, because we are generating text; pre-training, because instead of training the model for any one specific task, we’re using unsupervised “pre-training” such that the general model can perform a variety of tasks; and 2, because it’s the second model using this approach, following the first GPT model.
TLDR: The model is pretty good at generating fiction and fantasy, but it’s bad at math and at telling jokes. Skip to the end for my favorite excerpts.
Model Overview
The GPT-2 model uses conditional probability language modeling with a Transformer neural network architecture that relies on self-attention mechanisms (inspired by attention mechanisms from image processing tasks) in lieu of recurrence or convolution. (Side note: interesting to see how advancements in neural networks for image and language processing co-evolve.)
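For intuition, here is a minimal NumPy sketch of single-head, causal scaled dot-product self-attention, the core operation the Transformer uses in place of recurrence or convolution. This is only an illustration with made-up toy dimensions, not GPT-2’s actual implementation (which stacks many such heads and layers, plus token embeddings, layer normalization, and feed-forward blocks):

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def causal_self_attention(X, Wq, Wk, Wv):
    """Single-head scaled dot-product self-attention with a causal mask,
    so each position can only attend to itself and earlier positions
    (as required for left-to-right language modeling).
    X has shape (seq_len, d_model)."""
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = Q @ K.T / np.sqrt(K.shape[-1])
    mask = np.triu(np.ones(scores.shape, dtype=bool), k=1)  # future positions
    scores[mask] = -1e9                                      # masked out
    return softmax(scores) @ V

# Toy example: 4 tokens, model width 8.
np.random.seed(0)
X = np.random.randn(4, 8)
Wq, Wk, Wv = np.random.randn(8, 8), np.random.randn(8, 8), np.random.randn(8, 8)
print(causal_self_attention(X, Wq, Wk, Wv).shape)  # (4, 8)
```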
The model is trained on about 8 million documents, or about 40 GB of text, from web pages. The dataset, created for this model, is called WebText; it was built by scraping outbound links from Reddit with at least 3 karma. (Some thoughts on this later. See section on “Training Data”)
In the original GPT model, the unsupervised pre-training was used as an initial step, followed by a supervised fine-tuning step for various tasks, such as question answering. GPT-2, however, is assessed using only the pre-training step, without the supervised fine-tuning. In other words, the model performs well in a zero-shot setting.
First Impressions
When I first saw the blog post, I was both very impressed and also highly skeptical of the results.
One of my first thoughts was, is it just regurgitating large chunks of training data?
The Appendix of the paper states that there is definitely some memorization behavior in the GPT-2 model, particularly for long strings that appear many times in the training data, such as famous quotes or speeches:
“For example, when conditioned on the first sentence and a half of the Gettysburg Address (which occurs approximately 40 times throughout WebText), an argmax decode from GPT-2 recovers the speech. Even when sampling without truncation, we find that the model copies the speech for awhile before drifting, albeit in a similar style.”
The paper measures how often exact memorization shows up in the generated text, comparing the generated text to the original texts. (See: Appendix 8.2 Text Memorization)
“To quantify how often exact memorization shows up in samples, we generated samples from GPT-2 conditioned on WebText test set articles and compared the overlap rates of GPT-2’s generations to the overlap rates of the ground-truth completions. The results … suggest that GPT-2 repeats text from the training set less often than the baseline rate of held-out articles.”
That being said, it’s a bit unclear to me what is meant by “ground-truth completions”… do they mean the original text? My interpretation is that the generated text is not an exact memorization of training data, but there is no clear metric for how much overlap there really is. Saying that the generated text “repeats text from the training set less often [than] the baseline rate of held-out articles” may not mean much if the held-out articles are still pretty similar to the training set. I wish the paper had elaborated on and clarified this section more.
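The paper doesn’t spell out the exact metric in that appendix, but one simple way to quantify this kind of duplication is to measure what fraction of a sample’s 8-grams also appear in a reference text. Here is a rough sketch; the whitespace tokenization and the choice of n=8 are my own simplifications, not necessarily what the authors did:

```python
def ngrams(tokens, n=8):
    """All contiguous n-grams of a token list, as a set of tuples."""
    return {tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1)}

def overlap_rate(sample, reference, n=8):
    """Fraction of the sample's n-grams that also appear in the reference
    text -- a rough proxy for verbatim copying."""
    sample_ngrams = ngrams(sample.split(), n)
    if not sample_ngrams:
        return 0.0
    return len(sample_ngrams & ngrams(reference.split(), n)) / len(sample_ngrams)

generated = "Four score and seven years ago our fathers brought forth a new idea"
source = "Four score and seven years ago our fathers brought forth on this continent"
print(overlap_rate(generated, source))  # 0.5: half the 8-grams are copied
```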
“Stress” Testing and Examples
I was, naturally, very excited to play with the model and see what I could get it to spit out given various inputs. Since GPT-2 is a generalized model, I would expect it to respond better to some inputs than others. By “responding well,” I mean: does the output follow what I’m prompting it with? Does it make sense in a human-readable way? If I ask it a question, does the output resemble a response? Will it complete my sentences? Does it stay on topic? etc. It’s also fun to guess what kinds of data may have been used to generate the output text, e.g. is the model relying on news articles? Fan-fiction? Forums? Game manuals? Speech transcripts? Personal blogs? Stack Overflow? etc. Where are we in web text space?
While OpenAI chose not to release the full GPT-2 model (See: Release Strategy) due to ethical considerations, they did release a smaller, 117M parameter version of GPT-2. This is the model I’ve set up to play with, using the interactive prompt.
Note that this is not a thorough scientific analysis. Rather, these explorations are simply personal inquiries to test how it might respond given various input prompts I considered fun or interesting. (Shoutout to my Recurse Center co-conspirator Jamie Brew for thinking of fun prompts with me!) Any assessments made of how well the model responds are also not particularly scientific.
The examples below will have the input prompt in bold, followed by the generated text, not in bold. I also arbitrarily chose cut-off points for the generated text, for the sake of readability. I did not, however, modify the output format (whitespace, newlines, etc.). These sections are very loosely ranked by how well I felt the model responded, from better to worse.
Fiction
e.g. fantasy, fanfiction, text posts from RPG forums, mythology. The model generally responds well. One caveat is that sometimes sci-fi veers into actual science texts. I also enjoyed plugging in tweets from the Magic Realism Twitter Bot.
Something I was impressed by is that, if there are character names in the input text, the generated text will usually use the same character names. It also feels like there is somewhat of a plot progression within the generated text.
"Eh, she's too old," said Stannis suddenly for the first time.
He could feel the heat in his hands, the hot sweat flowing down his skin, it appeared that he was experiencing a slight of panic. This was almost, a small but definite panic. No, what he felt was a fear, it was a fear that he had now turned against. His mother never said these things so casually--but he never knew that they were so hard to hide. In this world, no one was going to come home unless we knew why, and if we did it, there were only a couple of things that would prevent that today. Cairne did not know how many of them were of some kind. The first step was to tell her it wasn't all that he thought she ought to be afraid of.
"This is why he hasn't been so afraid," Cairne II continued. "How about we come in now and take a step back?" he demanded.
She hesitated for a moment, looking back. "I promise no one will come for my baby and she will be ok." she said, and then said, almost nervously: "Do you think you've done a good deed in the past by taking out that little boy, just because you say you want to help a man with it?" she pressed.
After Icarus returned to the planet in the hope of finding a new father, Icarus decided to give up, and started searching for a new home for his brother. He wanted Icarus to teach him new manners, and that he would be safe in the skies of New Eden. However, when he returned, he found that his son had died too late, and his brother died too fast. Icarus, after a long search through the sky hoping to find another family, fell with the debris and went into space.
"I'm going for a swim. It will be fun."
The two of them made their way to a small lake. Roshan could feel her breath coming in. A small river flowed from the middle of the lake. It had water coming out of it. She looked up and saw the river coming in at an almost normal pace. She could see it moving slowly, and Roshan could feel the water moving up and down. She looked down to see her own body moving slowly and then slowly and then slowly and then slowly.
"You look tired."
"I'm fine."
She felt that it was time to get up again. She had to get out now. She looked at the water again. It was a small lake with a small creek going all the way up to it. She could see the lake coming in slowly, then it was just as it started to move.
She looked to Roshan. There were still several small rocks and rocks and rocks, but she could see more water coming in. It was clear water, but she could see the water was coming from some places. It felt pretty heavy, but not like she had seen it before. It was a bit like a river. There was a small waterfall coming in from the middle of it in the middle of the lake, and it was a nice stream running down from it. She could see the water running down into another stream that was just as tall as her but still smaller in size. It was like a river running over the top of the lake in one go.
"I'm glad you're okay."
"I'm not really."
"I'm fine, okay. I'll try to sleep and get back here."
"I have to go, okay?"
The sound of the two of them swimming through the water and through the lake gave her something to be grateful for. She looked up at her partner and realized how much the water was coming into her. It was coming into her. She could feel her heart pounding with the same excitement she had had before. "I feel great. I've been so far away."
Magic Realism Bot Tweets
One lady at the dinner, who asked to remain anonymous, said she never expected the letters. The lady did not want to have to worry about anything embarrassing.
It would have been just three days that the woman would come and meet the man whom she believed was her husband, but when the woman said so, the man went away.
At the dinner, she learned, the wife had put the letters on a piece of paper in the hallway, and left them.
The woman was shocked. She said she didn't know how the note made its way to the man. But she didn't suspect any sort of criminal activity. "So far, I never heard anything suspicious," the woman said. "I don't know if she ever saw the note or the man. But I did know that there was something unusual about it."
On the man's birthday, the woman says he told her he worked for two banks, but she said he said only that the men could not be trusted. The man promised to go to New York, and said he would not be allowed to work there.
The friend of the woman who wrote the fake note said he called the woman a liar. He never called back.
Gifts are given at home or at school that can be very effective in helping you get to where you want to go, because it can be so simple for people to understand the power of an extra source of energy for all of their needs. Many of us who seek education and training can't help but find such gifts to be so helpful to us. One of the most important gifts is the gift of a strong sense of humor. The great power of laughter is that it is an almost absolute universal feeling. When we laugh, our whole body is laughing and our laughter is making the world better. By getting out of your chairs or going back to your room to watch television, we are more likely to be able to hear ourselves laughing in the night or as we are in the throes of laughter, if we don't get excited to be on television as quickly as we might like. One way to get this level of confidence that even one person can acquire with so little effort is to work on getting around the house.
Fake URLs
e.g. given the start of a URL, finish the rest of it. My favorites are the fake Wikipedia articles. The model does maintain the structure of the URL, though typically none of the links work.
Foreign languages
In Spanish: “Once upon a time, there lived a…”
Río por nos esta lo lo hay que una español.
Ese visto se lo que cabeza, quando lo más que tuá se recibo tiene su poca, pueda lejeroy en veso donde vío segunda, que quí una mós con lo ejería, lugar con lo más con lo más donde y que tuá había.
In Chinese: “My name is…”
Answering Questions
Carmen Sandiego's favorite dance is called "I Don't Know" by the band I Love, which was created by a friend of theirs, and it's one of their best-selling songs. The song is called "The Dance That Changed My Life." It's a beautiful dance song, but the chorus is really very catchy and catchy. So she likes dance music. She doesn't really have to go into the club to do it, so she just does it. She likes classical music and she's always dancing around it, so she just enjoys it.
Carmen Sandiego was a member of the band, "I Like Dance Music." So the group, I Love, was formed by a friend who was a dancer for an old school dance company and they had a girl named Anna in them, who was also a huge fan of classical music, which she loved. So, I Love, was one of the first dance parties to open. They were just like, "Oh, Carmen Sandiego loves to dance," and we were like, "We've had Carmen in our party." It was just, "Oh, Carmen Sandiego loves to do it." We were like, "Cool."
But I've also noticed a certain amount of lack of optimism as I go on projects. As soon as I do something, I feel my body start to tighten up. In fact I find myself saying, "Here I am, doing this project." I don't want to feel any anxiety. I just want to see myself again in a new way. I feel like I haven't even thought about the possibility of going into a new program. I'm so over-excited. I think about it all the time. My body is really getting over-excited. It's completely ridiculous. It's going on all the time.
Question: Why didn't you say it last time? Answer: It's not as important today.
Question: So, you're thinking about the last time I saw it in a theater? Are you kidding? Answer: Not many times in my life. I know it's been here before, but I have no idea if I'd say it again. You could make this movie for a few moments today while the rest of us watch it until the last second. Or, you could do with the entire show (there are no subtitles in it!) and just take it for what it is. Because, I know, I was a bit skeptical with this idea, but I think that maybe it was because I had a lot of questions about the future of movies. So, with that in mind, let me give you the first bit of information that I may tell you, after seeing the movie:
In the movie the girl has never seen the movie before. It is said. The film has been going on for a thousand years. We are told she is a magician. We don't know where she got her magic spell from. So in all respects this is not a fairy tale, so don't believe it, I promise. But, my best guess is she saw it just one time and didn't look into the magical aspects of it. After that, the story ends where the character leaves the show.
So… Well I think she got the spell, and the magic spells were still in place. And, in the end, she says, "I'm all right, I have hope for you" at the end.
[On screen, there, the screen shows a scene with the character standing and taking some of the magic items that were there. They are a "bewildering wand" and "glove spell" that you have taken from an item in the world, and there are some words: "the spell-shepherds-of-hell" and "sorrow-for-a-witch"]
Interviews with Musicians
Input prompt is from a Fader interview with Mitski.
Song Lyrics
Then, like my heart's beacons of light, I pass over to the light that strikes like a golden egg, which I am not quite sure I got before. In the dark they will fly into the garden and not see me.
My footsteps will go through the air like a storm breaking through dust, leaving a cold shiver down my spine. Then I can walk on the road on all four sides, all around you while you are still there.
You look at me, and wonder how little you knew before. The sun shines on me, now, and I am so lost here.
[6]
It is not just now that I am still here, I am still alive, I am still alive.
Lists
I like to build to the point of having a standard set of connections that have only 6 pins attached and that aren't tied together. This is not a problem for the motherboard as no matter what size it is assembled, it's going to be very difficult to get each part down to the point where a single connector would be needed. This is not going to be an issue for your laptop and in fact, I have more serious problems with this when I want to install different applications in multiple locations. This will still work just as well as any other case with the same components. I think most people would be OK with this, but I personally prefer one that has multiple connections where the pins sit on the back. It's also a bit less fun to use if you use two components that can't be connected together for your laptop but it's not a big deal. For example, I really like this laptop as it's one of the lightest laptops out there with a lot less power compared to the larger i3/i5 I've been using.
10. Rollercoaster 2 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22 23 24 25 26 27 28 29 30 31 32 33 34 35 36 37 38 39 40 41 42 43 44 45 46 47 48 49 50 51 52 53 54 55 56 57 58 59 60 61 62 63 64 65 66 67 68 69 70 71 72 73 74 75 76 77 78 79 80 81 82 83 84 85 86 87 88 89 90 91 92 93 94 95 96 97 98 99 100
Loading... Loading...
Quotes are not sourced from all markets and may be delayed up to 20 minutes. Information is provided 'as is' and solely for informational purposes, not for trading purposes or advice.Disclaimer Rolltime Total
A browser error has occurred.
Dialogue
The original intention was to see if the model could complete the comic strip panel.
Sewing Instructions
As you can see from the instructions on this post there will be a few more points after finishing the stitch line. Using tape measure, you have to take apart the top of the ribbing to create the back and the top of the ribbing. This will help keep the stitches very tight. However that isn't the hardest part of sewing or stitching again so the ribbing will get bigger and the lining will get thinner. You'll need to make sure your sewing needle is properly cut to fit your ribbing and to get the fabric to fit the ribbing with your head. When the bottom of the ribbing is sewing the seam across from your neck and the top is sewing the bottom of the ribbing and top of the ribbing through the top of the ribbing and the fabric. If you plan on using the same fabric for another pattern then you'll need to trim off the fabric a bit.
Step 6: Using the tape measure on, sew the fabric around the size of the ribbing. Cut at 2mm around the top of the ribbing to create the back. Cut into about 9 pieces. After cutting out 4 pieces for the ribbing make sure to cut it so your back loops around the ribbing to allow it to come down. Then sew the back of your neck seam through the top of the ribbing to make the fabric. Once the fabric is cut it will sit perfectly to the side and it will cover the top of your neck and neck lines. In the picture you can easily see that this will make the line run out just a little bit and the line will not feel so tight. With no problems you will cut to length on the ribbing again.
Schedules
12:00 AM: Wake up, get ready.
12:00 AM: _____: Yeah, I know. It's so bad, really bad. It's like, we need something better to live at home.
12:30 AM: Wake up, get ready.
I've got some other big stuff to show you, but I want to show you all the fun stuff, and that's actually gonna help you out a lot. But first, let me just get you this idea: what do zombies do all day? What does he do when they've taken over your house? What does he do when someone in his family starts calling you to break the ice?
1:00 PM: He's an archer. That's it.
1:30 PM: He's an ant farmer. That's it.
2:00 PM: He's an ant hunter. That's it.
2:30 PM: He's an ant biker. That's it.
2:30 PM: He's an alpinist. That's it.
3:00 PM: His daddy's in, I don't know who that is.
3:30 PM: He's an ant burglar. That's it.
3:30 PM: He's an ant guard. That's it.
4:00 PM: He's an elk hunter. That's it.
5:00 PM: He's an elk ant farmer. That's it.
5:30 PM: He's an ichthyologist. That's it.
6:00 PM: He's an ant librarian. That's it.
Math
(a = 1)
(c = 2)
(b) 1 = 2
b = 1 (the two numbers of 2)
(c+2=3)
(b=2)
The following two numbers are used to define the number 2, which can be divided to the same value, and given a number (2) as 1:
b = 2
c = 3
(a = 1)
(b = 1)
Completing Mathematical Sequences
Fibonacci sequence
Now, let's go deeper. I had to go deeper to find these things, because I'm not sure about the others before me. First, if you search for "The Last Temptation of Mary" on Google, you can get just about any book, whether you're a scholar or not. If your primary goal is to find "the final Temptation of Mary", then here are some great and valuable books that have made the rounds for me in the last year. First of all, James Baldwin, American Psycho, (1942), (1945), "The Girl Who Cried Alone" (1949), (2005), The Girl Who Sold the Soul (2009), and the best-selling novel of all time, Temptation of Mary . I'll be giving the second book, "God's New Friend: the Life and Art of the Mary Sollom (1942)." I don't have anything but good things to say about how I'm approaching this thing.
Prime numbers
B. Cote, "Torture," 5th ed., p. 31.
C. Crocker, "Evaluation of Torture Under International Law," 5th ed., p. 27 and 6, 9. A. L. Denton, ed., "U.S. Tort Claims Under the Fourth Geneva Convention," The Public Interest Law, vol. 3, No. 5, 1979, pp. 633-644 [p. 3]; and D. C. Wilson, The Torture Statute, 2nd ed., Chicago: University of Chicago Press 1983, p. 14.
D. M. A. Pérez-Cabrera, ed., The Torture Law, 10th ed., Chicago: University of Chicago Press 1982, pp. 12, 13.
Finishing Reductress headlines
Reductress is a satire news website. The original headline reads: “Avoid First-Date Disappointment By Rescheduling Until Never.”
Answering Riddles
We've all heard the same thing about the head, as if it's the one that's out there, but it gets the best of me.
Is your son in the same boat and the head, it's the same? No, that's just a question. Not that I get the same questions.
Jokes
Oh, fuck! Wait, there might one more thing. Look, if they're out there, they've got no chance for help, right?
The evidence suggests these were not just random actions.
The investigators found that at least 30% of men, women and children in the United States are exposed to contaminated water as children, and that their bodies were exposed to such pollution by humans and other species. In addition, the investigators found that human waste that was taken over by birds could cross the road, but that there was no evidence that it could be seen.
So that means if you know something is going on, can you tell the animal from the humans?
You can't. You just can't. And that's why, when you want to get caught, it would be so simple.
True or False Questions
l33t
General Thoughts
Some small changes in the input format can also make a big difference in the output. For example, when given a line of map directions (“Head north on Pearl St toward Willoughby St”), the output did not resemble directions.
But if I number the directions, the output does better resemble directions (“1. Head north on Pearl St toward Willoughby St”).
7/9pm
9:46am
PITTSBURGH
Eating & Drinking, Parking, Parking
9:45am
1st Floor
3rd Floor
Museum
Fresno-Fresno
10 pm – 10:05am
Southeast End
Pasadena, CA 94101
1stfloor
Avenue
Pasadena, TX, 94101
1st Floor – 7:30pm
Park at the new restaurant "Fresno's Tavern" (formerly known as "St Patrick's"). Parking for your table varies:
10AM – 12PM
13 – 1AM
4 – 6PM
5 – 9PM
9am – 1PM
10AM – 5PM
11AM – 4PM
2PM – 8PM
11pm – 4PM
Exhibits & Pubs
E-ZPass with Bluetooth
E-ZPass is now available on the iPhone/iMessage feature in the App Store. You can buy the E-ZPass for $19.49 (iOS 4.3+) and $29.99 (Android 6.0+) for $5. In addition, you can also buy a digital copy of the EZPass here. E-ZPass also allows you to view the app with your iPhone or iPod touch by using an external web browser.
Other features of the E-ZPass include:
Luxrated Mode
Auto-Lux Reviewing
Automatic Review
Automatic Reviews for E-ZPass
E-ZPass for iPhone 6
Free Shipping / Shipping to Puerto Rico
Lipstar Express
Direct Delivery
For orders over $100.00, use the coupon code ROOF for 6 days.
You also get exclusive E-ZPass "Bike" service.
I do wonder what factors go into why the model responds well to some types of input rather than others. How much of it is attributable to the model architecture vs. the training data? How does the input format and structure influence the output?
Training Data
GPT-2 is trained on a 40 GB text dataset called WebText, containing scraped web data that supposedly emphasizes document quality by focusing on web pages that have been curated to some extent by humans. As a heuristic for this, OpenAI scraped all outbound links from Reddit that had at least 3 karma.
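To make the heuristic concrete, here is a tiny sketch of the kind of filter described above. The field names and threshold logic are my guesses at what such a pipeline might look like; the actual WebText collection code has not been released:

```python
def keep_outbound_link(submission, min_karma=3):
    """Keep a Reddit submission's link only if it points off-site and the
    submission has at least `min_karma` karma (score)."""
    url = submission.get("url", "")
    is_external = url.startswith("http") and "reddit.com" not in url
    return is_external and submission.get("score", 0) >= min_karma

# Hypothetical submissions, for illustration only.
submissions = [
    {"url": "https://example.com/interesting-article", "score": 12},
    {"url": "https://www.reddit.com/r/news/comments/abc123/", "score": 50},
    {"url": "https://example.org/obscure-post", "score": 1},
]
urls_to_scrape = [s["url"] for s in submissions if keep_outbound_link(s)]
print(urls_to_scrape)  # only the first link survives the filter
```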
However, we must acknowledge the inherent bias in this: Reddit’s user base is mostly white men, so that group makes up the majority of the users judging which web texts are considered interesting or good quality. While I understand data curation is a major undertaking, and that heuristics make this task much easier to tackle, it would be inaccurate to state that the WebText dataset is inherently neutral.
Small vs. Full Model
While I didn’t have access to the full GPT-2 model, the Appendix in the paper does provide comparisons of the two.
At first glance, the outputs between the two may look similar. They stick to the same topic and use a similar tone. However, if you read more closely, you might see that the smaller model usually has about one sentence per paragraph that sticks out, is off-topic, or is nonsensical.
While the full model tends to be more coherent and is probably more useful for practical applications, the smaller model contains more “surprises” and is therefore much more fun to play with from a creative standpoint. :-)
Limitations of the Interactive Prompt
For these examples, I stuck to using the interactive prompt interface. Because of this, I wasn’t able to give it text input that contained line breaks without having to edit the code.
I would be interested in modifying the interface so that it can take in line breaks, as I’m curious how the input format and structure influences the generated output text. For example, poetry tends to be multiline, but I would have to smush it into one line in order to feed it into the model. As a result, the output ends up reading like fiction.
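As a concrete example, the interactive script reads the prompt with a single input() call; a small change along these lines (a sketch only, not tested against the actual repo) would let it accept multi-line prompts, terminated by an empty line:

```python
def read_multiline_prompt():
    """Collect lines from stdin until an empty line is entered, preserving
    the line breaks so multi-line inputs (e.g. poems) keep their structure."""
    print("Enter your prompt; finish with an empty line:")
    lines = []
    while True:
        line = input()
        if line == "":
            break
        lines.append(line)
    return "\n".join(lines)

# This would replace the single input() call that reads the prompt text
# in the interactive sampling script.
raw_text = read_multiline_prompt()
```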
Input is the beginning of Mary Oliver’s Wild Geese, concatenated into a single line:
Lena paused. "I'll try to persuade you all to get along well with me." She spoke softly again.
The voice in her head went dark, but she could hear it.
Lena heard her heart pounding, her mind wandering, her face brightening, her blood pumping. The feeling was like a light beating.
"What is despair?" The child whispered.
Lena took a deep breath. It was as if the world had been in touch with her. Slowly but surely.
The light filled the back of her mouth with a wet, blood-red. She saw the tiny kitten in the corner. She felt like the heart of a child holding its baby and crying. She felt the ground beneath her. The cat.
The young girl felt the blood. She felt the warmth of her own skin on her forehead. She felt the warmth of the air in her own heart as she held on to the kitten. She held it until she finally felt it in her chest, then held it up to her face.
It was so warm, her heart was shaking, her legs were shaking. She held the kitten for dear life and loved it so little that she couldn't help thinking about it. She felt it in her heart, she felt it.
She would be forever a stranger in the world of despair in the future.
The dark child slowly, a little deeper in herself, spoke again, softly.
"I love you," she spoke again. "I love you."
One of her hand pressed on the kitten's heart, and she lifted it up and gently released her own body.
It would also be interesting to see how paragraph structure, and white space in general, in the input text influences the generated output. This could apply to inputs such as poetry (not just multiline poems, but also experimental or visual poems), ascii art, movie scripts, transcripts of dialogue, lists, etc. There are so many ways in which whitespace and newlines influence the tone and structure of text, so this could be an interesting area to explore.
Content Filtering
Note that there is no content filtering for this model, and it can and has generated offensive content. I would love to build some fun tools or web apps with this model, but without a content filter, it’s very possible for the generated text to contain serious, controversial, political, and/or triggering content. This is a nontrivial point of consideration for any fun, publicly usable tool!
My Favorite Excerpts
Here are some of my favorite excerpts I’ve read from my explorations :-)
They come from what appears to be an isolated and very secretive sect of the religious world in New York City.
Notorious for their long-standing ties to government corruption and government involvement in criminal affairs, these people have long claimed that they can manipulate the minds of the world's most powerful leaders for the benefit of their own needs.
Great gifts!!! Thank you so much for the great gifts!!! Great gifts!!! Thank you so much for the great gifts!!! Thank you so much for the great gifts!!!
Great gift from my wife!! Great gift from my wife!!! Great gift from my wife!! Great gift from my wife!! THANK YOU!!!!!
Thank you so much for the gift from my husband. My husband loves this gift from our son! I have been in the same room for over 2 years now and love the way it has been wrapped and kept warm inside! Thank you so much!!!
Thank you so much. Thank you so much. Thank you so much for the wonderful gift! I am a huge hugger for these gifts from my husband. I am a huge hugger for these gifts from my husband. I have been in the same room for over 2 years now and love the way it has been wrapped and kept warm inside! Thank you so much!!!!
I love it! Thank you for the great gifts. Great gifts!! Great gifts!! Thank you so much for the wonderful gifts!!!!! Thank you so much for the wonderful gifts!!!!! Thank you so much for the wonderful gifts!!!!! Thank you so much for the wonderful gifts!!! Thank you so much for the wonderful gifts!!! Thank you so much for the wonderful gifts!! Thank you so much for the wonderful gifts!!! Thank you so much for the wonderful gifts!!! Thank you so much for the wonderful gifts!! Thank you so much for the wonderful gifts!!!!! Thank you so much for the lovely gifts!!! Thank you so much for the wonderful gifts!!! Thank you so much for the wonderful gifts!!! Thank you so much for the lovely gifts!! Thank you so much for the great gifts!!
I love this gift. Great gifts. Great gifts. Very happy for the gifts!!!! Thank you for the amazing gifts!!! Thanks!!! I love this gift. Great gifts. Great gifts. Very happy for the gifts!!!! Thank you for the amazing gifts!!! Thanks!!! I love this gift. Great gifts. Great gifts. Very happy for the gifts!!!! Thank you for the amazing gifts!!! Thanks!!! I love this gift. Great gifts. Great gifts. Very happy for the gifts!!!! Thank you for the amazing gifts!!! Thanks!!!
Thank you so much! Thank you for the great gifts!!!!
THANK YOU!!!!!