Spiral Simulation Running out of GPU Memory #519
Unanswered
nathanjbeaumont asked this question in Q&A
Replies: 3 comments, 10 replies
-
No, they said that there is a net moment of 8π at the end of the sequence along the slice direction. You must rephase everything along the read and phase directions and just put an additional spoiler of 8π area at the end of the TR along the slice dimension.
If you put the dephasing gradient of 8π after the slice selection, you will obtain something like a PSIF sequence, I think.
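For reference, the gradient area that produces an 8π dephasing moment across the slice follows directly from the slice thickness. A minimal sketch of the arithmetic, assuming a 5 mm slice (the thickness is an assumption, not stated in the thread):

```python
import math

GAMMA = 42.577e6             # gyromagnetic ratio of 1H, Hz/T
slice_thickness = 5e-3       # m; assumed value, not from the thread
target_moment = 8 * math.pi  # desired intravoxel dephasing, rad

# Phase accrued at position z by a gradient of area A (in T*s/m):
#   phi(z) = 2*pi * GAMMA * A * z
# so an 8*pi phase spread across the slice thickness needs:
area = target_moment / (2 * math.pi * GAMMA * slice_thickness)  # T*s/m
print(f"spoiler area ~ {area:.3e} T*s/m")  # ~1.879e-05 T*s/m
```

Thicker slices need proportionally smaller spoiler areas, since the same phase spread is reached over a larger distance.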
-
@cncastillo I would like to use an adiabatic pulse for the inversion at the beginning, as that's what the paper does, but I don't know how to design such a pulse. I see …
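A common choice for adiabatic inversion is the hyperbolic-secant (HS1) pulse, whose envelope has a sech amplitude modulation and a tanh frequency sweep. A minimal sketch of the complex envelope in Python (the duration, β, and μ values are illustrative assumptions; the normalized amplitude would still need to be scaled to satisfy the adiabatic condition and wrapped into whatever RF object the simulator expects):

```python
import numpy as np

def hs_pulse(dur=10e-3, beta=800.0, mu=4.9, n=501):
    """Hyperbolic-secant (HS1) adiabatic inversion envelope.

    AM: sech(beta*t); FM: -mu*beta*tanh(beta*t), applied here as the
    equivalent phase modulation mu*ln(cosh(beta*t)). Amplitude is
    normalized to 1 and must be scaled for the adiabatic condition.
    """
    t = np.linspace(-dur / 2, dur / 2, n)
    am = 1.0 / np.cosh(beta * t)             # sech amplitude envelope
    phase = mu * np.log(np.cosh(beta * t))   # integral of the tanh sweep
    return t, am * np.exp(1j * phase)

t, b1 = hs_pulse()
```

The peak of |B1| sits at the pulse center (t = 0) and the envelope falls off symmetrically, which is what makes the inversion insensitive to B1 inhomogeneity once the amplitude exceeds the adiabatic threshold.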
-
Hello,
I have a spiral sequence with 48 interleaves that I'm trying to simulate with the `brain_phantom2D` phantom. To get an image without the screen-door-looking artifact, I need to set `us=2` in `brain_phantom2D()`. However, if I do this, my GPU runs out of memory even with only 48 interleaves in the sequence. I have an RTX 2070 with 16 GB of GPU memory (8 dedicated, 8 shared). Run tests.jl in the attached zip to see: tests.zip
I'm able to run it on the CPU, but the recon looks very bad for a fully sampled k-space:
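One standard way to bound GPU memory in this kind of simulation is to split the phantom's spins into blocks, simulate each block separately, and sum the complex signals; if I recall correctly, KomaMRI's `simulate` exposes something like this through an `"Nblocks"` entry in its `sim_params` dictionary (worth checking the current docs). The underlying idea, as a language-agnostic Python sketch with a hypothetical `simulate_block` callback:

```python
import numpy as np

def simulate_in_blocks(spins, nblocks, simulate_block):
    """Split the spins into nblocks chunks, simulate each chunk, and
    sum the complex signals. Because the MR signal adds linearly over
    spins, the result matches a single full simulation, while peak
    memory scales with the chunk size rather than the phantom size."""
    total = None
    for chunk in np.array_split(spins, nblocks):
        sig = simulate_block(chunk)  # hypothetical per-block simulator
        total = sig if total is None else total + sig
    return total
```

Increasing the number of blocks trades a bit of kernel-launch overhead for a proportional drop in peak GPU memory, which is usually the right trade when a fully sampled simulation doesn't fit.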