From jbnaliboff at ucdavis.edu Fri Sep 1 09:48:19 2017
From: jbnaliboff at ucdavis.edu (John Naliboff)
Date: Fri, 1 Sep 2017 09:48:19 -0700
Subject: [aspect-devel] Memory issues when increasing the number of
processors?
In-Reply-To:
References: <03104ae1-594c-2cbe-5e4d-db041a41bc28@ucdavis.edu>
<5931e3a9-4e29-f135-6f21-eefc109eab62@ucdavis.edu>
<345f4c6b-4f8b-8719-3a28-bf95d7558152@ucdavis.edu>
Message-ID: <4E8B3D16-31DC-4EDA-9130-29EA581D364B@ucdavis.edu>
Good news - switching the MPI library from openmpi to mvapich2 seems to have fixed the issue. Three different test cases that previously failed now run without issue. Thanks for the suggestions Timo!
I’ll report back if the system administrator and I are able to pin down exactly what went wrong with the openmpi library.
For anyone interested in using COMET (http://www.sdsc.edu/support/user_guides/comet.html ), I’ll add installation instructions and scaling results to the github pages in the next week or two. Feel free to send me an email if you would like to get started before then.
FYI, anyone in the University of California system can apply once for a large allocation on COMET (up to 500,000 core hours) with a short (1-2 page) proposal. Any subsequent allocation requests for COMET need to go through the regular XSEDE process, but nonetheless it is a great opportunity.
Cheers,
John
*************************************************
John Naliboff
Assistant Project Scientist, CIG
Earth & Planetary Sciences Dept., UC Davis
> On Aug 31, 2017, at 3:00 PM, Timo Heister wrote:
>
> Another thing that might be happening is some aggressive tuning that is done on the cluster and you run into MPI timeouts (code would run if they had a little more time to respond). Maybe you can ask the sysadmins? Do they have other MPI libraries installed you could try out?
> I find it unlikely that this is a bug inside Trilinos. Though not impossible, of course.
>
> On Aug 31, 2017 17:44, "John Naliboff" > wrote:
> Hello again,
>
> A quick update. I ran scaling tests with dealii step-32 (8, 9 or 10 global refinements) across 192, 384, 768 or 1536 cores.
>
> The same error as reported previously (see attached file) typically occurs with 1536 or 768 cores, although in some cases models that previously crashed are able to run without issue. There is still no clear correlation between d.o.f. per core and the point (number of cores) at which the model crashes.
>
> Is it worth reporting this issue to the Trilinos mailing list?
>
> Cheers,
> John
> *************************************************
> John Naliboff
> Assistant Project Scientist, CIG
> Earth & Planetary Sciences Dept., UC Davis
> On 08/24/2017 03:46 PM, John Naliboff wrote:
>> Hi all,
>>
>> Below are messages I accidentally only sent to Timo rather than the whole mailing list.
>>
>> Timo - I tried Trilinos 12.10.1 and this did not resolve the issue. I'm going to try and reproduce the issue with step-32 and/or a different cluster next.
>>
>> Cheers,
>> John
>> *************************************************
>> John Naliboff
>> Assistant Project Scientist, CIG
>> Earth & Planetary Sciences Dept., UC Davis
>> On 08/23/2017 02:56 PM, Timo Heister wrote:
>>> John,
>>>
>>>> /home/jboff/software/trilinos/trilinos-12.4.2/install/lib/libml.so.12(ML_Comm_Send+0x20)[0x2ba25c648cc0]
>>> So this is inside the multigrid preconditioner from Trilinos. One
>>> option might be to try a newer Trilinos release. Sorry, I know that is
>>> annoying.
>>>
>>>> /home/jboff/software/aspect/master/aspect/./aspect(_ZN6aspect18FreeSurfaceHandlerILi3EE26compute_mesh_displacementsEv+0x55c)
>>> You are using free surface computations. This is something that we
>>> haven't tested as much. Do you also get crashes without free surface
>>> computations?
>>>
>>>
>>>
>>>
>>> On Wed, Aug 23, 2017 at 5:40 PM, John Naliboff wrote:
>>>> Hi Timo,
>>>>
>>>> Thanks for the feedback. I tried a few more tests with a different model
>>>> (lithospheric deformation) and encountered similar issues. The attached
>>>> error output provides a bit more info this time. The model was run across
>>>> 768 cores.
>>>>
>>>> From the output it looks like there is an issue in Epetra?
>>>>
>>>> Perhaps unrelated, but I am using Lapack 3.6.0 and had to change some of the
>>>> symbol labels in packages/epetra/src/Epetra_LAPACK_wrappers.h (e.g.
>>>> following https://www.dealii.org/8.5.0/external-libs/trilinos.html ).
>>>>
>>>> Cheers,
>>>> John
>>>>
>>>> *************************************************
>>>> John Naliboff
>>>> Assistant Project Scientist, CIG
>>>> Earth & Planetary Sciences Dept., UC Davis
>>>>
>>>> On 08/23/2017 09:41 AM, Timo Heister wrote:
>>>>
>>>> John,
>>>>
>>>> it would be neat to have a longer callstack to see where this error is
>>>> happening.
>>>>
>>>> Some ideas:
>>>> 1. This could be a hardware issue (one of the nodes can not
>>>> communicate, has packet loss or whatever).
>>>> 2. This could be a configuration problem ("too many retries sending
>>>> message to 0x5a90:0x000639a2, giving up" could mean some MPI timeouts
>>>> are triggered)
>>>> 3. It could be a bug in some MPI code (in Trilinos, deal.II, or
>>>> ASPECT). A longer callstack would help narrow that down.
>>>>
>>>> If you feel like experimenting, you could see if you can trigger the
>>>> same issue with deal.II step-32.
>>>>
>>>>
>>>> On Tue, Aug 22, 2017 at 4:44 PM, John Naliboff
>>>> wrote:
>>>>
>>>> Hi all,
>>>>
>>>> I'm looking for feedback on a memory error(s?) that has me somewhat
>>>> perplexed.
>>>>
>>>> The errors are occurring on the XSEDE cluster Comet:
>>>>
>>>> http://www.sdsc.edu/support/user_guides/comet.html
>>>>
>>>> The models in question are a series of scaling tests following the tests run
>>>> by Rene Gassmoeller:
>>>>
>>>> https://github.com/gassmoeller/aspect-performance-statistics
>>>>
>>>> When using up to 192 processors and global refinement levels of 2, 3, 4 or 5
>>>> the scaling results are "roughly" (not too far off) what I would expect
>>>> based on Rene's results.
>>>>
>>>> However, once I get up to 384 cores the models almost always crash with a
>>>> segmentation fault error. Here is part of the error message from a model run
>>>> on 384 cores with 4 global refinement levels.
>>>> Number of active cells: 393,216 (on 5 levels)
>>>> Number of degrees of freedom: 16,380,620
>>>> (9,585,030+405,570+3,195,010+3,195,010)
>>>>
>>>> *** Timestep 0: t=0 years
>>>> Solving temperature system... 0 iterations.
>>>> Solving C_1 system ... 0 iterations.
>>>> Rebuilding Stokes preconditioner...[comet-06-22:09703] *** Process
>>>> received signal ***
>>>> [comet-06-22:09703] Signal: Segmentation fault (11)
>>>>
>>>> The full model output is located in the attached file.
>>>>
>>>> Thoughts on what might be causing a memory issue when increasing the number
>>>> of cores?
>>>>
>>>> The perplexing part is that the error does not seem to be tied to the
>>>> number of d.o.f. per processor. Also somewhat perplexing: one model that
>>>> crashed with this error was later able to run successfully using the exact
>>>> same submission script, input file, etc. However, this happened only once,
>>>> and otherwise the errors are almost always reproducible.
>>>>
>>>> If no one has encountered this issue before, any suggestions for debugging
>>>> tricks with this number of processors? I may be able to run an interactive
>>>> session in debug mode with this number of processors, but I would need to
>>>> check with the cluster administrator.
>>>>
>>>> Thanks!
>>>> John
>>>>
>>>> --
>>>>
>>>> *************************************************
>>>> John Naliboff
>>>> Assistant Project Scientist, CIG
>>>> Earth & Planetary Sciences Dept., UC Davis
>>>>
>>>>
>>>>
>>>> _______________________________________________
>>>> Aspect-devel mailing list
>>>> Aspect-devel at geodynamics.org
>>>> http://lists.geodynamics.org/cgi-bin/mailman/listinfo/aspect-devel
>>>>
>>>>
>>>>
>>>
>>
>
>
> _______________________________________________
> Aspect-devel mailing list
> Aspect-devel at geodynamics.org
> http://lists.geodynamics.org/cgi-bin/mailman/listinfo/aspect-devel
> _______________________________________________
> Aspect-devel mailing list
> Aspect-devel at geodynamics.org
> http://lists.geodynamics.org/cgi-bin/mailman/listinfo/aspect-devel
From ljhwang at ucdavis.edu Fri Sep 1 10:03:48 2017
From: ljhwang at ucdavis.edu (Lorraine Hwang)
Date: Fri, 1 Sep 2017 10:03:48 -0700
Subject: [aspect-devel] Memory issues when increasing the number of
processors?
In-Reply-To: <4E8B3D16-31DC-4EDA-9130-29EA581D364B@ucdavis.edu>
References: <03104ae1-594c-2cbe-5e4d-db041a41bc28@ucdavis.edu>
<5931e3a9-4e29-f135-6f21-eefc109eab62@ucdavis.edu>
<345f4c6b-4f8b-8719-3a28-bf95d7558152@ucdavis.edu>
<4E8B3D16-31DC-4EDA-9130-29EA581D364B@ucdavis.edu>
Message-ID:
John,
We should add scaling tests to our web pages as well:
https://geodynamics.org/cig/dev/xsede-resources/code-scaling/
Best,
-Lorraine
*****************************
Lorraine Hwang, Ph.D.
Associate Director, CIG
530.752.3656
> On Sep 1, 2017, at 9:48 AM, John Naliboff wrote:
>
> Good news - switching the MPI library from openmpi to mvapich2 seems to have fixed the issue. Three different test cases that previously failed now run without issue. Thanks for the suggestions Timo!
>
> I’ll report back if the system administrator and I are able to pin down exactly what went wrong with the openmpi library.
>
> For anyone interested in using COMET (http://www.sdsc.edu/support/user_guides/comet.html ), I’ll add installation instructions and scaling results to the github pages in the next week or two. Feel free to send me an email if you would like to get started before then.
>
> FYI, anyone in the University of California system can apply once for a large allocation on COMET (up to 500,000 core hours) with a short (1-2 page) proposal. Any subsequent allocation requests for COMET need to go through the regular XSEDE process, but nonetheless it is a great opportunity.
>
> Cheers,
> John
>
> *************************************************
> John Naliboff
> Assistant Project Scientist, CIG
> Earth & Planetary Sciences Dept., UC Davis
>
>
>
>
>
>
>> On Aug 31, 2017, at 3:00 PM, Timo Heister > wrote:
>>
>> Another thing that might be happening is some aggressive tuning that is done on the cluster and you run into MPI timeouts (code would run if they had a little more time to respond). Maybe you can ask the sysadmins? Do they have other MPI libraries installed you could try out?
>> I find it unlikely that this is a bug inside Trilinos. Though not impossible, of course.
>>
>> On Aug 31, 2017 17:44, "John Naliboff" > wrote:
>> Hello again,
>>
>> A quick update. I ran scaling tests with dealii step-32 (8, 9 or 10 global refinements) across 192, 384, 768 or 1536 cores.
>>
>> The same error as reported previously (see attached file) typically occurs with 1536 or 768 cores, although in some cases models that previously crashed are able to run without issue. There is still no clear correlation between d.o.f. per core and the point (number of cores) at which the model crashes.
>>
>> Is it worth reporting this issue to the Trilinos mailing list?
>>
>> Cheers,
>> John
>> *************************************************
>> John Naliboff
>> Assistant Project Scientist, CIG
>> Earth & Planetary Sciences Dept., UC Davis
>> On 08/24/2017 03:46 PM, John Naliboff wrote:
>>> Hi all,
>>>
>>> Below are messages I accidentally only sent to Timo rather than the whole mailing list.
>>>
>>> Timo - I tried Trilinos 12.10.1 and this did not resolve the issue. I'm going to try and reproduce the issue with step-32 and/or a different cluster next.
>>>
>>> Cheers,
>>> John
>>> *************************************************
>>> John Naliboff
>>> Assistant Project Scientist, CIG
>>> Earth & Planetary Sciences Dept., UC Davis
>>> On 08/23/2017 02:56 PM, Timo Heister wrote:
>>>> John,
>>>>
>>>>> /home/jboff/software/trilinos/trilinos-12.4.2/install/lib/libml.so.12(ML_Comm_Send+0x20)[0x2ba25c648cc0]
>>>> So this is inside the multigrid preconditioner from Trilinos. One
>>>> option might be to try a newer Trilinos release. Sorry, I know that is
>>>> annoying.
>>>>
>>>>> /home/jboff/software/aspect/master/aspect/./aspect(_ZN6aspect18FreeSurfaceHandlerILi3EE26compute_mesh_displacementsEv+0x55c)
>>>> You are using free surface computations. This is something that we
>>>> haven't tested as much. Do you also get crashes without free surface
>>>> computations?
>>>>
>>>>
>>>>
>>>>
>>>> On Wed, Aug 23, 2017 at 5:40 PM, John Naliboff wrote:
>>>>> Hi Timo,
>>>>>
>>>>> Thanks for the feedback. I tried a few more tests with a different model
>>>>> (lithospheric deformation) and encountered similar issues. The attached
>>>>> error output provides a bit more info this time. The model was run across
>>>>> 768 cores.
>>>>>
>>>>> From the output it looks like there is an issue in Epetra?
>>>>>
>>>>> Perhaps unrelated, but I am using Lapack 3.6.0 and had to change some of the
>>>>> symbol labels in packages/epetra/src/Epetra_LAPACK_wrappers.h (e.g.
>>>>> following https://www.dealii.org/8.5.0/external-libs/trilinos.html ).
>>>>>
>>>>> Cheers,
>>>>> John
>>>>>
>>>>> *************************************************
>>>>> John Naliboff
>>>>> Assistant Project Scientist, CIG
>>>>> Earth & Planetary Sciences Dept., UC Davis
>>>>>
>>>>> On 08/23/2017 09:41 AM, Timo Heister wrote:
>>>>>
>>>>> John,
>>>>>
>>>>> it would be neat to have a longer callstack to see where this error is
>>>>> happening.
>>>>>
>>>>> Some ideas:
>>>>> 1. This could be a hardware issue (one of the nodes can not
>>>>> communicate, has packet loss or whatever).
>>>>> 2. This could be a configuration problem ("too many retries sending
>>>>> message to 0x5a90:0x000639a2, giving up" could mean some MPI timeouts
>>>>> are triggered)
>>>>> 3. It could be a bug in some MPI code (in Trilinos, deal.II, or
>>>>> ASPECT). A longer callstack would help narrow that down.
>>>>>
>>>>> If you feel like experimenting, you could see if you can trigger the
>>>>> same issue with deal.II step-32.
>>>>>
>>>>>
>>>>> On Tue, Aug 22, 2017 at 4:44 PM, John Naliboff
>>>>> wrote:
>>>>>
>>>>> Hi all,
>>>>>
>>>>> I'm looking for feedback on a memory error(s?) that has me somewhat
>>>>> perplexed.
>>>>>
>>>>> The errors are occurring on the XSEDE cluster Comet:
>>>>>
>>>>> http://www.sdsc.edu/support/user_guides/comet.html
>>>>>
>>>>> The models in question are a series of scaling tests following the tests run
>>>>> by Rene Gassmoeller:
>>>>>
>>>>> https://github.com/gassmoeller/aspect-performance-statistics
>>>>>
>>>>> When using up to 192 processors and global refinement levels of 2, 3, 4 or 5
>>>>> the scaling results are "roughly" (not too far off) what I would expect
>>>>> based on Rene's results.
>>>>>
>>>>> However, once I get up to 384 cores the models almost always crash with a
>>>>> segmentation fault error. Here is part of the error message from a model run
>>>>> on 384 cores with 4 global refinement levels.
>>>>> Number of active cells: 393,216 (on 5 levels)
>>>>> Number of degrees of freedom: 16,380,620
>>>>> (9,585,030+405,570+3,195,010+3,195,010)
>>>>>
>>>>> *** Timestep 0: t=0 years
>>>>> Solving temperature system... 0 iterations.
>>>>> Solving C_1 system ... 0 iterations.
>>>>> Rebuilding Stokes preconditioner...[comet-06-22:09703] *** Process
>>>>> received signal ***
>>>>> [comet-06-22:09703] Signal: Segmentation fault (11)
>>>>>
>>>>> The full model output is located in the attached file.
>>>>>
>>>>> Thoughts on what might be causing a memory issue when increasing the number
>>>>> of cores?
>>>>>
>>>>> The perplexing part is that the error does not seem to be tied to the
>>>>> number of d.o.f. per processor. Also somewhat perplexing: one model that
>>>>> crashed with this error was later able to run successfully using the exact
>>>>> same submission script, input file, etc. However, this happened only once,
>>>>> and otherwise the errors are almost always reproducible.
>>>>>
>>>>> If no one has encountered this issue before, any suggestions for debugging
>>>>> tricks with this number of processors? I may be able to run an interactive
>>>>> session in debug mode with this number of processors, but I would need to
>>>>> check with the cluster administrator.
>>>>>
>>>>> Thanks!
>>>>> John
>>>>>
>>>>> --
>>>>>
>>>>> *************************************************
>>>>> John Naliboff
>>>>> Assistant Project Scientist, CIG
>>>>> Earth & Planetary Sciences Dept., UC Davis
>>>>>
>>>>>
>>>>>
>>>>> _______________________________________________
>>>>> Aspect-devel mailing list
>>>>> Aspect-devel at geodynamics.org
>>>>> http://lists.geodynamics.org/cgi-bin/mailman/listinfo/aspect-devel
>>>>>
>>>>>
>>>>>
>>>
>>
>>
>> _______________________________________________
>> Aspect-devel mailing list
>> Aspect-devel at geodynamics.org
>> http://lists.geodynamics.org/cgi-bin/mailman/listinfo/aspect-devel
>> _______________________________________________
>> Aspect-devel mailing list
>> Aspect-devel at geodynamics.org
>> http://lists.geodynamics.org/cgi-bin/mailman/listinfo/aspect-devel
>
> _______________________________________________
> Aspect-devel mailing list
> Aspect-devel at geodynamics.org
> http://lists.geodynamics.org/cgi-bin/mailman/listinfo/aspect-devel
From jperryh2 at uoregon.edu Tue Sep 5 16:00:58 2017
From: jperryh2 at uoregon.edu (Jonathan Perry-Houts)
Date: Tue, 5 Sep 2017 16:00:58 -0700
Subject: [aspect-devel] Weird pressure with periodic BC's + direct
solver?
In-Reply-To: <4904da4a-e175-5bba-b91c-0596dd252ca6@uoregon.edu>
References: <4904da4a-e175-5bba-b91c-0596dd252ca6@uoregon.edu>
Message-ID:
Sorry for the rapid-fire email spam here. A quick follow-up: the periodic
BC doesn't affect this; it's just the direct solver. I must be missing
something obvious. Is that normal for the direct solver?
I attached a simple prm that reproduces this (comment out the Use direct
solver... option to make it work correctly).
Thanks again,
-JPH
On 09/05/2017 03:49 PM, Jonathan Perry-Houts wrote:
> Hi everyone,
>
> I just noticed that a bunch of models I've run recently have strange
> pressure fields (see attached). It only happens with a periodic boundary
> condition, and the direct solver enabled at the same time. Any other
> combination of one or the other setting doesn't do this.
>
> The velocity fields look correct (as compared to models with direct
> solver turned off), so *hopefully* this is somehow just a
> post-processing problem. Anyone have insight here?
>
> Thanks!
> -JPH
>
-------------- next part --------------
# Run with ASPECT commit: 47ea2623c5fda44f9365b808aea717b86a4ce8e6
#
set Dimension = 2
set Pressure normalization = surface
set Use direct solver for Stokes system = true
subsection Geometry model
set Model name = box
subsection Box
set X extent = 4146902.30274
set Y extent = 660000.0
set X periodic = false #true
set Y periodic = false
set X repetitions = 6
set Y repetitions = 1
end
end
subsection Model settings
set Fixed temperature boundary indicators = bottom, top
set Tangential velocity boundary indicators = top
set Zero velocity boundary indicators = bottom
set Remove nullspace = net x translation
# set Zero velocity boundary indicators = top, bottom, left, right
end
subsection Compositional fields
set Number of fields = 1
set Names of fields = lithosphere
end
subsection Material model
#set Material averaging = arithmetic average
set Material averaging = pick largest
set Model name = simple
subsection Simple model
set Density differential for compositional field 1 = 200
end
end
subsection Initial composition model
set Model name = function
subsection Function
set Variable names = x,z,t
set Function constants = pi=3.14159, y=0
set Function expression = 0.5*(1-tanh((660000.0 - z - 61500 - 4100*cos(5 * x *2*pi/4146902.30274)) / 4100))
end
end
subsection Initial temperature model
set Model name = function
subsection Function
set Function expression = 0
end
end
subsection Boundary temperature model
set Model name = initial temperature
end
subsection Boundary composition model
set Model name = initial composition
end
subsection Gravity model
set Model name = vertical
subsection Vertical
set Magnitude = 9.8
end
end
subsection Mesh refinement
set Initial global refinement = 5 # 6
set Initial adaptive refinement = 2
set Minimum refinement level = 3
set Strategy = composition, minimum refinement function
subsection Minimum refinement function
set Coordinate system = cartesian
set Variable names = x,y
set Function expression = if(y>539148.433754&y<588288.662723,8,0)
end
end
subsection Postprocess
set List of postprocessors = visualization, point values
subsection Visualization
set List of output variables = compositional vector, material properties
set Time between graphical output = 2e6
set Output format = vtu
# set Interpolate output = true
subsection Material properties
set List of material properties = density, viscosity
end
end
end
subsection Termination criteria
set Checkpoint on termination = false
set Termination criteria = end step
set End step = 0
end
From jperryh2 at uoregon.edu Tue Sep 5 15:49:19 2017
From: jperryh2 at uoregon.edu (Jonathan Perry-Houts)
Date: Tue, 5 Sep 2017 15:49:19 -0700
Subject: [aspect-devel] Weird pressure with periodic BC's + direct solver?
Message-ID: <4904da4a-e175-5bba-b91c-0596dd252ca6@uoregon.edu>
Hi everyone,
I just noticed that a bunch of models I've run recently have strange
pressure fields (see attached). It only happens with a periodic boundary
condition, and the direct solver enabled at the same time. Any other
combination of one or the other setting doesn't do this.
The velocity fields look correct (as compared to models with direct
solver turned off), so *hopefully* this is somehow just a
post-processing problem. Anyone have insight here?
Thanks!
-JPH
--
Jonathan Perry-Houts
Ph.D. Candidate
Department of Earth Sciences
1272 University of Oregon
Eugene, OR 97403-1272
-------------- next part --------------
A non-text attachment was scrubbed...
Name: Screenshot from 2017-09-05 15-39-31.png
Type: image/png
Size: 21463 bytes
Desc: not available
URL:
From bangerth at colostate.edu Tue Sep 5 18:10:35 2017
From: bangerth at colostate.edu (Wolfgang Bangerth)
Date: Tue, 5 Sep 2017 19:10:35 -0600
Subject: [aspect-devel] Weird pressure with periodic BC's + direct
solver?
In-Reply-To: <4904da4a-e175-5bba-b91c-0596dd252ca6@uoregon.edu>
References: <4904da4a-e175-5bba-b91c-0596dd252ca6@uoregon.edu>
Message-ID: <969b66a1-3113-a132-1bcb-4e94bcb36996@colostate.edu>
Jonathan,
> I just noticed that a bunch of models I've run recently have strange
> pressure fields (see attached). It only happens with a periodic boundary
> condition, and the direct solver enabled at the same time. Any other
> combination of one or the other setting doesn't do this.
>
> The velocity fields look correct (as compared to models with direct
> solver turned off), so *hopefully* this is somehow just a
> post-processing problem. Anyone have insight here?
It looks like the problems are all located along specific lines. Is this where
processor boundaries lie? Or hanging nodes? Can you produce the same picture
with the mesh overlaid?
I'm wondering whether we properly call ConstraintMatrix::distribute() on the
solution vector after calling the direct solver. This would explain artifacts
at hanging nodes.
Best
W.
--
------------------------------------------------------------------------
Wolfgang Bangerth email: bangerth at colostate.edu
www: http://www.math.colostate.edu/~bangerth/
From hlokavarapu at ucdavis.edu Tue Sep 5 16:54:44 2017
From: hlokavarapu at ucdavis.edu (Harsha Lokavarapu)
Date: Tue, 5 Sep 2017 16:54:44 -0700
Subject: [aspect-devel] Advice for optimizing particle advection for 2D
Spherical Shell Models
Message-ID:
Hi,
As part of a subgroup in Davis, we have been actively studying active
particles, and we noticed that under certain specific situations the
scaling of our models degraded. This occurs only for 2D spherical annulus
domains. The culprit was the particle advection step. Further investigation
showed that the bottleneck within the advection step lay in the update of
the reference location. As we increase the number of particles per cell for
a fixed grid refinement and a fixed number of MPI processes, we noticed
linear growth in the wall clock time of transform_from_real_to_unit_cell().
Sometimes 60-70% of the wall clock time was spent in this portion of the
code. After several discussions, we settled on a possible solution.
However, attempts to implement the solution have proven more difficult than
expected.
To reproduce the bottleneck, I have attached a modified annulus.prm
benchmark as the test case. This will need to be linked to the compiled
additional library of aspect/benchmarks/annulus. I have added timers to the
local particle advection function to measure the different bits and pieces.
Feel free to compile the version of the code branch at
https://github.com/hlokavarapu/aspect/tree/time_particle_advection
Assuming that the working directory is aspect/benchmarks/annulus , for
bash users, run
$ for i in 2 4 8; do sed -i "s/set Number of particles per cell per direction.*/set Number of particles per cell per direction = $i/" annulus.prm; mpirun -np 1 aspect annulus.prm >> advect.log; done
The results will be in the file advect.log.
Anyway, as a starting point, we decided to pick the simplest case, namely
the Euler integration scheme, to test our idea. Currently the physical
location of the particle is updated using an Euler step. Similarly, we would
like to introduce a second Euler step to update the reference location of
the particle. However, this requires the velocity solution at the support
points in the reference cell. If we are using a Q2 velocity element, we can
get away with 9 transformations.
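To make the proposed step concrete, here is a minimal sketch in plain Python (our own illustration for this thread, not existing ASPECT or deal.II code; holding the 2x2 Jacobian J fixed over the step is a simplification that is exact only for affine cells):

```python
# Sketch of a second Euler step taken directly on reference coordinates.
# J is the 2x2 Jacobian dx/dxi of the cell mapping at the particle,
# given as ((a, b), (c, d)); v is the real-space velocity.

def pull_back_velocity(J, v):
    """Pull a real-space velocity back to the reference cell: v_ref = J^{-1} v."""
    (a, b), (c, d) = J
    det = a * d - b * c
    vx, vy = v
    return ((d * vx - b * vy) / det,
            (-c * vx + a * vy) / det)

def euler_step_reference(xi_eta, J, v, dt):
    """One forward Euler step for a particle's reference-cell location."""
    v_xi, v_eta = pull_back_velocity(J, v)
    return (xi_eta[0] + dt * v_xi, xi_eta[1] + dt * v_eta)
```

The appeal is that the reference location then stays current without a call to transform_from_real_to_unit_cell() every step; the cost is one small inverse-Jacobian application per particle.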
After burrowing through the deal.II documentation, I have found implemented
functionality for transforming vectors from unit to real, but not the other
way around. Is there a black-box solution within deal.II that converts
vectors from real to unit?
We also looked through the implementation of transform_from_real_to_unit_cell(),
which uses a Newton iteration to transform a point in physical space to a
point in unit space. Will a similar technique be necessary to transform a
vector from real to unit?
Also, are there any locations within Aspect that involves working with the
velocity in the reference cell?
For a box domain, transform_from_real_to_unit_cell() is orders of magnitude
faster than in the spherical shell domain. Is this because we have a better
initial guess for the Newton iteration?
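To make the initial-guess question concrete, here is a small self-contained sketch of the kind of Newton iteration involved (plain Python written for this thread; it mimics the idea of transform_from_real_to_unit_cell() but is not deal.II's implementation, and the vertex ordering, starting guess, and tolerance are assumptions):

```python
# Newton iteration inverting a bilinear Q1 map on a quadrilateral.
# Reference-cell corners are assumed ordered (0,0), (1,0), (0,1), (1,1).

def bilinear_map(verts, xi, eta):
    """Map reference coordinates (xi, eta) in [0,1]^2 to physical space."""
    (x0, y0), (x1, y1), (x2, y2), (x3, y3) = verts
    n0 = (1 - xi) * (1 - eta)
    n1 = xi * (1 - eta)
    n2 = (1 - xi) * eta
    n3 = xi * eta
    return (n0 * x0 + n1 * x1 + n2 * x2 + n3 * x3,
            n0 * y0 + n1 * y1 + n2 * y2 + n3 * y3)

def jacobian(verts, xi, eta):
    """Entries (a, b, c, d) of J = [[dx/dxi, dx/deta], [dy/dxi, dy/deta]]."""
    (x0, y0), (x1, y1), (x2, y2), (x3, y3) = verts
    a = (1 - eta) * (x1 - x0) + eta * (x3 - x2)   # dx/dxi
    b = (1 - xi) * (x2 - x0) + xi * (x3 - x1)     # dx/deta
    c = (1 - eta) * (y1 - y0) + eta * (y3 - y2)   # dy/dxi
    d = (1 - xi) * (y2 - y0) + xi * (y3 - y1)     # dy/deta
    return a, b, c, d

def real_to_unit(verts, px, py, tol=1e-12, max_iter=20):
    """Invert the map by Newton iteration; returns (xi, eta, iterations)."""
    xi, eta = 0.5, 0.5                      # initial guess: cell center
    for it in range(max_iter):
        fx, fy = bilinear_map(verts, xi, eta)
        rx, ry = fx - px, fy - py           # residual in physical space
        if rx * rx + ry * ry < tol * tol:
            return xi, eta, it
        a, b, c, d = jacobian(verts, xi, eta)
        det = a * d - b * c
        xi -= (d * rx - b * ry) / det       # Newton update: J^{-1} * residual
        eta -= (-c * rx + a * ry) / det
    return xi, eta, max_iter
```

For a parallelogram (affine) cell the bilinear term vanishes and the very first Newton step is exact, whereas a skewed or curved cell needs several iterations and benefits from a good starting guess; that would be consistent with the large difference between the box and spherical shell domains.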
We (I) have been stumped on this for close to a week now. Any leads are
highly appreciated.
Thank You,
Harsha
-------------- next part --------------
A non-text attachment was scrubbed...
Name: annulus.prm
Type: application/octet-stream
Size: 3933 bytes
Desc: not available
URL:
From lev.karatun at gmail.com Tue Sep 5 22:32:58 2017
From: lev.karatun at gmail.com (Lev Karatun)
Date: Wed, 6 Sep 2017 01:32:58 -0400
Subject: [aspect-devel] error: ‘DEAL_II_FALLTHROUGH’ was not declared in this scope
Message-ID:
Hi,
I'm getting an error trying to update Aspect:
[ 0%] Building CXX object
> CMakeFiles/aspect.dir/source/simulator/free_surface.cc.o
> In file included from
> /home/lev/aspect/dealii_debug/include/deal.II/base/timer.h:23:0,
> from
> /home/lev/aspect/aspect_test/include/aspect/simulator.h:25,
> from
> /home/lev/aspect/aspect_test/source/simulator/free_surface.cc:22:
> /home/lev/aspect/dealii_debug/include/deal.II/base/utilities.h: In
> function ‘Iterator dealii::Utilities::lower_bound(Iterator, Iterator, const
> T&, Comp)’:
> /home/lev/aspect/dealii_debug/include/deal.II/base/utilities.h:670:17:
> error: ‘DEAL_II_FALLTHROUGH’ was not declared in this scope
> DEAL_II_FALLTHROUGH;
> ^
> In file included from
> /home/lev/aspect/dealii_debug/include/deal.II/base/tensor.h:22:0,
> from
> /home/lev/aspect/dealii_debug/include/deal.II/base/point.h:22,
> from
> /home/lev/aspect/dealii_debug/include/deal.II/base/patterns.h:23,
> from
> /home/lev/aspect/dealii_debug/include/deal.II/base/parameter_handler.h:23,
> from
> /home/lev/aspect/aspect_test/include/aspect/simulator.h:26,
> from
> /home/lev/aspect/aspect_test/source/simulator/free_surface.cc:22:
> /home/lev/aspect/dealii_debug/include/deal.II/base/table_indices.h: In
> constructor ‘dealii::TableIndices::TableIndices(unsigned int, unsigned
> int, unsigned int, unsigned int, unsigned int, unsigned int, unsigned int,
> unsigned int, unsigned int)’:
> /home/lev/aspect/dealii_debug/include/deal.II/base/table_indices.h:279:7:
> error: ‘DEAL_II_FALLTHROUGH’ was not declared in this scope
> DEAL_II_FALLTHROUGH;
> ^
> /home/lev/aspect/dealii_debug/include/deal.II/base/table_indices.h:314:7:
> error: ‘DEAL_II_FALLTHROUGH’ was not declared in this scope
> DEAL_II_FALLTHROUGH;
> ^
> make[2]: *** [CMakeFiles/aspect.dir/source/simulator/free_surface.cc.o]
> Error 1
> make[1]: *** [CMakeFiles/aspect.dir/all] Error 2
> make: *** [all] Error 2
Not sure if it's relevant, but I got the following error when trying to run
"make install" for deal.II (while updating it to the latest version), fixed
by running it as root (not sure if that's a good enough workaround).
Install the project...
> -- Install configuration: "DebugRelease"
> -- Up-to-date: /usr/local/share/deal.II/scripts/normalize.pl
> CMake Error at cmake/scripts/cmake_install.cmake:36 (FILE):
> file INSTALL cannot set permissions on
> "/usr/local/share/deal.II/scripts/normalize.pl"
> Call Stack (most recent call first):
> cmake_install.cmake:37 (INCLUDE)
> make: *** [install] Error 1
Any help will be appreciated.
Best regards,
Lev Karatun.
-------------- next part --------------
An HTML attachment was scrubbed...
URL:
From heister at clemson.edu Wed Sep 6 04:57:36 2017
From: heister at clemson.edu (Timo Heister)
Date: Wed, 6 Sep 2017 07:57:36 -0400
Subject: [aspect-devel] error: ‘DEAL_II_FALLTHROUGH’ was not declared in this scope
In-Reply-To:
References:
Message-ID:
Lev,
> I'm getting an error trying to update Aspect:
>
>> /home/lev/aspect/dealii_debug/include/deal.II/base/utilities.h:670:17:
>> error: ‘DEAL_II_FALLTHROUGH’ was not declared in this scope
>> DEAL_II_FALLTHROUGH;
Did you update deal.II? If yes, did you reconfigure (cmake), compile,
and install deal.II after updating?
If not, can you post your ASPECT and deal.II detailed.log files?
> Not sure if it's relevant, but I got the following error when trying to run
> "make install" for deal.II (while updating it to the latest version), fixed
> by running it as root (not sure if it's a good enough workaround).
No, that is typically not the right thing to do. You did not set the
install path (-D CMAKE_INSTALL_PREFIX=) for deal.II to something you
have write access to. What is the deal.II path that you point ASPECT
to?
--
Timo Heister
http://www.math.clemson.edu/~heister/
From lev.karatun at gmail.com Wed Sep 6 07:28:04 2017
From: lev.karatun at gmail.com (Lev Karatun)
Date: Wed, 6 Sep 2017 10:28:04 -0400
Subject: [aspect-devel] error: ‘DEAL_II_FALLTHROUGH’ was not declared in this scope
In-Reply-To:
References:
Message-ID:
Hi Timo,
>>Did you update deal.II? If yes, did you reconfigure (cmake), compile, and
install deal.II after updating?
Yes, I did all of it.
>>What is the deal.II path that you point ASPECT to?
It's set to -DCMAKE_INSTALL_PREFIX:PATH=/home/lev/aspect/dealii_debug/
Pretty sure I do have write access there since it's within my home
directory.
Best regards,
Lev Karatun.
2017-09-06 7:57 GMT-04:00 Timo Heister :
> Lev,
>
> > I'm getting an error trying to update Aspect:
> >
> >> /home/lev/aspect/dealii_debug/include/deal.II/base/utilities.h:670:17:
> >> error: ‘DEAL_II_FALLTHROUGH’ was not declared in this scope
> >> DEAL_II_FALLTHROUGH;
>
> Did you update deal.II? If yes, did you reconfigure (cmake), compile,
> and install deal.II after updating?
>
> If not, can you post your ASPECT and deal.II detailed.log files?
>
> > Not sure if it's relevant, but I got the following error when trying to
> run
> > "make install" for dealII (while updating it to the latest version),
> fixed
> > by running it as root (not sure if it's a good enough workaround).
>
> No, that is typically not the right thing to do. You did not set the
> install path (-D CMAKE_INSTALL_PREFIX=) for deal.II to something you
> have write access to. What is the deal.II path that you point ASPECT
> to?
>
> --
> Timo Heister
> http://www.math.clemson.edu/~heister/
> _______________________________________________
> Aspect-devel mailing list
> Aspect-devel at geodynamics.org
> http://lists.geodynamics.org/cgi-bin/mailman/listinfo/aspect-devel
From heister at clemson.edu Wed Sep 6 09:26:52 2017
From: heister at clemson.edu (Timo Heister)
Date: Wed, 6 Sep 2017 12:26:52 -0400
Subject: [aspect-devel] error: ‘DEAL_II_FALLTHROUGH’ was not declared in this scope
In-Reply-To:
References:
Message-ID:
>>>What is the deal.II path that you point ASPECT to?
> It's set to -DCMAKE_INSTALL_PREFIX:PATH=/home/lev/aspect/dealii_debug/
> Pretty sure I do have write access there since it's within my home
> directory.
that is not true, because it is trying to install to /usr/local:
> file INSTALL cannot set permissions on
> "/usr/local/share/deal.II/scripts/normalize.pl"
Take a look at the detailed.log in the deal.II build directory. It
should spell out the CMAKE_INSTALL_PREFIX at the top of the file.
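To make that concrete, a typical rootless setup looks roughly like the following. All paths here are hypothetical placeholders for your own source, build, and install directories; `DEAL_II_DIR` is the CMake variable ASPECT uses to locate deal.II:

```shell
# Configure, build, and install deal.II into a prefix you own (no root needed).
# /path/to/dealii-source and the build paths below are placeholders -- adjust them.
mkdir -p ~/aspect/dealii_debug
cd /path/to/dealii-build
cmake -DCMAKE_INSTALL_PREFIX=$HOME/aspect/dealii_debug /path/to/dealii-source
make -j4 install

# Point ASPECT at the same prefix when configuring it.
cd /path/to/aspect-build
cmake -DDEAL_II_DIR=$HOME/aspect/dealii_debug /path/to/aspect-source
```

If `make install` ever asks for root permissions, the CMAKE_INSTALL_PREFIX recorded in detailed.log is not the one you intended.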
--
Timo Heister
http://www.math.clemson.edu/~heister/
From rene.gassmoeller at mailbox.org Wed Sep 6 09:31:45 2017
From: rene.gassmoeller at mailbox.org (Rene Gassmoeller)
Date: Wed, 6 Sep 2017 18:31:45 +0200
Subject: [aspect-devel] Advice for optimizing particle advection for 2D
Spherical Shell Models
In-Reply-To:
References:
Message-ID:
Hi Harsha,
I would dispute that 'scaling' is the right word for this problem. Since
your solution time scales linearly with the number of particles, this is
optimal scaling (each particle requires the same effort). But you are
right that computing the velocity of each particle (and in particular
computing its location in the reference cell for higher-order mappings /
curved geometries) is a major part of the whole particle cost.
Unfortunately, I do not think there is an easy way around that: you
either need to compute the particle's location in the reference cell, or
you need to project the real velocity into the coordinate system of the
reference cell (which is at least as expensive, and introduces another
source of error). You are right that transforming the velocity at the
reference points might save you some time, but I am not yet convinced
that this would also save time for higher-order integration schemes like
RK2, where you would need to project both the old and the new
velocities. Additionally, you would still need to recompute the
reference location whenever a particle moves to another cell.
The other options to speed up this process are equally unattractive: you
could use a first-order mapping for the cells to make the inversion
faster (and vastly decrease the accuracy of the Stokes solution near the
boundaries); you could use Q1P0 elements, which at least make the
evaluation of the velocity faster once you have the position in the
reference cell (thereby picking up a lot of velocity divergence at the
particle positions); or you could rewrite the organization and the whole
particle algorithm in a mesh-independent way, which is a major effort.
While writing that algorithm I spent quite some time thinking about the
same problems, but have not come up with a better solution so far. If
you find one, feel free to let me know -- I would appreciate a speed-up
in that place a lot ;-)
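To make the cost of the real-to-unit inversion concrete, here is a minimal, self-contained sketch (plain C++, not deal.II code) of the Newton iteration that a transform-real-to-unit-cell style function has to perform for a bilinearly mapped quadrilateral:

```cpp
#include <array>
#include <cmath>

// Toy illustration: invert the bilinear map from the unit square [0,1]^2 to
// an arbitrary convex quadrilateral with Newton's method -- the same idea
// deal.II uses for curved/higher-order cells, where no closed form exists.
struct Point { double x, y; };

// Physical location of reference point (xi, eta) for a quad with vertices
// v[0..3] in deal.II-style ordering: v0=(0,0), v1=(1,0), v2=(0,1), v3=(1,1).
Point map_unit_to_real(const std::array<Point,4> &v, double xi, double eta)
{
  const double n0 = (1-xi)*(1-eta), n1 = xi*(1-eta), n2 = (1-xi)*eta, n3 = xi*eta;
  return { n0*v[0].x + n1*v[1].x + n2*v[2].x + n3*v[3].x,
           n0*v[0].y + n1*v[1].y + n2*v[2].y + n3*v[3].y };
}

// Inverse map via Newton iteration, starting from the cell center.
// The returned Point holds (xi, eta).
Point map_real_to_unit(const std::array<Point,4> &v, Point p)
{
  double xi = 0.5, eta = 0.5;
  for (int it = 0; it < 50; ++it)
    {
      const Point f = map_unit_to_real(v, xi, eta);
      const double rx = f.x - p.x, ry = f.y - p.y;
      if (std::abs(rx) + std::abs(ry) < 1e-14)
        break;
      // Jacobian of the bilinear map at (xi, eta)
      const double jxx = (1-eta)*(v[1].x-v[0].x) + eta*(v[3].x-v[2].x);
      const double jxy = (1-xi) *(v[2].x-v[0].x) + xi *(v[3].x-v[1].x);
      const double jyx = (1-eta)*(v[1].y-v[0].y) + eta*(v[3].y-v[2].y);
      const double jyy = (1-xi) *(v[2].y-v[0].y) + xi *(v[3].y-v[1].y);
      const double det = jxx*jyy - jxy*jyx;
      // Newton update: (xi, eta) -= J^{-1} * residual
      xi  -= ( jyy*rx - jxy*ry) / det;
      eta -= (-jyx*rx + jxx*ry) / det;
    }
  return { xi, eta };
}
```

Doing this per particle, per evaluation point, is exactly the cost being discussed; unit-to-real is a single polynomial evaluation, real-to-unit is an iteration of them.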
I annotated some of your questions below to the best of my knowledge.
Best,
Rene
On 09/06/2017 01:54 AM, Harsha Lokavarapu wrote:
>
> Anyways, as a starting point, we decided to pick the simplest case,
> namely the Euler integration scheme to test our idea. Currently the
> physical location of the particle is updated using Euler. Similarly,
> we would like to introduce a second Euler step to update the reference
> location of the particle. However, this requires the velocity solution
> at the support points in the reference cell. If we are using a Q2
> velocity element, we can get away with 9 transformations.
True, but what about higher order integration schemes?
>
> After burrowing through the deal.II documentation, I have found
> functionality implemented for transforming vectors from unit to real,
> but not the other way around. Is there a black-box solution that
> already exists within deal.II which converts vectors from real to unit?
>
Sorry, I do not know one, but there still might be one.
> We also looked through the implementation of
> transform_from_real_to_unit_cell() which uses a newton iteration to
> transform a point in physical space to a point in the unit space. Will
> a similar technique be necessary to transform a vector from real to unit?
>
I would guess so.
> Also, are there any locations within Aspect that involves working with
> the velocity in the reference cell?
>
No, so far there are no locations I am aware of.
> For a box domain, the transform_from_real_to_unit is orders of magnitude
> faster than in the spherical shell domain. Is this because we have a better
> initial guess for the Newton iteration?
>
Probably, but also because the nonlinearity is much weaker. As far as I
remember, for an undeformed box we do not even need a Newton iteration,
because an analytical expression is available for MappingEulerian (after
all, it is just a scaling and translation of the reference cell, no
deformation).
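A sketch of why the undeformed box is so much cheaper (toy code, not the actual deal.II implementation): the inverse map is a closed-form affine rescaling, with no iteration at all.

```cpp
// For an axis-aligned, undeformed box the real-to-unit map has a closed
// form: subtract the lower corner and divide by the extent in each
// direction.  No Newton iteration is needed.
struct BoxPoint { double x, y; };

BoxPoint box_real_to_unit(BoxPoint lower, BoxPoint upper, BoxPoint p)
{
  return { (p.x - lower.x) / (upper.x - lower.x),
           (p.y - lower.y) / (upper.y - lower.y) };
}
```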
> We (I) have been stumped on this for close to a week now. Any leads
> are highly appreciated.
>
I spent a significant amount of time on the same problems two years
ago; I would also appreciate any leads :-)
> Thank You,
> Harsha
>
>
> _______________________________________________
> Aspect-devel mailing list
> Aspect-devel at geodynamics.org
> http://lists.geodynamics.org/cgi-bin/mailman/listinfo/aspect-devel
--
Rene Gassmoeller
http://www.math.colostate.edu/~gassmoel/
From heister at clemson.edu Wed Sep 6 09:49:31 2017
From: heister at clemson.edu (Timo Heister)
Date: Wed, 6 Sep 2017 12:49:31 -0400
Subject: [aspect-devel] Advice for optimizing particle advection for 2D
Spherical Shell Models
In-Reply-To:
References:
Message-ID:
>> After burrowing through the deal.II documentation, I have found functionality
>> implemented for transforming vectors from unit to real but not the other
>> way around. Is there a black-box solution that already exists within deal.II
>> which converts vectors from real to unit?
>
> Sorry, I do not know one, but there still might be one.
Check out the transform_unit_to_real_cell, etc. functions inside the
mapping: https://www.dealii.org/8.5.0/doxygen/deal.II/classMapping.html#ae5df63553eb8ed170c3b90524853dd48
>> We also looked through the implementation of
>> transform_from_real_to_unit_cell() which uses a newton iteration to
>> transform a point in physical space to a point in the unit space. Will a
>> similar technique be necessary to transform a vector from real to unit?
>
> I would guess so.
It of course depends on the mapping you use, but the direction
unit->real is typically much cheaper as it is an evaluation of a
polynomial inside the mapping. With a MappingManifold, both directions
should be equally cheap (we are basically just mapping from spherical
to cartesian coordinates or the other way around). Maybe it makes
sense to use MappingManifold if you have many particles?
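To illustrate why both directions could be cheap with a manifold-based mapping, here is the 2D polar analogue (a toy sketch, not the MappingManifold implementation): each direction is just a handful of transcendental function calls.

```cpp
#include <cmath>

// 2D polar <-> Cartesian transforms: both directions are closed-form,
// which is the reason a manifold-based mapping on a spherical shell could
// make real-to-unit as cheap as unit-to-real.
struct XY   { double x, y; };
struct RPhi { double r, phi; };

XY polar_to_cartesian(RPhi p)
{
  return { p.r * std::cos(p.phi), p.r * std::sin(p.phi) };
}

RPhi cartesian_to_polar(XY p)
{
  return { std::hypot(p.x, p.y), std::atan2(p.y, p.x) };
}
```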
Best,
Timo
--
Timo Heister
http://www.math.clemson.edu/~heister/
From heister at clemson.edu Wed Sep 6 10:00:31 2017
From: heister at clemson.edu (Timo Heister)
Date: Wed, 6 Sep 2017 13:00:31 -0400
Subject: [aspect-devel] Advice for optimizing particle advection for 2D
Spherical Shell Models
In-Reply-To:
References:
Message-ID:
> polynomial inside the mapping. With a MappingManifold, both directions
> should be equally cheap (we are basically just mapping from spherical
> to cartesian coordinates or the other way around). Maybe it makes
> sense to use MappingManifold if you have many particles?
It turns out that the function is not implemented in MappingManifold,
but there is an issue about it here:
https://github.com/dealii/dealii/issues/3673
This should be easy enough to implement but it requires some testing.
--
Timo Heister
http://www.math.clemson.edu/~heister/
From lev.karatun at gmail.com Wed Sep 6 11:32:24 2017
From: lev.karatun at gmail.com (Lev Karatun)
Date: Wed, 6 Sep 2017 14:32:24 -0400
Subject: [aspect-devel] error: ‘DEAL_II_FALLTHROUGH’ was not declared in this scope
In-Reply-To:
References:
Message-ID:
Hi Timo,
thank you, I recompiled deal.II once again and ASPECT installed
successfully. Not sure how the /usr/local path got into the compilation
parameters -- I never had it there and haven't changed the CMake
invocation for a long time.
Anyway, thank you for your help!
Best regards,
Lev Karatun.
2017-09-06 12:26 GMT-04:00 Timo Heister :
> >>>What is the deal.II path that you point ASPECT to?
> > It's set to -DCMAKE_INSTALL_PREFIX:PATH=/home/lev/aspect/dealii_debug/
> > Pretty sure I do have write access there since it's within my home
> > directory.
>
> that is not true, because it is trying to install to /usr/local:
>
> > file INSTALL cannot set permissions on
> > "/usr/local/share/deal.II/scripts/normalize.pl"
>
> Take a look at the detailed.log in the deal.II build directory. It
> should spell out the CMAKE_INSTALL_PREFIX at the top of the file.
>
>
> --
> Timo Heister
> http://www.math.clemson.edu/~heister/
> _______________________________________________
> Aspect-devel mailing list
> Aspect-devel at geodynamics.org
> http://lists.geodynamics.org/cgi-bin/mailman/listinfo/aspect-devel
From heister at clemson.edu Wed Sep 6 13:03:03 2017
From: heister at clemson.edu (Timo Heister)
Date: Wed, 6 Sep 2017 16:03:03 -0400
Subject: [aspect-devel] error: ‘DEAL_II_FALLTHROUGH’ was not declared in this scope
In-Reply-To:
References:
Message-ID:
> successfully. Not sure how the /usr/local path got in the compilation parameters
/usr/local/ is the default if you don't specify anything else.
--
Timo Heister
http://www.math.clemson.edu/~heister/
From jperryh2 at uoregon.edu Wed Sep 6 22:38:03 2017
From: jperryh2 at uoregon.edu (Jonathan Perry-Houts)
Date: Wed, 6 Sep 2017 22:38:03 -0700
Subject: [aspect-devel] Weird pressure with periodic BC's + direct
solver?
In-Reply-To: <969b66a1-3113-a132-1bcb-4e94bcb36996@colostate.edu>
References: <4904da4a-e175-5bba-b91c-0596dd252ca6@uoregon.edu>
<969b66a1-3113-a132-1bcb-4e94bcb36996@colostate.edu>
Message-ID: <636b37c7-26c4-c10a-28a5-9b015c6496c1@uoregon.edu>
Thanks for the suggestion Wolfgang! It does indeed only show up at
hanging nodes (figure attached). And it apparently only happens with
both the direct solver and 'remove null space' enabled at the same time
(also attached).
When those two options are both enabled, the pressure solution is zero
everywhere but gets these little (~1e-6) jitters at hanging nodes.
Definitely sounds like a ConstraintMatrix problem.
On 09/05/2017 06:10 PM, Wolfgang Bangerth wrote:
>
> Jonathan,
>
>> I just noticed that a bunch of models I've run recently have strange
>> pressure fields (see attached). It only happens with a periodic boundary
>> condition, and the direct solver enabled at the same time. Any other
>> combination of one or the other setting doesn't do this.
>>
>> The velocity fields look correct (as compared to models with direct
>> solver turned off), so *hopefully* this is somehow just a
>> post-processing problem. Anyone have insight here?
>
> It looks like the problems are all located along specific lines. Is this
> where processor boundaries lie? Or hanging nodes? Can you produce the
> same picture with the mesh overlaid?
>
> I'm wondering whether we properly call ConstraintMatrix::distribute() on
> the solution vector after calling the direct solver. This would explain
> artifacts at hanging nodes.
>
> Best
> W.
>
--
Jonathan Perry-Houts
Ph.D. Candidate
Department of Earth Sciences
1272 University of Oregon
Eugene, OR 97403-1272
-------------- next part --------------
A non-text attachment was scrubbed...
Name: direct solver pressure problems - closeup.png
Type: image/png
Size: 37577 bytes
Desc: not available
URL:
From philip.j.heron at durham.ac.uk Mon Sep 11 03:26:31 2017
From: philip.j.heron at durham.ac.uk (HERON, PHILIP J.)
Date: Mon, 11 Sep 2017 10:26:31 +0000
Subject: [aspect-devel] Strain weakening - continental extension model
Message-ID:
Hello there,
I've been asked to give an informal chat to some structural geologists on ASPECT usability this week, so I'm just building up some simple models.
I've been playing around with the continental extension model and am looking to add strain weakening to the cookbook example. However, each time I've tried to implement it I get a strange density inversion in the model.
Does anyone have an example of how to implement simple strain weakening, say for this cookbook example? There are no examples kicking around that I can see.
Thanks in advance for the help!
Best,
Phil
From bangerth at colostate.edu Tue Sep 12 16:38:32 2017
From: bangerth at colostate.edu (Wolfgang Bangerth)
Date: Tue, 12 Sep 2017 17:38:32 -0600
Subject: [aspect-devel] Weird pressure with periodic BC's + direct
solver?
In-Reply-To: <636b37c7-26c4-c10a-28a5-9b015c6496c1@uoregon.edu>
References: <4904da4a-e175-5bba-b91c-0596dd252ca6@uoregon.edu>
<969b66a1-3113-a132-1bcb-4e94bcb36996@colostate.edu>
<636b37c7-26c4-c10a-28a5-9b015c6496c1@uoregon.edu>
Message-ID:
On 09/06/2017 11:38 PM, Jonathan Perry-Houts wrote:
> Thanks for the suggestion Wolfgang! It does indeed only show up at
> hanging nodes (figure attached). And it apparently only happens with
> both the direct solver, and 'remove null space' enabled at the same time
> (also attached).
>
> When those two options are both enabled the pressure solution is zero
> everywhere, but gets these little (~ 1e-6) jitters at hanging nodes.
> Definitely sounds like a ConstraintMatrix problem.
Can you compare the direct solver/iterative solver code paths in
solver.cc to see whether we may accidentally forget to call something
like constraints.distribute(...) on the solution vector if we use the
direct solver?
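For reference, a toy sketch (plain C++, not deal.II code) of what `ConstraintMatrix::distribute()` conceptually does after a solve: every constrained (hanging-node) value is overwritten by the weighted sum of the values it is constrained to. Skipping this step after a direct solve would leave exactly the kind of small jitters at hanging nodes described above.

```cpp
#include <cstddef>
#include <utility>
#include <vector>

// One constraint: solution[constrained_dof] = sum of weight * solution[dof]
// over the (dof, weight) pairs, e.g. a hanging node constrained to the mean
// of its two edge neighbors.
struct Constraint
{
  std::size_t constrained_dof;
  std::vector<std::pair<std::size_t, double>> entries; // (master dof, weight)
};

// Toy analogue of ConstraintMatrix::distribute(): enforce all constraints
// on the solution vector in place.
void distribute(const std::vector<Constraint> &constraints,
                std::vector<double> &solution)
{
  for (const auto &c : constraints)
    {
      double value = 0.0;
      for (const auto &e : c.entries)
        value += e.second * solution[e.first];
      solution[c.constrained_dof] = value;
    }
}
```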
Best
W.
--
------------------------------------------------------------------------
Wolfgang Bangerth email: bangerth at colostate.edu
www: http://www.math.colostate.edu/~bangerth/
From heister at clemson.edu Wed Sep 13 05:56:52 2017
From: heister at clemson.edu (Timo Heister)
Date: Wed, 13 Sep 2017 08:56:52 -0400
Subject: [aspect-devel] Weird pressure with periodic BC's + direct
solver?
In-Reply-To:
References: <4904da4a-e175-5bba-b91c-0596dd252ca6@uoregon.edu>
<969b66a1-3113-a132-1bcb-4e94bcb36996@colostate.edu>
<636b37c7-26c4-c10a-28a5-9b015c6496c1@uoregon.edu>
Message-ID:
Jonathan,
I am getting this error when running your prm:
--------------------------------------------------------
An error occurred in line <244> of file
in function
void aspect::Simulator::remove_net_linear_momentum(bool,
aspect::LinearAlgebra::BlockVector&,
aspect::LinearAlgebra::BlockVector&) [with int dim = 2,
aspect::LinearAlgebra::BlockVector =
dealii::TrilinosWrappers::MPI::BlockVector]
The violated condition was:
introspection.block_indices.velocities !=
introspection.block_indices.pressure
Additional information:
You are trying to use functionality in deal.II that is currently
not implemented. In many cases, this indicates that there simply
didn't appear much of a need for it, or that the author of the
original code did not have the time to implement a particular case. If
you hit this exception, it is therefore worth the time to look into
the code to find out whether you may be able to implement the missing
functionality. If you do, please consider providing a patch to the
deal.II development sources (see the deal.II website on how to
contribute).
Stacktrace:
-----------
#0 ./build/aspect:
aspect::Simulator<2>::remove_net_linear_momentum(bool,
dealii::TrilinosWrappers::MPI::BlockVector&,
dealii::TrilinosWrappers::MPI::BlockVector&)
#1 ./build/aspect:
aspect::Simulator<2>::remove_nullspace(dealii::TrilinosWrappers::MPI::BlockVector&,
dealii::TrilinosWrappers::MPI::BlockVector&)
#2 ./build/aspect: aspect::Simulator<2>::solve_stokes()
#3 ./build/aspect: aspect::Simulator<2>::solve_timestep()
#4 ./build/aspect: aspect::Simulator<2>::run()
#5 ./build/aspect: main
--------------------------------------------------------
On Tue, Sep 12, 2017 at 7:38 PM, Wolfgang Bangerth
wrote:
> On 09/06/2017 11:38 PM, Jonathan Perry-Houts wrote:
>>
>> Thanks for the suggestion Wolfgang! It does indeed only show up at hanging
>> nodes (figure attached). And it apparently only happens with both the direct
>> solver, and 'remove null space' enabled at the same time (also attached).
>>
>> When those two options are both enabled the pressure solution is zero
>> everywhere, but gets these little (~ 1e-6) jitters at hanging nodes.
>> Definitely sounds like a ConstraintMatrix problem.
>
>
> Can you compare the direct solver/iterative solver code paths in solver.cc
> to see whether we may accidentally forget to call something like
> constraints.distribute(...) on the solution vector if we use the direct
> solver?
>
> Best
> W.
>
> --
> ------------------------------------------------------------------------
> Wolfgang Bangerth email: bangerth at colostate.edu
> www:
> http://www.math.colostate.edu/~bangerth/
> _______________________________________________
> Aspect-devel mailing list
> Aspect-devel at geodynamics.org
> http://lists.geodynamics.org/cgi-bin/mailman/listinfo/aspect-devel
--
Timo Heister
http://www.math.clemson.edu/~heister/
From jperryh2 at uoregon.edu Wed Sep 13 12:16:54 2017
From: jperryh2 at uoregon.edu (Jonathan Perry-Houts)
Date: Wed, 13 Sep 2017 12:16:54 -0700
Subject: [aspect-devel] Weird pressure with periodic BC's + direct
solver?
In-Reply-To:
References: <4904da4a-e175-5bba-b91c-0596dd252ca6@uoregon.edu>
<969b66a1-3113-a132-1bcb-4e94bcb36996@colostate.edu>
<636b37c7-26c4-c10a-28a5-9b015c6496c1@uoregon.edu>
Message-ID: <18d1b711-8fcc-c1cc-6363-a789ada321bb@uoregon.edu>
Ahh. That explains it. I was running in Release mode so I never saw that
error.
Should the Assert at source/simulator/nullspace.cc:243 be an AssertThrow
instead?
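For readers unfamiliar with the distinction: deal.II's `Assert` is compiled out in Release mode, while `AssertThrow` is always active. A toy reproduction of that behavior (simplified; the real deal.II macros are considerably more elaborate):

```cpp
#include <stdexcept>

// Assert-style check: disappears entirely when NDEBUG is defined
// (i.e. in a Release build), which is why the nullspace check above
// never fired for a Release-mode run.
#ifdef NDEBUG
#  define TOY_ASSERT(cond) ((void)0)
#else
#  define TOY_ASSERT(cond) \
     do { if (!(cond)) throw std::logic_error(#cond); } while (0)
#endif

// AssertThrow-style check: active in every build mode.
#define TOY_ASSERT_THROW(cond) \
  do { if (!(cond)) throw std::runtime_error(#cond); } while (0)

// Helper: does the always-on check fire for this condition?
bool throws_on_failure(bool cond)
{
  try { TOY_ASSERT_THROW(cond); }
  catch (const std::runtime_error &) { return true; }
  return false;
}
```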
On 09/13/2017 05:56 AM, Timo Heister wrote:
> Jonathan,
>
> I am getting this error when running your prm:
> --------------------------------------------------------
> An error occurred in line <244> of file
> in function
> void aspect::Simulator::remove_net_linear_momentum(bool,
> aspect::LinearAlgebra::BlockVector&,
> aspect::LinearAlgebra::BlockVector&) [with int dim = 2,
> aspect::LinearAlgebra::BlockVector =
> dealii::TrilinosWrappers::MPI::BlockVector]
> The violated condition was:
> introspection.block_indices.velocities !=
> introspection.block_indices.pressure
> Additional information:
> You are trying to use functionality in deal.II that is currently
> not implemented. In many cases, this indicates that there simply
> didn't appear much of a need for it, or that the author of the
> original code did not have the time to implement a particular case. If
> you hit this exception, it is therefore worth the time to look into
> the code to find out whether you may be able to implement the missing
> functionality. If you do, please consider providing a patch to the
> deal.II development sources (see the deal.II website on how to
> contribute).
>
> Stacktrace:
> -----------
> #0 ./build/aspect:
> aspect::Simulator<2>::remove_net_linear_momentum(bool,
> dealii::TrilinosWrappers::MPI::BlockVector&,
> dealii::TrilinosWrappers::MPI::BlockVector&)
> #1 ./build/aspect:
> aspect::Simulator<2>::remove_nullspace(dealii::TrilinosWrappers::MPI::BlockVector&,
> dealii::TrilinosWrappers::MPI::BlockVector&)
> #2 ./build/aspect: aspect::Simulator<2>::solve_stokes()
> #3 ./build/aspect: aspect::Simulator<2>::solve_timestep()
> #4 ./build/aspect: aspect::Simulator<2>::run()
> #5 ./build/aspect: main
> --------------------------------------------------------
>
> On Tue, Sep 12, 2017 at 7:38 PM, Wolfgang Bangerth
> wrote:
>> On 09/06/2017 11:38 PM, Jonathan Perry-Houts wrote:
>>>
>>> Thanks for the suggestion Wolfgang! It does indeed only show up at hanging
>>> nodes (figure attached). And it apparently only happens with both the direct
>>> solver, and 'remove null space' enabled at the same time (also attached).
>>>
>>> When those two options are both enabled the pressure solution is zero
>>> everywhere, but gets these little (~ 1e-6) jitters at hanging nodes.
>>> Definitely sounds like a ConstraintMatrix problem.
>>
>>
>> Can you compare the direct solver/iterative solver code paths in solver.cc
>> to see whether we may accidentally forget to call something like
>> constraints.distribute(...) on the solution vector if we use the direct
>> solver?
>>
>> Best
>> W.
>>
>> --
>> ------------------------------------------------------------------------
>> Wolfgang Bangerth email: bangerth at colostate.edu
>> www:
>> http://www.math.colostate.edu/~bangerth/
>> _______________________________________________
>> Aspect-devel mailing list
>> Aspect-devel at geodynamics.org
>> http://lists.geodynamics.org/cgi-bin/mailman/listinfo/aspect-devel
>
>
>
--
Jonathan Perry-Houts
Ph.D. Candidate
Department of Earth Sciences
1272 University of Oregon
Eugene, OR 97403-1272
From A.C.Glerum at uu.nl Tue Sep 19 01:29:52 2017
From: A.C.Glerum at uu.nl (Glerum, A.C. (Anne))
Date: Tue, 19 Sep 2017 08:29:52 +0000
Subject: [aspect-devel] Strain weakening - continental extension model
In-Reply-To:
References:
Message-ID:
Hi Phil,
Hope this answer is not too late for your talk, but there are some tests using the strain weakening functionality of the visco plastic material model:
visco_plastic_complex.prm
visco_plastic_yield_strain_weakening.prm
visco_plastic_yield_strain_weakening_full_strain_tensor.prm
Perhaps they’re helpful? I’ve been running some extension models with strain weakening, so I’m curious to hear whether you still get your density problem.
Cheers,
Anne
On 11 Sep 2017, at 12:26, HERON, PHILIP J. > wrote:
Hello there,
I've been asked to give an informal chat to some structural geologists on ASPECT usability this week, so I'm just building up some simple models.
I've been playing around with the continental extension model and am looking to add on strain weakening to the cookbook example. However, each time I've tried to implement it I get a strange density inversion in the model.
Does anyone have an example of how to implement simple strain weakening, say for this cookbook example? There are no examples kicking around that I can see.
Thanks in advance for the help!
Best,
Phil
_______________________________________________
Aspect-devel mailing list
Aspect-devel at geodynamics.org
http://lists.geodynamics.org/cgi-bin/mailman/listinfo/aspect-devel
From philip.j.heron at durham.ac.uk Tue Sep 19 01:39:55 2017
From: philip.j.heron at durham.ac.uk (HERON, PHILIP J.)
Date: Tue, 19 Sep 2017 08:39:55 +0000
Subject: [aspect-devel] Strain weakening - continental extension model
In-Reply-To:
References: ,
Message-ID:
Hi Anne,
Thanks for the email! These are great - thank you. John Naliboff emailed me a while back - looking back over the email it looks like we had an exchange between ourselves and not the group! Thanks for the help!
It worked out well - no density inversion. I was implementing the strain weakening incorrectly.
Thanks!
Phil
________________________________
From: Aspect-devel on behalf of Glerum, A.C. (Anne)
Sent: 19 September 2017 09:29:52
To: aspect-devel at geodynamics.org
Subject: Re: [aspect-devel] Strain weakening - continental extension model
Hi Phil,
Hope this answer is not too late for your talk, but there are some tests using the strain weakening functionality of the visco plastic material model:
visco_plastic_complex.prm
visco_plastic_yield_strain_weakening.prm
visco_plastic_yield_strain_weakening_full_strain_tensor.prm
Perhaps they’re helpful? I’ve been running some extension models with strain weakening, so I’m curious to hear whether you still get your density problem.
Cheers,
Anne
On 11 Sep 2017, at 12:26, HERON, PHILIP J. > wrote:
Hello there,
I've been asked to give an informal chat to some structural geologists on ASPECT usability this week, so I'm just building up some simple models.
I've been playing around with the continental extension model and am looking to add on strain weakening to the cookbook example. However, each time I've tried to implement it I get a strange density inversion in the model.
Does anyone have an example of how to implement simple strain weakening, say for this cookbook example? There are no examples kicking around that I can see.
Thanks in advance for the help!
Best,
Phil
_______________________________________________
Aspect-devel mailing list
Aspect-devel at geodynamics.org
http://lists.geodynamics.org/cgi-bin/mailman/listinfo/aspect-devel
From A.C.Glerum at uu.nl Tue Sep 19 03:46:52 2017
From: A.C.Glerum at uu.nl (Glerum, A.C. (Anne))
Date: Tue, 19 Sep 2017 10:46:52 +0000
Subject: [aspect-devel] Strain weakening - continental extension model
In-Reply-To:
References:
Message-ID:
Hi Phil,
Glad to hear it worked out, thanks!
Cheers,
Anne
On 19 Sep 2017, at 10:39, HERON, PHILIP J. > wrote:
Hi Anne,
Thanks for the email! These are great - thank you. John Naliboff emailed me a while back - looking back over the email it looks like we had an exchange between ourselves and not the group! Thanks for the help!
It worked out well - no density inversion. I was implementing the strain weakening incorrectly.
Thanks!
Phil
________________________________
From: Aspect-devel > on behalf of Glerum, A.C. (Anne) >
Sent: 19 September 2017 09:29:52
To: aspect-devel at geodynamics.org
Subject: Re: [aspect-devel] Strain weakening - continental extension model
Hi Phil,
Hope this answer is not too late for your talk, but there are some tests using the strain weakening functionality of the visco plastic material model:
visco_plastic_complex.prm
visco_plastic_yield_strain_weakening.prm
visco_plastic_yield_strain_weakening_full_strain_tensor.prm
Perhaps they’re helpful? I’ve been running some extension models with strain weakening, so I’m curious to hear whether you still get your density problem.
Cheers,
Anne
On 11 Sep 2017, at 12:26, HERON, PHILIP J. > wrote:
Hello there,
I've been asked to give an informal chat to some structural geologists on ASPECT usability this week, so I'm just building up some simple models.
I've been playing around with the continental extension model and am looking to add on strain weakening to the cookbook example. However, each time I've tried to implement it I get a strange density inversion in the model.
Does anyone have an example of how to implement simple strain weakening, say for this cookbook example? There are no examples kicking around that I can see.
Thanks in advance for the help!
Best,
Phil
_______________________________________________
Aspect-devel mailing list
Aspect-devel at geodynamics.org
http://lists.geodynamics.org/cgi-bin/mailman/listinfo/aspect-devel
From rene.gassmoeller at mailbox.org Tue Sep 19 03:00:14 2017
From: rene.gassmoeller at mailbox.org (Rene Gassmoeller)
Date: Tue, 19 Sep 2017 10:00:14 -0000
Subject: [aspect-devel] ASPECT Newsletter #39
Message-ID: <20170919175630.1522CAC21FD@geodynamics.org>
Hello everyone!
This is ASPECT newsletter #39.
It automatically reports recently merged features and discussions about the ASPECT mantle convection code.
## Below you find a list of recently proposed or merged features:
#1928: Change nullspace removal Assert to AssertThrow (proposed by jperryhouts; merged) https://github.com/geodynamics/aspect/pull/1928
#1927: make melt and boundary traction work together (proposed by jdannberg; merged) https://github.com/geodynamics/aspect/pull/1927
#1926: fix escape sequence in string (proposed by tjhei; merged) https://github.com/geodynamics/aspect/pull/1926
#1925: Fix grammar of a function's documentation. (proposed by bangerth; merged) https://github.com/geodynamics/aspect/pull/1925
#1923: Improve wording of explanation in simple material model. (proposed by bangerth; merged) https://github.com/geodynamics/aspect/pull/1923
#1922: Fix visco plastic material model formulas and description. (proposed by bangerth; merged) https://github.com/geodynamics/aspect/pull/1922
#1921: Fix simple material model formulas. (proposed by bangerth; merged) https://github.com/geodynamics/aspect/pull/1921
#1920: Fix nondimensional material model formula. (proposed by bangerth; merged) https://github.com/geodynamics/aspect/pull/1920
#1919: Fix latent heat description and formulas. (proposed by bangerth; merged) https://github.com/geodynamics/aspect/pull/1919
#1918: Fix Morency-Doin description and formulas. (proposed by bangerth; merged) https://github.com/geodynamics/aspect/pull/1918
#1916: fix for porosity initial composition plugin (proposed by jdannberg) https://github.com/geodynamics/aspect/pull/1916
#1915: monthly maintenance (proposed by tjhei; merged) https://github.com/geodynamics/aspect/pull/1915
#1914: Print 0 if solver_control returns an invalid unsigned integer for last_step (proposed by naliboff; merged) https://github.com/geodynamics/aspect/pull/1914
#1910: make the cell object input argument in the constructor of material mo... (proposed by jdannberg; merged) https://github.com/geodynamics/aspect/pull/1910
#1906: Change from std_cxx11::array to Tensor<1,dim> for cartesi... (proposed by MFraters; merged) https://github.com/geodynamics/aspect/pull/1906
## And this is a list of recently opened or closed discussions:
#1924: update convection box cookbook (opened) https://github.com/geodynamics/aspect/issues/1924
#1917: how to set temperature pre-factor for the viscosity formula in 'simple' Material model (opened and closed) https://github.com/geodynamics/aspect/issues/1917
#1542: Support different timescales (closed) https://github.com/geodynamics/aspect/issues/1542
#1491: [Poll] Does precompiling headers reduce your compile time? (closed) https://github.com/geodynamics/aspect/issues/1491
#1084: add installation instructions for astyle (closed) https://github.com/geodynamics/aspect/issues/1084
#1064: Better advice for referencing (closed) https://github.com/geodynamics/aspect/issues/1064
#684: Mention Github on ASPECT's website more prominently (closed) https://github.com/geodynamics/aspect/issues/684
A list of all major changes since the last release can be found at https://aspect.dealii.org/doc/doxygen/changes_current.html.
Thanks for being part of the community!
Let us know about questions, problems, bugs or just share your experience by writing to aspect-devel at geodynamics.org, or by opening issues or pull requests at https://www.github.com/geodynamics/aspect.
Additional information can be found at https://aspect.dealii.org/, and https://geodynamics.org/cig/software/aspect/.
From jbnaliboff at ucdavis.edu Tue Sep 19 08:04:21 2017
From: jbnaliboff at ucdavis.edu (John Naliboff)
Date: Tue, 19 Sep 2017 08:04:21 -0700
Subject: [aspect-devel] Strain weaking - continental extension model
In-Reply-To:
References:
Message-ID: <21FF5DBB-955C-4624-8BD8-D9B9DAB72876@ucdavis.edu>
Hi all,
Apologies, I did not realize I had replied only to Phil rather than to the list! The message and files I sent are below.
Cheers,
John
*************************************************
John Naliboff
Assistant Project Scientist, CIG
Earth & Planetary Sciences Dept., UC Davis
Hi Phil,
That's odd that you're getting a density inversion. I've attached a modified version of the continental extension cookbook with strain weakening, which uses the strain invariant (i.e., not the full strain tensor) to track strain.
The option to use the full strain tensor exists, but there is no need to use it unless you want to track finite strain through time. In that case, the best option is probably to track the finite strain tensor with particles and use a compositional field for finite strain.
There is an option to output the modified friction angles and cohesions, with examples located in the tests folder.
Please let me know if you have any questions about the attached parameter file or the additional options, or if any errors pop up when running it.
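If it helps, the output of the weakened friction angles and cohesions that John mentions can, if I recall the plugin name correctly (an assumption worth checking against the tests folder), be requested through the visualization postprocessor, roughly as follows:

```
subsection Postprocess
  set List of postprocessors = visualization

  subsection Visualization
    # 'named additional outputs' writes model-specific fields, such as the
    # current (strain-weakened) cohesions and friction angles, when the
    # material model provides them.
    set List of output variables = named additional outputs
  end
end
```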
Cheers,
John
> On Sep 19, 2017, at 1:39 AM, HERON, PHILIP J. wrote:
>
> Hi Anne,
>
> Thanks for the email! These are great - thank you. John Naliboff emailed me a while back - looking back over the email it looks like we had an exchange between ourselves and not the group! Thanks for the help!
>
> It worked out well - no density inversion. I was implementing the strain weakening incorrectly.
>
> Thanks!
>
> Phil
> From: Aspect-devel on behalf of Glerum, A.C. (Anne)
> Sent: 19 September 2017 09:29:52
> To: aspect-devel at geodynamics.org
> Subject: Re: [aspect-devel] Strain weaking - continental extension model
>
> Hi Phil,
>
> Hope this answer is not too late for your talk, but there are some tests using the strain weakening functionality of the visco plastic material model:
> visco_plastic_complex.prm
> visco_plastic_yield_strain_weakening.prm
> visco_plastic_yield_strain_weakening_full_strain_tensor.prm
> Perhaps they’re helpful? I’ve been running some extension models with strain weakening, so I’m curious to hear whether you still get your density problem.
>
> Cheers,
> Anne
>
>
>> On 11 Sep 2017, at 12:26, HERON, PHILIP J. wrote:
>>
>> Hello there,
>>
>> I've been asked to give an informal chat to some structural geologists on ASPECT usability this week, so I'm just building up some simple models.
>>
>> I've been playing around with the continental extension model and am looking to add on strain weakening to the cookbook example. However, each time I've tried to implement it I get a strange density inversion in the model.
>>
>> Does anyone have an example of how to implement simple strain weakening, say for this cookbook example? There are no examples kicking around that I can see.
>>
>>
>>
>> Thanks in advance for the help!
>>
>> Best,
>>
>> Phil
>> _______________________________________________
>> Aspect-devel mailing list
>> Aspect-devel at geodynamics.org
>> http://lists.geodynamics.org/cgi-bin/mailman/listinfo/aspect-devel
-------------- next part --------------
A non-text attachment was scrubbed...
Name: continental_extension_strainweakening.prm
Type: application/octet-stream
Size: 11329 bytes
Desc: not available
URL: