Need help with understanding how LocalMapdlPool works. #3287
The main objective I am trying to achieve is to use parallel computing to improve the speed of execution as much as possible. Currently, with just one instance from `launch_mapdl`, my code runs one execution of a load case in 5.67 s on average. The issue is that there are more than 10,300 load cases to run, and at the current speed it would take nearly a day to complete. My initial thought was to split the load cases in two and have one instance run the first half while a second instance runs the rest. Is this the best way to do this? And what about using LocalMapdlPool? Any help would be much appreciated.
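A quick sanity check of that estimate, assuming a fully serial run and (optimistically) perfect scaling across instances:

```python
cases = 10_300
seconds_per_case = 5.67

total_s = cases * seconds_per_case
print(f"Serial: {total_s / 3600:.1f} h")           # ≈ 16.2 h, "nearly a day"
print(f"4 instances: {total_s / 4 / 3600:.1f} h")  # ≈ 4.1 h with ideal scaling
```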
Hi @adeebsaitBH

The latest documentation of MapdlPool is in Create a pool of MAPDL instances. I recommend you have a look at it.

`MapdlPool` is an object that allows you to manage multiple MAPDL instances from a single object. It also supports queuing, using `MapdlPool.map` and `run_batch`.

It seems to me that you want to do something similar to what is done here: Run a user function

Since the function you attached seems to run through all the input files in a folder (I am guessing), you might want to do something like:

```python
import os

from ansys.mapdl.core import MapdlPool

# Assuming that the 10300 load cases are in here
load_cases_dir = r"path/to/load/cases"
load_cases_files = os.listdir(load_cases_dir)

# Create a pool of four MAPDL instances
pool = MapdlPool(4)

MAX_RETRY = 3
start_case = 0


def solve_one_case(mapdl, file):
    with mapdl.non_interactive:
        mapdl.run("/clear,start")
        ###
        # Reading the load case
        mapdl.input(file)
        ###
        # `save_dir` and `server_dir` come from your original script
        mapdl.input(save_dir + "\\" + "Mlokin", "inp")
        mapdl.run("sscurve = stress")
        mapdl.input(save_dir + "\\" + "Mlokrun", "inp")
        mapdl.input(server_dir + "\\" + "mat", "mac")
        ...


def solve_in_pool(file):
    # Grab the next available (free) instance from the pool
    with pool.next() as mapdl:
        return solve_one_case(mapdl, file)


def solve_with_retry(file):
    for attempt in range(MAX_RETRY):
        try:
            result = solve_in_pool(file)
        except Exception:
            # Failed. Probably the MAPDL instance died.
            # MapdlPool should relaunch the instance, so let's continue
            # on another instance.
            continue
        else:
            break
    else:
        # We failed all the attempts - deal with the consequences.
        raise RuntimeError(f"Exceeded maximum number of retries for file {file}")
    return result


for ind, file in enumerate(load_cases_files[start_case:]):
    print(f"Running case {ind}. File: {file}")
    solve_with_retry(os.path.join(load_cases_dir, file))
```
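The retry logic above relies on Python's `for ... else` clause, which is easy to misread: the `else` block runs only if the loop finishes without a `break`. It can be exercised without MAPDL; here is a minimal sketch where a hypothetical flaky stub (`flaky_solve`, not part of PyMAPDL) stands in for `solve_in_pool`:

```python
MAX_RETRY = 3


def solve_with_retry(solve, file):
    # Retry up to MAX_RETRY times; the for-loop's `else` runs only if
    # we never `break`, i.e. every attempt failed.
    for attempt in range(MAX_RETRY):
        try:
            result = solve(file)
        except Exception:
            continue  # instance died; retry on another instance
        else:
            break  # success: leave the loop, skipping the for-else
    else:
        raise RuntimeError(f"Exceeded maximum number of retries for file {file}")
    return result


# Hypothetical stub: fails twice, then succeeds on the third attempt.
calls = {"n": 0}


def flaky_solve(file):
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("MAPDL instance died")
    return f"solved {file}"


print(solve_with_retry(flaky_solve, "case_001.inp"))  # → solved case_001.inp
```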