
Incompatible fuse ops ioctl signature with git cloned fuse installation

xuning97@...
 

From the config, FUSE_USE_VERSION is defined as 32:

        reqs.define('fuse', libs=['fuse3'], defines=["FUSE_USE_VERSION=32"],
                    headers=['fuse3/fuse.h'], package='fuse3-devel')
In the FUSE source from git, when this version is defined below 35, the first ioctl signature is used, in which the cmd type is int.



The DAOS source code uses "unsigned int cmd", which is not compatible with the first definition, so the compilation fails. After commenting out "unsigned", everything compiles successfully.


Re: Dfuse_hl and Interception library

Colin Ngam
 

Never mind, I see it gets set when building shared. I am just trying to see how the macros get expanded.

 

From: <daos@daos.groups.io> on behalf of Colin Ngam <colin.ngam@...>
Reply-To: "daos@daos.groups.io" <daos@daos.groups.io>
Date: Wednesday, June 3, 2020 at 9:07 PM
To: "daos@daos.groups.io" <daos@daos.groups.io>
Subject: Re: [daos] Dfuse_hl and Interception library

 

Hi,

 

How do I turn on IOIL_PRELOAD for the build?

 

Thanks.

 

Colin

 

From: <daos@daos.groups.io> on behalf of Colin Ngam <colin.ngam@...>
Reply-To: "daos@daos.groups.io" <daos@daos.groups.io>
Date: Wednesday, June 3, 2020 at 4:30 PM
To: "daos@daos.groups.io" <daos@daos.groups.io>
Subject: [daos] Dfuse_hl and Interception library

 

Greetings,

 

Just a couple quick questions:

  1. What is the difference between “aliased intercept” and “single intercept”?
  2. What’s the purpose of __wrap_XXX definition?

Example:
__attribute__((visibility("default"))) int __wrap_fclose (FILE *) __attribute__((weak, alias("dfuse_" "fclose")));

 

Thanks.

 

Colin

 


Re: Dfuse_hl and Interception library

Colin Ngam
 

Hi,

 

How do I turn on IOIL_PRELOAD for the build?

 

Thanks.

 

Colin

 

From: <daos@daos.groups.io> on behalf of Colin Ngam <colin.ngam@...>
Reply-To: "daos@daos.groups.io" <daos@daos.groups.io>
Date: Wednesday, June 3, 2020 at 4:30 PM
To: "daos@daos.groups.io" <daos@daos.groups.io>
Subject: [daos] Dfuse_hl and Interception library

 

Greetings,

 

Just a couple quick questions:

  1. What is the difference between “aliased intercept” and “single intercept”?
  2. What’s the purpose of __wrap_XXX definition?

Example:
__attribute__((visibility("default"))) int __wrap_fclose (FILE *) __attribute__((weak, alias("dfuse_" "fclose")));

 

Thanks.

 

Colin

 


Dfuse_hl and Interception library

Colin Ngam
 

Greetings,

 

Just a couple quick questions:

  1. What is the difference between “aliased intercept” and “single intercept”?
  2. What’s the purpose of __wrap_XXX definition?

Example:
__attribute__((visibility("default"))) int __wrap_fclose (FILE *) __attribute__((weak, alias("dfuse_" "fclose")));

 

Thanks.

 

Colin

 


Re: missing protoc-gen-c

Zhang, Jiafu
 

Hi,

 

Just sent my WeChat account to your email.

 

Thanks.

 

From: daos@daos.groups.io <daos@daos.groups.io> On Behalf Of timehuang88@...
Sent: Wednesday, June 3, 2020 9:33 AM
To: daos@daos.groups.io
Subject: Re: [daos] missing protoc-gen-c

 

Hi Zhang, Jiafu

May 7   

Could I have your WeChat account, so that we can discuss DAOS conveniently?


Re: missing protoc-gen-c

timehuang88@...
 

Hi Zhang, Jiafu
May 7   
Could I have your WeChat account, so that we can discuss DAOS conveniently?


Re: daos_agent starts running in directory ./daos/build/dev/gcc/src/cart/src/gurt

Macdonald, Mjmac
 

Hi Kevan.

 

We haven't built anything into daos_agent to daemonize itself; the expectation is that in production that sort of thing will be handled by systemd. Maybe you could use a wrapper to cd / and then exec daos_agent?
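A sketch of such a wrapper (hypothetical; adjust the daos_agent path for your install):

```shell
# Wrapper that leaves the invoking directory before exec'ing the agent,
# so the agent does not pin a working directory on NFS.
cat > daos_agent_wrapper.sh <<'EOF'
#!/bin/sh
cd / || exit 1
# DAOS_AGENT is a hypothetical override hook; defaults to daos_agent on PATH.
exec "${DAOS_AGENT:-daos_agent}" "$@"
EOF
chmod +x daos_agent_wrapper.sh

# Demonstrate with a stand-in command: the exec'd process sees cwd=/.
DAOS_AGENT=pwd ./daos_agent_wrapper.sh
```

The stand-in run prints "/", confirming the exec'd process no longer holds the build tree open.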

 

mjmac

 

From: daos@daos.groups.io <daos@daos.groups.io> On Behalf Of Kevan Rehm
Sent: Wednesday, 27 May, 2020 17:54
To: daos@daos.groups.io
Subject: [daos] daos_agent starts running in directory ./daos/build/dev/gcc/src/cart/src/gurt

 

Greetings,

 

Checking to see if this is a known issue. We build DAOS in an NFS filesystem, since we can then log into any node in the cluster and use gdb with the source tree to debug a binary. The binaries get installed in ~daos/daos/install, which is a local filesystem on each node.

 

What happens is that when we start the daos_agent on the node where we compiled DAOS, it ends up with its current directory set to daos/build/dev/gcc/src/cart/src/gurt in the NFS filesystem. If we then want to delete the daos source tree and start over, we can't, because a .nfs0000XXX file in that directory prevents "rm -rf daos" from succeeding. We have to kill off the daos_agent in order to delete the daos directory, then start the daos_agent again.

 

I suspect this is not intentional. 😊 Known problem, or should I open a ticket?

 

Thanks, Kevan


daos_agent starts running in directory ./daos/build/dev/gcc/src/cart/src/gurt

Kevan Rehm
 

Greetings,

 

Checking to see if this is a known issue. We build DAOS in an NFS filesystem, since we can then log into any node in the cluster and use gdb with the source tree to debug a binary. The binaries get installed in ~daos/daos/install, which is a local filesystem on each node.

 

What happens is that when we start the daos_agent on the node where we compiled DAOS, it ends up with its current directory set to daos/build/dev/gcc/src/cart/src/gurt in the NFS filesystem. If we then want to delete the daos source tree and start over, we can't, because a .nfs0000XXX file in that directory prevents "rm -rf daos" from succeeding. We have to kill off the daos_agent in order to delete the daos directory, then start the daos_agent again.

 

I suspect this is not intentional. 😊 Known problem, or should I open a ticket?

 

Thanks, Kevan


Re: problem: support Spark when compile java files

Zhang, Jiafu
 

Yes, that's correct.

 

From: daos@daos.groups.io <daos@daos.groups.io> On Behalf Of timehuang88@...
Sent: Tuesday, May 26, 2020 3:53 PM
To: daos@daos.groups.io
Subject: Re: [daos] problem: support Spark when compile java files

 

Hi Jiafu,
Thanks for your reply.
Since I build and install DAOS with the command below:

 scons --config=force --build-deps=yes install  PREFIX=/usr

am I right to build the Java files with this command?

$ mvn clean package -DskipITs -Ddaos.install.path=/usr


Small DAOS for CI

Chuck Atkins <chuck.atkins@...>
 

I'd like to start working on integrating the DAOS API, and I'd like to have it tested in public CI infrastructure. How feasible would it be to have a micro-daos server as a container that could be spun up for CI jobs, so that API unit tests could be run against it?


Re: daos_agent default values

Kevan Rehm
 

Yes, that is the problem. I have the bug commit but not the fix commit. Can't move forward at the moment; chasing another issue.

 

Thanks for the help,

 

Kevan

 

From: <daos@daos.groups.io> on behalf of "Macdonald, Mjmac" <mjmac.macdonald@...>
Reply-To: "daos@daos.groups.io" <daos@daos.groups.io>
Date: Tuesday, May 26, 2020 at 10:59 AM
To: "daos@daos.groups.io" <daos@daos.groups.io>
Subject: Re: [daos] daos_agent default values

 

Hi Kevan.

 

What is the commit you’re working with? There was an agent config bug introduced in 2836ae4 and fixed in 12dfe6b.

 

Best,

mjmac

 

From: daos@daos.groups.io <daos@daos.groups.io> On Behalf Of Kevan Rehm
Sent: Tuesday, 26 May, 2020 11:01
To: daos@daos.groups.io
Subject: [daos] daos_agent default values

 

Greetings,

 

In utils/config/daos_agent.yml the comments say that the default value for runtime_dir is /var/run/daos_agent and the default value for log_file is /tmp/daos_agent.log. I don't think this is true. Given the text, I left those lines commented out in my latest config, assuming I would get the defaults, but found that daos_agent created its agent.sock file in ~daos and did not create a log file. When I uncomment the two lines, the agent works as expected.

 

Seems like either the comments in that file should be changed, or daos_agent should honor those defaults.

 

Thanks, Kevan


Re: daos_agent default values

Macdonald, Mjmac
 

Hi Kevan.

 

What is the commit you’re working with? There was an agent config bug introduced in 2836ae4 and fixed in 12dfe6b.

 

Best,

mjmac

 

From: daos@daos.groups.io <daos@daos.groups.io> On Behalf Of Kevan Rehm
Sent: Tuesday, 26 May, 2020 11:01
To: daos@daos.groups.io
Subject: [daos] daos_agent default values

 

Greetings,

 

In utils/config/daos_agent.yml the comments say that the default value for runtime_dir is /var/run/daos_agent and the default value for log_file is /tmp/daos_agent.log. I don't think this is true. Given the text, I left those lines commented out in my latest config, assuming I would get the defaults, but found that daos_agent created its agent.sock file in ~daos and did not create a log file. When I uncomment the two lines, the agent works as expected.

 

Seems like either the comments in that file should be changed, or daos_agent should honor those defaults.

 

Thanks, Kevan


daos_agent default values

Kevan Rehm
 

Greetings,

 

In utils/config/daos_agent.yml the comments say that the default value for runtime_dir is /var/run/daos_agent and the default value for log_file is /tmp/daos_agent.log. I don't think this is true. Given the text, I left those lines commented out in my latest config, assuming I would get the defaults, but found that daos_agent created its agent.sock file in ~daos and did not create a log file. When I uncomment the two lines, the agent works as expected.
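For reference, these are the two lines that had to be un-commented, with the values the comments claim are the defaults (a sketch of the relevant fragment of daos_agent.yml, not the full file):

```yaml
# utils/config/daos_agent.yml -- set explicitly rather than relying on
# the commented-out "defaults"
runtime_dir: /var/run/daos_agent
log_file: /tmp/daos_agent.log
```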

 

Seems like either the comments in that file should be changed, or daos_agent should honor those defaults.

 

Thanks, Kevan


Re: problem: support Spark when compile java files

timehuang88@...
 

Hi Jiafu,
Thanks for your reply.
Since I build and install DAOS with the command below:
 scons --config=force --build-deps=yes install  PREFIX=/usr
am I right to build the Java files with this command?
$ mvn clean package -DskipITs -Ddaos.install.path=/usr


Re: problem: support Spark when compile java files

Zhang, Jiafu
 

Hi,

 

Do you have more of the error message? I cannot infer the cause from your debug output. Just to make sure: replace <daos_install> with your real path. If DAOS is installed to /usr/local/daos, your mvn command should be,

 

mvn clean package -DskipITs -Ddaos.install.path=/usr/local/daos

 

Thanks.

 

From: daos@daos.groups.io <daos@daos.groups.io> On Behalf Of timehuang88@...
Sent: Monday, May 25, 2020 8:23 PM
To: daos@daos.groups.io
Subject: [daos] problem: support Spark when compile java files

 

Hi guys:
when I run the command: $ mvn clean package -DskipITs -Ddaos.install.path=<daos_install>

 

the debug log shows the info below and the compile process cannot continue; in the end, no daos-java-<version>-assemble.tgz is generated

 

[DEBUG] Looking up lifecyle mappings for packaging jar from ClassRealm[plexus.core, parent: null]

[DEBUG] Extension realms for project io.daos:hadoop-daos:jar:1.1.0-SNAPSHOT: (none)

[DEBUG] Looking up lifecyle mappings for packaging jar from ClassRealm[plexus.core, parent: null]

[DEBUG] Extension realms for project io.daos:distribution:pom:1.1.0-SNAPSHOT: (none)

[DEBUG] Looking up lifecyle mappings for packaging pom from ClassRealm[plexus.core, parent: null]

[ERROR] The build could not read 1 project -> [Help 1]

 


problem: support Spark when compile java files

timehuang88@...
 

Hi guys:
when I run the command: $ mvn clean package -DskipITs -Ddaos.install.path=<daos_install>

 

the debug log shows the info below and the compile process cannot continue; in the end, no daos-java-<version>-assemble.tgz is generated


[DEBUG] Looking up lifecyle mappings for packaging jar from ClassRealm[plexus.core, parent: null]
[DEBUG] Extension realms for project io.daos:hadoop-daos:jar:1.1.0-SNAPSHOT: (none)
[DEBUG] Looking up lifecyle mappings for packaging jar from ClassRealm[plexus.core, parent: null]
[DEBUG] Extension realms for project io.daos:distribution:pom:1.1.0-SNAPSHOT: (none)
[DEBUG] Looking up lifecyle mappings for packaging pom from ClassRealm[plexus.core, parent: null]
[ERROR] The build could not read 1 project -> [Help 1]
 


Re: dfuse can't access /var/run/daos_agent/agent.sock

Ruben Felgenhauer <4felgenh@...>
 

Thanks for the quick reply. I did indeed forget to do that. It works fine now.

Kind regards
Ruben


Re: dfuse can't access /var/run/daos_agent/agent.sock

Farrell, Patrick Arthur <patrick.farrell@...>
 

Do you have the daos_agent running? You didn't mention that aspect; it must be running on all clients. There are more details in the manual.

Regards,
-Patrick

From: daos@daos.groups.io <daos@daos.groups.io> on behalf of Ruben Felgenhauer <4felgenh@...>
Sent: Friday, May 22, 2020 11:47 AM
To: daos@daos.groups.io <daos@daos.groups.io>
Subject: [daos] dfuse can't access /var/run/daos_agent/agent.sock
 
I am using the config file from daos/utils/config/examples/daos_server_local.yml with the only difference that I'm not using the bdev lines and with an scm_size of 50 GB.

I'd like to test DAOS's FUSE filesystem, but sadly it crashes directly on startup. I tried the following two lines:

$ OFI_INTERFACE=eth0 dfuse -S --mountpoint="$DFS_MNT" --svc="$DAOS_SVCL" --pool="$DAOS_POOL" --foreground
$ OFI_INTERFACE=eth0 dfuse_hl "$DFS_MNT" -s -f -d -p "$DAOS_POOL" -l "$DAOS_SVCL"
Pool Connect...
DFS Pool = 5092de74-348c-4c99-8c19-875c14523f86
DFS SVCL = 0
Failed to connect to pool (-1026)

Note that the first one crashes without any error message. In both cases, the /tmp/daos.log reads:

05/22-18:41:26.80 abu2 DAOS[64779/64779] daos ERR  src/common/drpc.c:173 unixcomm_connect() Failed to connect to /var/run/daos_agent/agent.sock,              errno=2(No such file or directory)
05/22-18:41:26.80 abu2 DAOS[64779/64779] mgmt ERR  src/mgmt/cli_mgmt.c:296 get_attach_info() failed to connect to /var/run/daos_agent/agent.sock

Does DFS not work with daos_server_local.yml? Setting the socket_dir variable in the config file doesn't seem to help either.

Kind regards
Ruben


dfuse can't access /var/run/daos_agent/agent.sock

Ruben Felgenhauer <4felgenh@...>
 

I am using the config file from daos/utils/config/examples/daos_server_local.yml with the only difference that I'm not using the bdev lines and with an scm_size of 50 GB.

I'd like to test DAOS's FUSE filesystem, but sadly it crashes directly on startup. I tried the following two lines:

$ OFI_INTERFACE=eth0 dfuse -S --mountpoint="$DFS_MNT" --svc="$DAOS_SVCL" --pool="$DAOS_POOL" --foreground
$ OFI_INTERFACE=eth0 dfuse_hl "$DFS_MNT" -s -f -d -p "$DAOS_POOL" -l "$DAOS_SVCL"
Pool Connect...
DFS Pool = 5092de74-348c-4c99-8c19-875c14523f86
DFS SVCL = 0
Failed to connect to pool (-1026)

Note that the first one crashes without any error message. In both cases, the /tmp/daos.log reads:

05/22-18:41:26.80 abu2 DAOS[64779/64779] daos ERR  src/common/drpc.c:173 unixcomm_connect() Failed to connect to /var/run/daos_agent/agent.sock,              errno=2(No such file or directory)
05/22-18:41:26.80 abu2 DAOS[64779/64779] mgmt ERR  src/mgmt/cli_mgmt.c:296 get_attach_info() failed to connect to /var/run/daos_agent/agent.sock

Does DFS not work with daos_server_local.yml? Setting the socket_dir variable in the config file doesn't seem to help either.

Kind regards
Ruben


Ubuntu 20.04 (LTS)

Lombardi, Johann
 

Hi there,

 

Just a heads-up that master has moved from 18.04 to 20.04 for Ubuntu support.

 

Cheers,

Johann

---------------------------------------------------------------------
Intel Corporation SAS (French simplified joint stock company)
Registered headquarters: "Les Montalets"- 2, rue de Paris,
92196 Meudon Cedex, France
Registration Number:  302 456 199 R.C.S. NANTERRE
Capital: 4,572,000 Euros

This e-mail and any attachments may contain confidential material for
the sole use of the intended recipient(s). Any review or distribution
by others is strictly prohibited. If you are not the intended
recipient, please contact the sender and delete all copies.